1

I am in the middle of a project for a client (a university) and was wondering if anyone can shed some light on a search engine optimization (SEO) and information architecture (IA) issue.

At the moment they have hundreds of courses - often with similar titles and content - structured in a monstrous way, with a convoluted IA: unnecessary pages, subjects, sub-subjects, sub-sub-subjects, and so on, all to accommodate the number of course pages they have.

I am proposing to apply e-commerce-style filter navigation for finding their courses and to pool all courses under the same URL, in the same folder, rather than fragmenting everything across multiple pages.
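To make the proposal concrete, here is a minimal sketch of the kind of flat, filter-driven URL structure described above. The base path and the filter parameter names (`subject`, `level`) are assumptions for illustration, not taken from the client's actual site:

```python
from urllib.parse import urlencode

# Hypothetical sketch: one /courses folder plus query-string filters,
# instead of nested sub-sub-pages. Parameter names are illustrative only.
def course_finder_url(**filters: str) -> str:
    base = "https://example.edu/courses"
    if not filters:
        return base
    # Sort for a stable parameter order, so the same filter set
    # always produces the same URL (avoids accidental duplicates).
    return f"{base}?{urlencode(sorted(filters.items()))}"

# Old (nested): /courses/law/undergraduate/contract-law/
# New (flat):   one folder, filters expressed as parameters
print(course_finder_url(subject="law", level="undergraduate"))
```

Sorting the parameters is a small detail worth keeping: it means `?level=…&subject=…` and `?subject=…&level=…` never both exist, which sidesteps one common source of duplicate-URL problems in faceted navigation.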

Their concern is that flattening the IA and moving everything up will negatively affect their SEO.

Does anyone know if that is correct?

Stephen Ostermiller
  • 98,758
  • 18
  • 137
  • 361
  • Just for the purpose of background information, here are two answers you should check out. The first talks about URL and URI structure regarding SEO http://webmasters.stackexchange.com/questions/74633/well-structured-urls-vs-urls-optimized-for-seo/74639#74639 and the second talks about the effects of semantics regarding SEO http://webmasters.stackexchange.com/questions/81551/why-would-a-website-with-keyword-stuffing-rank-higher-than-one-without-in-google/81552#81552 Both will help you understand a bit of what is going on and help you with structure and ranking. – closetnoc Jun 16 '15 at 15:49

1 Answer

1

If the information is there, it will work either way, though you should prefer the human-usable way. Google can execute JavaScript now, among other things, so as long as you define the URL query parameters in Google Webmaster Tools, you should be OK with a single page. Google can use the site like a normal human would. Compensation for the missing URLs comes via the new filter parameters... essentially the same thing (a page for each result set).
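One practical counterpart to telling Google how to treat filter parameters is to compute a canonical URL for each filtered view, so every combination of filters points search engines back at one course-listing page. This is a hedged sketch, not the answerer's method; the parameter names in `FILTER_PARAMS` are invented for illustration:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Hypothetical filter/pagination parameters for a course finder;
# these names are assumptions, not from the original question.
FILTER_PARAMS = {"subject", "level", "mode", "page"}

def canonical_url(url: str) -> str:
    """Strip known filter parameters from a URL, leaving the one
    canonical course-listing address for rel="canonical" tags."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://example.edu/courses?subject=law&page=2"))
```

The output would then be emitted as `<link rel="canonical" href="…">` on each filtered result page, which is how e-commerce sites typically keep faceted navigation from fragmenting ranking signals across thousands of near-duplicate URLs.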

That is the perfect world, though; the real world of SEO is different. You may see things that you thought died in the '90s. Why? Because too many webmasters don't know what they are doing when it comes to sync-N-link, creating an accessible realm for the bots to effortlessly crawl. Google MUST cater to those sites, even if they are total junk. Even if you follow every single Google suggestion, you will see crappy sites missing conceptual [SEO] assets ranking right alongside your epic, perfect app. You may pull your hair out and wonder, "Why didn't I just leave it as it was? The spammy duplicate-entity URL method seems to rank so well... those long-tail buried URLs are injecting keywords so well..."

In this situation, remember human usability. Slowly but surely, your modernized and streamlined filtering method will become the preferred choice in the SERP hierarchy.

dhaupin
  • 3,339
  • 13
  • 31