Webinar by Chris Sherman explains some basic SEO principles
I checked out an interesting Webinar hosted by Searchengineland’s Chris Sherman, in which he looked at how to manage SEO for big sites and big brands. The focus was on the management process, which I am going to write about in the future, and the technical side of SEO, where he raised some interesting points on search engine optimisation for big sites. So let’s have a look at those.
I think the main take-home message was one we really believe in: first and foremost, think of the searcher or user. Anything you do to your site should be done with them in mind, ahead of any SEO considerations. And I say this as someone whose focus is primarily SEO!
So for example Chris raised the issue of Brand over Content. When searchers look online they are trying to find a solution to a particular question, and Brand is less of a consideration than getting the right result. So in your copy you should be looking for ways to validate, for the searcher, that your result answers their question.
When the search engine displays a description, this will usually come from your meta description tag or the content on the page – both of which you have control over. So how do you get the searcher to click through to your page? Well, a recent study by Enquiro showed that rather than focussing on Brands when scanning the search engine results, users were more likely to seek validation using semantic tracking methods – that is, looking for ‘related’ terms in the listing. So this is something you have full control over and can experiment with.
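If you want to experiment with this, one simple starting point is to check whether the ‘related’ terms you expect searchers to scan for actually appear in your meta description copy. Here is a minimal Python sketch of that check – the description text and term list are invented examples, not anything from the Enquiro study:

```python
# Illustrative check: does a meta description contain the 'related' terms
# a searcher might scan for in the listing? The copy and terms below are
# made-up examples for demonstration only.
def terms_present(description, related_terms):
    """Map each related term to True/False depending on whether it
    appears (case-insensitively) in the description text."""
    text = description.lower()
    return {term: term.lower() in text for term in related_terms}

description = ("Compare lightweight hiking boots: waterproof leather and "
               "synthetic models with free returns.")
print(terms_present(description, ["waterproof", "hiking boots", "free returns"]))
```

A crude substring match like this obviously won’t capture everything a human reader does, but it is a quick way to audit a large batch of descriptions before experimenting with the copy itself.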
Another area that was considered was the fact that many large sites use dynamic or content management systems to publish content, all perfectly sensible. But this can, and does, cause issues for the search engines: usually, too many URL parameters will cause the search engines to run away and not index a site! There are solutions and workarounds, and while the search engines are getting better at dealing with multiple parameters they are by no means perfect.
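To make the parameter problem concrete, here is a small Python sketch that flags URLs carrying lots of query parameters. The two-parameter threshold and the example.com URLs are assumptions for illustration – there is no published search engine limit – but a pass like this over your sitemap can quickly surface the URL patterns worth cleaning up:

```python
# Rough heuristic: flag URLs whose query strings may hinder crawling.
# The max_params threshold is an illustrative assumption, not a
# documented search engine limit.
from urllib.parse import urlparse, parse_qs

def crawl_risk(url, max_params=2):
    """Return True if the URL carries more query parameters than max_params."""
    params = parse_qs(urlparse(url).query)
    return len(params) > max_params

urls = [
    "https://example.com/widgets/blue-widget",
    "https://example.com/catalog?cat=12&id=345&session=abc&sort=price",
]
for u in urls:
    print(u, "->", "review" if crawl_risk(u) else "ok")
```

Session IDs and sort parameters, as in the second URL above, are classic culprits: they multiply the number of addresses pointing at the same content.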
One idea Chris mentioned was using your internal search tool to check what results it returns, with the focus being on relevancy and not just ‘filler’ material. Again, think first of the searcher. What you may be able to do is use your site search tool as a dynamic page generator if the search engines are having a hard time getting hold of your page URLs. Worth thinking about.
A quick poll of the Webinar showed that about a third of participants had a site with 10,000 pages or more. Chris remarked that this was up markedly from a previous poll along the same lines. We are definitely seeing a general increase in site size, whether through an increase in product offering, User Generated Content or more information being made available by companies in response to user needs.
Either way, big sites are likely to get bigger, and the means to manage them may cause issues for indexing and ranking on the search engines. If you are just starting out on the process of finding a new CMS solution then one of your first questions should be ‘is this a search engine friendly system?’ Always ask for live examples, and use the site:www.site.com command to see how many pages are indexed and how well they rank.
There are ways to manage large sites but it is easier to do so up front rather than trying to get changes made to fully implemented systems. Trust us, we know!