2 March 2015 | Team Tamar

4 algorithmic changes Google should implement

Google’s algorithm is a very complex combination of on-site and off-site parameters. As SEOs, we’re fascinated with how Google determines rankings, and every tiny change, real or perceived, gets a huge amount of coverage within the industry.

While Google has come a long way in improving user experience and stopping spam and low-quality sites from appearing high up in search results, there remain plenty of areas where it could be better. Here are 4 algorithmic changes we think the search engine should implement.

Good links/bad links


Links are a bit of a grey area in SEO. While everyone loves a good buzz phrase about how content is everything, most SEOs accept that links still matter a lot and are a necessary evil. Getting links to your website is great, even more so if the linking sites are high authorities within their field.

In recent years, Penguin updates have helped reduce the impact of low-quality links. However, Google is missing a trick here, as it fails to actually understand the meaning behind links.

For instance, someone could link to your site to say your platform or services are rubbish, ironically doing you a big favour from an SEO point of view. Google should start understanding how positive or negative the content behind a link is, and take this into consideration alongside other link quality factors.
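To illustrate the idea, here is a minimal sketch of how the text surrounding a link could be scored before it is fed into a link quality assessment. The word lists and weighting are entirely hypothetical; this is not how Google (or any real system) does it, just the kind of signal we mean.

```python
# Hypothetical sketch: score the sentiment of the sentence surrounding a link,
# so a clearly negative mention ("avoid this rubbish platform") counts
# differently from a genuine recommendation. Word lists are illustrative only.

POSITIVE = {"great", "excellent", "recommend", "reliable", "love"}
NEGATIVE = {"rubbish", "avoid", "scam", "broken", "terrible"}

def link_context_sentiment(surrounding_text: str) -> float:
    """Return a score between -1 (negative mention) and 1 (positive mention)."""
    words = [w.strip(".,!?").lower() for w in surrounding_text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# A link embedded in a negative review shouldn't count as an endorsement.
print(link_context_sentiment("Avoid this rubbish platform, it is a scam"))   # -1.0
print(link_context_sentiment("A great, reliable service I would recommend")) #  1.0
```

In practice you would want proper sentiment analysis rather than a word list, but even this toy version shows how "a link is a link" misses the point of the mention.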

Behaviour data

Another thing Google should look at is user behaviour. Google Analytics is a fantastic tool for understanding how people interact with your site. How long visitors stay, how many pages they view, demographic information: it's all in there. And it should be taken into consideration.

Simply put, if Google’s aim is to provide the best user experience, it cannot ignore user experience data in its ranking factors. For example, a very high bounce rate should be a good indication that, although a page might be well optimised, its content isn’t popular with users and is perhaps even irrelevant to their queries.
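As a purely illustrative sketch (the signals, weights and function below are our own assumptions, not anything Google has published), engagement data could be blended into a page's score something like this:

```python
# Illustrative only: blend a page's relevance score with behaviour signals.
# The weights and the idea of down-weighting high-bounce pages are assumptions
# made for the example, not a description of Google's actual ranking.

def adjusted_score(relevance: float, bounce_rate: float, avg_time_on_page: float) -> float:
    """
    relevance: base relevance score in [0, 1]
    bounce_rate: fraction of single-page visits, in [0, 1]
    avg_time_on_page: average seconds a visitor spends on the page
    """
    engagement = min(avg_time_on_page / 120.0, 1.0)   # cap the benefit at 2 minutes
    behaviour = 0.5 * (1.0 - bounce_rate) + 0.5 * engagement
    # A well-optimised page with poor engagement gets pulled down.
    return 0.7 * relevance + 0.3 * behaviour

print(adjusted_score(relevance=0.9, bounce_rate=0.85, avg_time_on_page=15))  # ~0.67
print(adjusted_score(relevance=0.9, bounce_rate=0.30, avg_time_on_page=90))  # ~0.85
```

The point isn't the exact numbers; it's that two equally "optimised" pages would no longer rank identically once real engagement is taken into account.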

Fact checker

You can find just about any piece of information online, and its exact opposite. A good addition to assessing content quality would be some sort of fact checker. Search results should be accurate and informative, and, like a good piece of journalism, grounded in research.

Hence, sites that are factually mistaken should not appear ahead of those that did their homework. This would be pretty tricky to implement, but hey, Google never fails to impress us!

Social signals


Finally, social signals are a big point of debate within the industry. Some SEOs see a Facebook like or a tweet as a clear indication of content quality and popularity that shouldn’t be ignored by search engines.

The other side of the coin is that social signals are easy to manipulate and can easily be bought for SEO purposes. Having said that, there are plenty of signals Google could use to identify whether social accounts are real or spammy.

This would, to an extent, tackle the issue of spammy social signals. With social media growing at an exponential rate every year, there’s only so long search can ignore it.
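As a rough, hypothetical sketch of the kind of heuristic we mean (the thresholds and account features below are invented for illustration), a search engine could simply discount likes and shares from accounts that look automated:

```python
# Hypothetical heuristic for discounting likely-spam social accounts before
# counting their likes or shares as a quality signal. Thresholds are illustrative.

def looks_spammy(account_age_days: int, followers: int, following: int,
                 posts_per_day: float) -> bool:
    """Very rough account check; a real system would use far more signals."""
    brand_new = account_age_days < 30
    follow_heavy = following > 0 and followers / following < 0.1
    firehose = posts_per_day > 100
    return sum([brand_new, follow_heavy, firehose]) >= 2

def weighted_social_signal(likes_by_account: dict) -> float:
    """Count only likes from accounts that don't trip the spam heuristic."""
    return sum(1.0 for features in likes_by_account.values()
               if not looks_spammy(**features))

likes = {
    "real_user":  {"account_age_days": 900, "followers": 300, "following": 280,  "posts_per_day": 3},
    "spam_bot_1": {"account_age_days": 5,   "followers": 2,   "following": 4000, "posts_per_day": 250},
}
print(weighted_social_signal(likes))  # 1.0 (only the real account counts)
```

Something along these lines would make bought likes worth a lot less, which is exactly why the "easy to manipulate" argument shouldn't be the end of the conversation.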

 

What would YOU introduce to Google’s algorithm? Share your thoughts in the comments.

Team Tamar