Organic Search 2012: The Story So Far

We’re only four months into the year and, in terms of organic search, it’s been eventful to say the least! Google updates its search algorithm hundreds of times each year, but only occasionally do these updates affect search results significantly enough that we as SEOs have to act fast in rethinking our strategies and approaches.

This post gives a brief overview of what I consider to be the most noteworthy updates of the year so far.

A Bit of Background

Google is on a constant mission to return the most relevant set of results for any search query – they work continually to make our journey as users as quick and efficient as possible. Their ever-evolving search algorithm aims to bring all their perceived signals of relevance together, prioritise them and apply them to the billions of web pages in their index, resulting in a ranked set of results.

Our job as SEOs is to recognise those criteria and ensure that our clients’ websites and marketing strategies send the right signals to the search engines, so that those sites can rank as well as possible for their desired keywords.

Unfortunately (although not surprisingly, given the huge potential of a page 1 position for certain keywords), there are a handful of people who will use their knowledge of search engine ranking criteria to manipulate rankings in a manner that contravenes the search engines’ quality guidelines, favouring these methods over user experience. Search engines don’t want ‘spam’ outranking web pages that offer true value to the user, so there’s an ongoing game of ‘cat and mouse’ between the search engines and the spammers.

Page Layout Algorithm Improvement (Jan ’12)

Once it became evident that content was one of the criteria Google treats as an important ranking factor, the challenge for SEOs became:

“How can I get tons of content on this page without it detracting from the pretty pictures?”

Many techniques have been applied to achieve this goal (some more dubious than others): including thousands of words in a <div> positioned so far off to the left or right that it’s no longer visible on screen, using white text on a white background, or displaying a couple of lines of text and hiding the rest of your content in an expandable “Read More” box. These techniques may satisfy the search engines’ appetite for content, but they’re of little use to the reader.
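To make that concrete, here’s a toy Python sketch of how inline styles might be scanned for the off-screen and white-on-white tricks described above. This is purely illustrative and certainly not how Google actually detects hidden text – real detection involves rendering the page and far more sophisticated signals.

import re

# Naive heuristics for the hidden-text tricks described above.
# Purely illustrative; real search engines render the page and use
# far more sophisticated signals than inline-style pattern matching.
HIDDEN_STYLE_PATTERNS = [
    re.compile(r"text-indent:\s*-\d{4,}px"),  # text shoved off-screen
    re.compile(r"display:\s*none"),           # content hidden outright
    re.compile(r"color:\s*#fff.*background(-color)?:\s*#fff"),  # white on white
]

def looks_hidden(inline_style):
    """Return True if an element's inline style matches a known hiding trick."""
    style = inline_style.lower()
    return any(p.search(style) for p in HIDDEN_STYLE_PATTERNS)

print(looks_hidden("text-indent: -9999px"))                 # True
print(looks_hidden("color: #FFF; background-color: #FFF"))  # True
print(looks_hidden("margin: 0 auto; color: #333"))          # False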

Various updates over the past few years have detected and punished some of the dodgier ‘cloaking’ techniques, but in January of this year Google announced an improvement to their ‘Page Layout Algorithm’ to benefit sites that display their content ‘above the fold’ rather than hiding it at the bottom of the page or below a slew of ads. Although this was said to affect only 1% of searches globally, it has certainly given designers something to think about when planning new sites, and those sites attempting to disguise their content may well have seen reductions in rankings.

Google ‘Venice’ Update (Feb ’12)

In Google’s monthly “Search Quality Highlights” blog post for February, it was announced that an algorithm update codenamed “Venice” had been introduced to improve rankings for local search results. Essentially, this means that for certain generic keywords Google now takes browser location into account (which is usually auto-detected) and shoehorns in certain local results that wouldn’t otherwise have ranked for that term.

There’s certainly logic in applying this update to certain keywords – if I search for “curry house” and I live in Colchester, I want to find a curry house in Colchester… not a curry house in Bristol that just happens to have spent more time/money optimising their site for the term “curry house”.

What I’m slightly dubious about is the way in which Google determines whether a keyword deserves local results – if someone searches for “SEO”, for example, they will see local results that would not normally have been there, and I’m not sure the intent behind that search calls for those rankings to be bumped.

“Google Webmaster Tools Notice of Detected Unnatural Links” (Mar ’12)

For years it has been understood that backlinks are a strong indicator to search engines of a web page’s authority and relevance, and as such the quantity and quality of links pointing to a site have a direct impact on rankings. The theory is that if people are linking to a page then it’s probably relevant and worth reading, and the more people that link to it, and the more authoritative those linking sites are perceived to be, the higher it should rank.
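That intuition is essentially Google’s original PageRank idea: authority flows along links, so a page linked to by authoritative pages becomes authoritative itself. Here’s a stripped-down Python sketch of that iteration over a made-up three-page site (the 0.85 damping factor comes from the original PageRank paper; everything else Google layers on top is ignored):

# A stripped-down PageRank-style iteration: authority flows along links,
# so a page linked to by authoritative pages becomes authoritative itself.
links = {
    "home.html": ["about.html", "blog.html"],
    "about.html": ["home.html"],
    "blog.html": ["home.html", "about.html"],
}

damping = 0.85  # classic damping factor from the original PageRank paper
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

Run it and home.html comes out on top: it has the most incoming links, and those links come from pages that are themselves linked to.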

Spammers have used hundreds of methods of building links in bulk to improve their rankings, and there are even pieces of software that will go out and mass-comment on blogs or mass-submit your site to directories. These, along with paid links, go against Google’s search quality guidelines and are generally considered to be ‘black hat’ SEO techniques.

One of the most powerful ways of convincing Google that your site is especially relevant to a particular keyword is to post content containing a link back to your site, using that keyword as the anchor text (the clickable text). This spawned a generation of blog networks such as ‘BuildMyRank’ – a service whereby spammers can pay for access to a large number of interlinked blogs and post poorly-spun content containing links back to their sites using their chosen anchor text.
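One reason this pattern stands out: a natural link profile is a messy mix of anchors (the brand name, ‘click here’, bare URLs), whereas network-built links cluster heavily on a single money keyword. Here’s a quick Python illustration using entirely made-up backlink data:

from collections import Counter

# Hypothetical backlink data: (linking page, anchor text) pairs.
backlinks = [
    ("blog-a.example/post1", "cheap widgets"),
    ("blog-b.example/post7", "cheap widgets"),
    ("blog-c.example/post3", "cheap widgets"),
    ("blog-d.example/post9", "cheap widgets"),
    ("news.example/story", "Example Widgets Ltd"),
    ("forum.example/thread", "click here"),
]

anchors = Counter(anchor for _, anchor in backlinks)
total = sum(anchors.values())

for anchor, count in anchors.most_common():
    print(anchor, "-", count, "of", total)

# A single commercial phrase accounting for the bulk of all anchors is
# exactly the kind of skewed profile an unnatural-links filter might flag.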

In March, Matt Cutts mentioned an imminent update to tackle “Over-Optimised” sites, prompting speculation as to what constituted over-optimisation. Shortly after this announcement, certain blog networks (including BuildMyRank) were removed from Google’s index, and webmasters started receiving messages in their Google Webmaster Tools accounts notifying them of “Unnatural Links”. These events led many across the SEO industry to conclude that Google was finally taking notice of these obvious policy violations and actively penalising those seen to be gaming their way to rankings.

It is estimated that over 700,000 of these messages have been sent out so far, with reports of ranking drops following them. There is still, however, a huge number of sites with dubious backlink profiles dominating the SERPs.

The ‘Penguin’ Update (Apr ’12)

In another move to penalise sites that use spammy link building techniques, on 24th April Matt Cutts announced (via both Google’s Inside Search and Webmaster Central blogs) yet another algorithm update, this one targeting “WebSpam”. He gave an on-site example in which the website owner had stuffed keywords onto the page in an incomprehensible manner, and an example of a poorly spun guest post (most likely taken from a blog network site), indicating that both techniques would be picked up and penalised by the new update.

One day after the announcement, Google named the update ‘Penguin’. Following criticism from webmasters claiming to be innocent yet having seen drops in their rankings, Google has now put out a feedback form for anyone who feels they have been wrongly penalised, or who wishes to report spam that wasn’t picked up by the update.

What Will The Rest of 2012 Hold?

It’s clear that Google are coming down hard on spammers and trying to reward those sites that have stayed true to best practices. I imagine there will be a series of updates to ‘Penguin’ over the coming months, both to reduce the number of innocent websites hit and to improve its ability to pick up the most severe violations of Google’s guidelines.

As for ‘Venice’, I think there’s some tweaking needed, both in the choice of search terms deemed to have local intent and in filtering out the spammy “Directions to…” pages that pop up, created purely to manipulate a company’s local rankings.

There has also been a lot of talk about ‘Negative SEO’ in recent weeks, following a forum post claiming to have successfully got a couple of websites penalised through the use of ‘black hat’ SEO techniques. Google had previously led us to believe that it would be near enough impossible for a competitor to harm your site, so this is pretty big news. They have not yet officially commented on the subject, but I would expect some kind of response within the next couple of months.

Cue the ‘Big Update’ for May ’12…

 
