Google seem to have shot themselves in the foot a bit with their link-based ranking algorithm.
In its day it was the best by such a huge margin that it shot Google to the top of the pile and made it firmly the world’s number one search engine.
In fact I doubt that any other company in the world has such a big market share of their particular business as Google does.
The simple premise was that other websites would “link” to any website they thought was good. So each link was a “vote” or a “like” for that website. The more votes or likes a site had, the more popular it was, and so the higher up the search results it should be.
It had one major flaw though. The nearer the top of the search rankings a website gets the more people will see it. The more people that see it the more likely they are to link to it.
This means that established sites get an advantage over new sites as new sites will have no links and therefore will not rank highly in the search engines.
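The link-as-vote idea can be sketched in a few lines of code. This is a minimal, illustrative take loosely in the spirit of PageRank; the damping factor, iteration count and the toy link graph are my own assumptions for illustration, not Google’s actual implementation:

```python
# Link-based "voting": each page shares its score among the pages it
# links to. Damping and iteration count are illustrative assumptions.
def rank_pages(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score...
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        # ...and passes the rest to the pages it "votes" for.
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A hypothetical graph: three sites link to "established-site",
# while "new-site" has no inbound links at all.
ranks = rank_pages({
    "new-site": ["established-site"],
    "blog-a": ["established-site"],
    "blog-b": ["established-site", "blog-a"],
    "established-site": ["blog-a"],
})
```

Running this, the established site with inbound “votes” scores well above the new site with none, which is exactly the chicken-and-egg flaw described above.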
Now Google has done some work to alleviate this, like giving new sites a temporary ranking boost for additional exposure. However, as webmasters became more savvy it was obvious that links were the way to get top rankings. Modern SEO was born, and suddenly Google’s algorithm was becoming ineffective as tens of thousands of webmasters were able to rank any webpage for any keyword just by firing links at it. The natural selection process that Google envisaged was already drowning.
So move a few years down the line and Google are wishing they could rank websites by the quality and relevance of the content (back around in one big circle!).
So Panda was born and that brings us basically up to date.
Over the last 12 months we have seen sweeping changes in Google’s approach to ranking websites.
Panda has put the focus firmly on higher quality content.
Regular Panda updates are still ongoing and have caused a lot of sites to drop dramatically in the search rankings. Most of you will already have heard of Panda, but I will go into it a little deeper in a minute just to be sure you are clear on how it affects your website.
Panda may be “old news” but I need to talk about it to put the other changes into context.
For instance, did you know that there was a Panda update in March and that updates will be ongoing for some time?
OK, so Google is sending a message – we want unique, quality, useful and relevant content on websites.
Does that mean backlinking is dead?
Not by a long shot! Technology does not yet allow machines to judge the quality of written content well enough for it to work as the major factor in search rankings. In fact, try searching Google and you will be surprised at how little information there is on the subject. What does exist is mainly theoretical and a long way from practical application.
But never say never; it will come, it’s just a matter of time. For now, though, backlinks are still hugely important.
How do I know that?
Well, let’s look at Google’s recent attack on blog networks. A number of well known blog networks have seen mass de-indexing and some have closed their doors.
The type of networks that have been hit are those that mass-distribute articles loaded with backlinks to their blogs, so in practice they are basically huge networks of autoblogs with thousands of posts and tens of thousands of outgoing backlinks.
These networks were very popular simply because they worked. Why else would Google bother to put so much effort into killing them off if they were not effective enough to “manipulate” Google’s results?
So even Google (indirectly) is telling us that links are still important.
Now let’s look at Google’s most recently publicized changes from March, as published on Google’s own blog. You can see the full listing at
http://insidesearch.blogspot.ca/2012/04/search-quality-highlights-50-changes.html
There are 50 published changes, but I will focus only on a few that I think are important to this post.
It’s important to understand that Google do not tell us what their algorithm does, how it does it, or any detail on the effect of changes, so any interpretations are my own.
Better indexing of profile pages. [launch codename “Prof-2”] This change improves the comprehensiveness of public profile pages in our index from more than two-hundred social sites.
This sounds like they are indexing more information on more public profile pages of social sites like Facebook and Twitter. Social standing, or visibility on social networks, is one of the big future measures for ranking. Clearly in the same vein as “votes” by backlinking, the Diggs, Likes and Tweets etc. are also votes for the web page content. The profile of the source (which may not only supply a backlink) may be judged, similar to PR, to decide the “authority” of those votes.
High-quality sites algorithm data update and freshness improvements. [launch codename “mm”, project codename “Panda”] Like many of the changes we make, aspects of our high-quality sites algorithm depend on processing that’s done offline and pushed on a periodic cycle. In the past month, we’ve pushed updated data for “Panda,” as we mentioned in a recent tweet. We’ve also made improvements to keep our database fresher overall.
So the old Panda again. More changes of course, but it is worth understanding how it works. I like to think of Panda as more of a penalty than a measure, i.e. you don’t really get rewarded for good content but you do get penalized for bad content. Although all pages of a site are checked for “quality”, the whole site is penalized even if only one page falls foul of the Panda filters.
The penalty is then set against your site and is only changed every few months, so improving your site will not have an immediate positive effect. It won’t change until the next “Panda refresh”.
These checks are clearly “robotic” so Google has started to be able to measure content quality and will only get better.
Improvements to freshness. [launch codename “Abacus”, project codename “Freshness”] We launched an improvement to freshness late last year that was very helpful, but it cost significant machine resources. At the time we decided to roll out the change only for news-related traffic. This month we rolled it out for all queries.
It makes clear sense for news items to be fresh. Hey, who wants to hear old news? Now you need to be looking at your own content. It has long been accepted that adding regular new content to your site is a good thing. It is now likely to be a bigger factor in your rankings.
We don’t know how Google measures and uses this. Maybe a new post on your blog will give all your pages a boost, or, more likely, a lack of fresh content will give you a site-wide penalty. The question is: should you update old posts?
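To make the idea concrete, here is one simple way a “freshness” signal could decay a page’s score as its content ages. This is purely an illustrative guess; the half-life value is an invented assumption and Google publishes nothing of the sort:

```python
from datetime import datetime, timezone

# Hypothetical freshness score: 1.0 for brand-new content, halving
# every `half_life_days`. The 90-day half-life is an assumption.
def freshness_boost(published, now=None, half_life_days=90.0):
    now = now or datetime.now(timezone.utc)
    age_days = (now - published).total_seconds() / 86400
    return 0.5 ** (age_days / half_life_days)

# Example: a post published 90 days ago would score 0.5 under this model.
score = freshness_boost(
    datetime(2012, 1, 1, tzinfo=timezone.utc),
    now=datetime(2012, 3, 31, tzinfo=timezone.utc),
)
```

Under a model like this, updating an old post (resetting its effective publish date) would restore its boost, which is one argument for refreshing old content rather than only adding new posts.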
Interestingly, blog and forum post date detection is one of the recent updates (below).
Improvements in date detection for blog/forum pages. [launch codename “fibyen”, project codename “Dates”] This change improves the algorithm that determines dates for blog and forum pages.
Tweaks to handling of anchor text. [launch codename “PC”] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust.
Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website.
There are two updates relating to anchor text. I find this interesting both as more evidence that backlinks are far from redundant, and in light of the recent discussions around an “over optimization” penalty (quote by Matt Cutts below).
There are obvious “over optimizations” like keyword stuffing and mass backlinking, but I see things like variation in anchor text and content relevance becoming more important.
For instance, it would be “unnatural” to expect everyone to link to your site with your chosen keyword. It would be much more natural to link with a variety of anchor texts, including apparently unrelated ones.
Content or contextual relevance may also be important. Again, not for every single link, as that would be unnatural, but links should mainly be relevant to the content. This may be stepped up from “better to be diverse” to “penalties for not enough diversity”.
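As a sketch of what an anchor text diversity check might look like, here is a toy classifier that flags a backlink profile whose anchors are suspiciously uniform. The 60% threshold is an arbitrary assumption for illustration; Google publishes no such number:

```python
from collections import Counter

# Hypothetical check: does one anchor phrase dominate the profile?
# max_share=0.6 is an invented threshold, purely for illustration.
def anchor_looks_over_optimized(anchors, max_share=0.6):
    counts = Counter(a.lower().strip() for a in anchors)
    top_share = counts.most_common(1)[0][1] / len(anchors)
    return top_share > max_share

# Nine out of ten links using the same money keyword looks "unnatural"...
spammy = anchor_looks_over_optimized(["buy widgets"] * 9 + ["example.com"])

# ...while a varied mix of brand, URL and generic anchors does not.
natural = anchor_looks_over_optimized(
    ["buy widgets", "example.com", "click here", "this site", "widgets"]
)
```

The point of the sketch is only that uniformity is trivially easy to measure, which is why varying your anchors is cheap insurance.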
“And the idea is basically to try and level the playing ground a little bit. So all those people who have sort of been doing, for lack of a better word, “over optimization” or “overly” doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little bit more level.
And so that’s the sort of thing where we try to make the web site, uh Google Bot smarter, we try to make our relevance more adaptive so that people don’t do SEO, we handle that, and then we also start to look at the people who sort of abuse it, whether they throw too many keywords on the page, or whether they exchange way too many links, or whatever they are doing to sort of go beyond what a normal person would expect in a particular area. So that is something where we continue to pay attention and we continue to work on it, and it is an active area where we’ve got several engineers on my team working on that right now.” – Matt Cutts
Remove deprecated signal from site relevance signals. [launch codename “Freedom”] We’ve removed a deprecated product-focused signal from a site-understanding algorithm.
This was an interesting one and, although not strictly in the same vein as the above, I thought I would highlight it anyway. I have no idea exactly what it means, but you should take note if you have e-commerce sites, Amazon sites or any sites that are “product” based (and that may even include mini-sites with the product name in the domain).
Check your rankings and income levels. Have they changed in the last few weeks? If they have and you can see no other reason, then this may have affected you.
My thoughts are that e-commerce sites, for instance, would need a different measure of SEO than an informational site, especially for content. I guess that Google uses measures to decide when a site is “product focused” and then applies a slightly different ranking algorithm.
For instance, normally duplicate content would be filtered from the search results, but for e-commerce sites duplicate content equates to competition, which equates to lower customer prices. That is not something Google would want to filter out.
So, in summary, what do you need to be doing?
To be fair to Google it’s pretty much what they have been saying for a long time but now Google are better at detecting and penalizing those that do not step up to the mark.
- Create good quality, unique, useful and relevant content.
- Add fresh content regularly building authority and age.
- Focus your site pages to one subject and make it clear to Google what your page is about.
- Do not over optimize, i.e. keyword stuff or mass link with poor quality links.
- Make your backlinks and anchors varied and “natural”. Include inner pages not just home pages.
- Ensure social exposure across many networks – but naturally of course!
- Don’t overload your site with ads, especially above the fold.
It’s all about quality and variety. Quality content, natural links and a great visitor experience.
Backlinking will never be totally deprecated. Even in terms of pure theory, there’s no way to measure “great content” and then sort things appropriately – if there are a hundred thousand pages about a given topic, and dozens of high quality, well written pieces, how do you determine which is the best?
Google is like a pendulum swinging from one extreme to the other, and it’s homing in on a more accurate algorithm over time. It’s searching for the perfect balance between on-page factors (content) and off-page factors (links) to accurately and appropriately rank the best information. So if you write something well and get it exposure through backlinks, you’ve got a shot at long term success.
For what it’s worth, I’ve seen a significant increase in traffic and income over the past couple months, although articles I had written long ago on Associated Content (now Yahoo! Voices) saw a dramatic drop in traffic from Panda.
Hi Brian,
I am sure you are right and some kind of “popularity” measure will always be needed. I do think, though, that social sites will begin to have a higher influence on rankings, as they are much harder to manipulate than self hosted sites and that must be good in Google’s eyes.
Tony
Thanks for this Tony, appreciate you putting this together!
Tony nice post!
I would like to add that diversity is important. Never ever rely on only one type of backlinking or website.
If one method or one site goes down at least there are other things for backup.
Building a list, web 2.0 presence and social presence is also important. These are traffic sources that supplement organic traffic.
Although there are scares about some article networks going down or dropping in the rankings, I still get traffic from Ezine Articles and GoArticles etc. The thing is to keep building. If you have enough virtual properties out there the crowd can’t miss you.
An interesting article. I was reviewing some of my clients’ backlinking strategies today and thought of the exact same question as you covered above: if these networks (BMR in particular) were not effective, then why would G use up so many resources to out them from the index? Truth be told, they worked. Also worth considering is that 200 words of unique content is all it takes to harness a good quality link on a mediocre domain – food for thought for the future…
Hello,
Thanks for this great post and tips. However, I would appreciate it if you could give us some hints and tips on how to build backlinks and with which techniques, and how to deal with the “unnatural and artificial links” we are getting after the whole Panda 3.3 update.
Thanks
Fadi
Hi Tony
An interesting read.
The Panda updates have left my 20 minute blog sites largely unaffected.
What I did notice is that Google is putting more paid inclusion and shopping result links above our beloved blue links, thus pushing them down the page a little. This is a move that has been predicted. Also, they are going to try to answer questions at the top of page one instead of relying on Wiki answers. So your sites that are up there are less likely to be clicked on. Product ads from Google and questions answered straight away mean less surfing.
This concerns me more than what has gone before. I posted a “discuss this” entry on the Warrior Forum. It got deleted by the mods in ten minutes. I suppose they don’t want to alarm people until we see the devastation, or lack of it?
One thing that heartens me though. I have been having success with obscure subjects. When finding them you look on the google results and see no ads, go for it as long as the searches are there you will get visitors and clicks.
Hi Mark,
Yes, it’s been satisfying to say the least that 20 Minute Blogs and the backlink network are (at this time) unaffected by the changes over the last 12 months.
I am however now advocating building any successful mini-sites into more authoritative sites for the long term.
Yes, we are definitely seeing more shopping and local results on page 1, and I expect to see more videos and news results etc. And more Google ads! Some of this is Google having a better understanding of the search query. Clearly there should be different results for keyword queries that start with different requests, i.e. buy, find, how to, where is, how big, how much. Until recently they would have returned very similar results. It may be time to look at a new kind of keyword research.
Tony
Thanks Tony, very good insights you share. Just a mention on this point: “This month we turned off a classifier related to anchor text”. I have a blog with a 4 year old domain name that was previously used by a dog breeder. I have owned the domain for a year and the entire site has been completely changed to a natural health site for humans – nothing to do with dogs at all. There are still some links pointing to this domain using the breeder’s kennel name. AND this domain still ranks No. 1 (today) for that kennel name, even though it is nowhere to be found on my site and hasn’t been for a year.
Hmmmm …
Hi Lisa,
What you describe is one of the issues with predominantly link based ranking and proves that links are still a very powerful part of the ranking algorithm.
You may well remember the famous incident where a campaign of backlinking made George W Bush return in the search results when you typed in a certain “insulting word”.
Cheers
Tony
Excellent article. Congratulations. You described the latest updates from Google in a professional and complete manner.
Nice use of provocative language to lead into your post!
SEO isn’t going anywhere. Without it Google would have no way to determine “where” to index a page. It uses these “signals” to determine how relevant a piece of content is for a users search query. Without SEO it would be VERY difficult to organize the trillions of pages of content on the web. Some SEO tactics may become less effective while others may strengthen in importance but to suggest SEO may be dead in my opinion is incorrect.
Same goes for links. As time goes by Google will be able to better identify the “quality” of a link based on the site it’s coming from and the relevance of the content from which the link is embedded.
The days of gibberish spun hundreds of times on a blog or site that has nothing but links that are completely unrelated is “DEAD”
Hi Mark,
I’m not really saying SEO is dead. I’m saying Google wishes it was and will continue to try to make it so. SEO (by webmasters) is the biggest thorn in Google’s side.
Matt Cutts says specifically
“we try to make our relevance more adaptive so that people don’t do SEO”
By that he means (IMO) that people should just write content and Google will decide the rest. Now I know they can’t do that today, but technology and capability grow exponentially. Two years from now, who knows?
Certainly I believe you will see a quantum shift in how SEO is done over the next couple of years. Our only saving grace is that Google is now so big it cannot move very fast. However, if Google can find another leap in technology or ideas, like the one they made when they created themselves – who knows!
Hi Tony,
Thanks for the great article.
I really enjoyed reading your post Tony. From the beginning SEO has always been about a) creating a website that people want to use, and b) exploiting loopholes in the algorithm. Google created this game and there will always be ways to gain an advantage. The problem lies in the belief that what worked last week will work forever. Google changes the rules constantly but the game remains. Proper SEOs love the challenge. For me SEO is more about testing than anything else. I’m hearing the buzz at the moment that SEO is dead. The people that say that are the ones that jumped on the bandwagon when SEO was hot and pretended to be gurus. Now they’re out of their depth they blame Google and say SEO is dead. Fine, carry on, go and promote the next big thing. Leave SEO to those that know and love it!
I rarely write remarks, but I did some searching and wound up here at
“Latest Google Changes How Do They Affect You?”. And I actually do have a couple of questions
for you if it’s all right. Is it simply me, or does it look like some of the comments are left by brain dead folks? 😛 And, if you are writing on other social sites, I would like to keep up with anything new you have to post. Would you make a list of all your social pages, like your Twitter feed, Facebook page or LinkedIn profile?