What doesn’t kill you makes you stronger! When it comes to link building and other SEO techniques, the trick is making sure it doesn’t kill you.
For SEO that means moderation in all things. Google repeatedly warns about unnatural link profiles, unnatural anchor text profiles and other apparently unnatural activity.
I want to focus on backlinks, or off-page SEO, in this tip, but much of what I will be talking about is also relevant to on-page SEO.
In my previous tip I shared all the ranking factors that are likely to affect your sites.
This list was compiled from data gathered by Moz. Moz took a broad cross-section of Google search queries and asked a number of top SEO experts to assess which factors appeared to help or hinder the sites in the search results.
As such it is very likely to be a good representation of how beneficial any particular ranking factor may be for any web page.
Understanding Power v Penalties
It is important to note that these results can only be interpreted within the confines of the actual search results. The web pages Google likes are the only ones that rank highly, so any major negative factor that triggers penalties is unlikely to show up in the data from which this list was compiled.
So although the data may show that exact match anchor text has a beneficial effect, it won’t show whether too much exact match anchor text has a negative effect.
However, you can find that information by, once again, looking at the first page of the Google results. You will have figured out by now that this is where all the SEO secrets lie.
If you check out the ranking factor list above you will see that many of the top factors are link related.
To understand how you can figure out whether a ranking factor might push you into penalty territory, consider the following example.
Take this factor from the list: “Number of External Pages Linking to Page w/ Exact Match Anchor Text”.
It has a ranking score of 0.27, which is one of the highest in the list. Just assume that the higher the number, the stronger the ranking factor: the more of those links a page has, the better it should rank. The way the number is calculated is fairly complicated and not really important for the purposes of this discussion.
But as I said, it needs to be viewed within the confines of the known data: in this case, the top Google results for your keyword search. Although the ranking factor table is built on the top 50 results, in practice you can concentrate on the top 10.
What You Can Deduce
So you need to look at how many “External Pages Linking to Page w/ Exact Match Anchor Text” each top 10 result has. If, for instance, the counts range from 26 to 82, you can deduce a number of things.
1. If your page falls within that range it is safe from penalties (for that ranking factor)
2. Your page will likely rank higher if you add more links of the same type, up to a maximum of 82
3. More than 82 such links may put you into penalty territory.
4. Fewer than 26 such links are unlikely to get your page into the Google top 10.
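The deduction above is simple enough to sketch in code. Here is a minimal Python illustration; the top-10 counts are hypothetical, and in practice you would pull them from a backlink analysis tool.

```python
# A sketch of the range deduction: compare your page's value for one
# ranking factor against the values seen across the current top 10.

def assess_factor(your_count, top10_counts):
    """Classify your page relative to the top-10 range for one factor."""
    low, high = min(top10_counts), max(top10_counts)
    if your_count < low:
        return "below range: unlikely to reach the top 10"
    if your_count > high:
        return "above range: possible penalty territory"
    return "within range: safe for this factor"

# Hypothetical counts of exact match anchor text links for the top 10
top10 = [26, 31, 40, 44, 51, 55, 60, 67, 75, 82]
print(assess_factor(30, top10))  # prints "within range: safe for this factor"
print(assess_factor(90, top10))  # prints "above range: possible penalty territory"
```

The same check applies unchanged to any other countable ranking factor; only the input numbers differ.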
If you now do that for all the possible ranking factors, you will have built the perfect profile for a Google top 10 web page. Measure the same factors on your own web page (the one you want to rank for that keyword) and you will be able to see exactly what you need to change to bring it in line with the top 10 results. Moreover, you also have thresholds that help ensure you do not stray into penalties.
Unless, of course, you are already in penalties, in which case the comparison will show you what is most likely causing them and allow you to fix it.
What if Google then change their algorithm?
Well, the top results for your keyword search would change. The pages Google now dislikes will drop and the ones it favors will rise. So by repeating the above analysis you get a new “profile” for a top 10 web page, and you compare your page once again to the new top ten results.
Simple, right? Well, yes... BUT unfortunately it is pretty time consuming. Finding the top ten results for just one keyword and then checking the ranking factors on all 10 sites is a mammoth manual task; in fact it is pretty impracticable, if not actually impossible.
There are two main things you can do to make this manageable:
1. Reduce the number of ranking factors you check
2. Automate the checking
Obviously, the fewer factors you check, the less accurate your results will be. On the plus side, Moz helps us out considerably here.
Notice that the largest ranking factor in the table is the Moz Page Authority, and another significant factor is the Domain Authority. Although it is not explained in the table, PA and DA are actually derived from the other link-related factors; they are generated by what Moz calls a learning algorithm. By using these two scores in place of the individual external link factors you considerably reduce the number of checks needed, with very little loss of accuracy.
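To make the reduced-factor comparison concrete, here is a small sketch that builds a top-10 “profile” from just PA and DA and reports where your page falls outside it. All the scores below are hypothetical placeholders; real values would come from Moz's metrics, and you could add any further factors to the list.

```python
# A sketch of the reduced-factor profile comparison, using Page Authority
# (PA) and Domain Authority (DA) in place of the individual link factors.

def build_profile(top10_pages, factors):
    """Record the min/max range of each factor across the top 10 results."""
    return {f: (min(p[f] for p in top10_pages),
                max(p[f] for p in top10_pages))
            for f in factors}

def compare(your_page, profile):
    """Report only the factors where your page falls outside the range."""
    gaps = {}
    for factor, (low, high) in profile.items():
        value = your_page[factor]
        if value < low:
            gaps[factor] = f"raise from {value} toward {low}+"
        elif value > high:
            gaps[factor] = f"reduce from {value} below {high}"
    return gaps

# Hypothetical PA/DA scores for three of the top results
top10 = [{"PA": 48, "DA": 62}, {"PA": 55, "DA": 70}, {"PA": 41, "DA": 58}]
profile = build_profile(top10, ["PA", "DA"])
print(compare({"PA": 33, "DA": 65}, profile))
# prints {'PA': 'raise from 33 toward 41+'}
```

An empty result means your page sits inside the top-10 ranges for every factor checked, which is exactly the “safe and competitive” profile described above.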
You can see more about Page Authority and Domain Authority from these links.
This still leaves a lot of factors to check so using software is really the only viable option.
In my next tip I will show you how this can be fully automated, with the entire profiles created and compared with your web pages in a few minutes.
I hope you found this informative and enjoyable. I know SEO can be a complicated subject at times, which is why many “experts” find it difficult to share information beyond the basics and a few tricks. It’s also why many people trying to make money from SEO struggle to actually rank their web pages and keep them ranked.
If you want further explanation of this or any other IM-related subjects, please let me know in the comments below.