I haven’t posted for a while on my blog.
Shame on me…
I’ve been so busy updating WP Pipeline and working on my latest product with Anthoni Gardner (More about that in another post) that I have had zero time for anything else.
But that's my sob story.
So I just wanted to chat a bit about trending keywords. What are trending keywords?
The idea is to find keywords that no one is really using or searching for yet. Google thinks they are of little value, no one is targeting them, and you can easily rank for them. Of course they have no traffic and you will make no sales, so what's the point?
That’s where the trending bit comes in. Trending keywords are those that have no searches now but sometime in the future will have a lot of searches. At that point there is a “sweet spot” where it is easy to rank on Google page 1 but there are lots of valuable searches.
In some cases a huge number of searches. A little while back I was using a technique with trending keywords to target a lot of different things. New product launches were very successful: like $500-$1,100 in a week for one four-page blog.
My Royal Wedding blogs went crazy and I made the most money in AdSense I have ever made in one month, from just three sites, all on the same subject (Kate and William's wedding).
OK, it was great money and pretty easy to do. Like all things it had a downside. Trends don’t usually last very long. If they do the big guns come out and try to take over the niche. Either way you end up with no traffic or a lot of work to keep your rankings.
BUT, and it is a big BUT, in the short time that you can rank highly, when the traffic spike hits you can literally make thousands of dollars from one site in a few days.
For me it was not a long-term business model, as this is my full-time job and I needed a more consistent income that I don't have to keep re-creating all the time.
As a part-time income strategy it is pretty cool though.
What spurred me to write this post was a new WordPress plugin I saw called Auto Traffic Trends by a guy called Michael Young.
He created a WordPress plugin that basically uses all the (now available) trending-search information from a lot of places on the web, including Google. The plugin finds the trends, builds basic video and Amazon review pages using the trending keywords, and auto-posts them to your blog along with the right Amazon product and AdSense.
He seems to have found a way to create perpetual "trending" posts. That way he has overcome the trend "fade" that usually makes these things short-lived.
So my thought was: if you have a few jaded blogs that are not making you any money, why not try this plugin and see if it can ramp up your traffic?
Michael and his guys are also giving a free webinar to show exactly how to put this plugin to best use, so I highly recommend attending that or at least watching the replay.
You can see more about Auto Traffic Trends by clicking this link.
You can grab your free copy of Google Pirate from here.
In Part 2 I want to cover some of the things you absolutely need to understand when deciding if a web page is actually high quality. Google's latest Panda update will most certainly be looking to penalise more "low quality" sites, so it is imperative that you make sure your web pages are truly high quality.
In a few days I will also post [part 3], which covers identifying web spam and other things that will put your pages into penalties.
I have created an extensive tutorial on this whole subject of web page quality, based on the published (and unpublished) guides and guidelines of Google. It's available at the link below. I don't normally produce products at this low price, but I want as many people as possible to take this one up. Without this understanding, every website and web page you ever create will be an uphill struggle at best.
I spoke last time about User Intent. That means "what does the user actually want to do when they type a query into the Google search engine?" It is crucial that you understand this. If the web page the searcher lands on does not "fulfil" the user intent then they will simply move on elsewhere. Ads, sales pitches, affiliate links and opt-in forms will all be ignored as they simply hit "back" to the search results to try something else.
When I say "User Intent" for the search query, I really mean your target keywords: the keywords you are expecting people to find your site with.
There are three types of user intent:
1. Action (the user wants to do something, like take a test or compare prices)
2. Information (the user wants information about something, e.g. "how high is…")
3. Go (the user wants to go to a specific page, e.g. the Google homepage or the AdWords Keyword Tool)
These are known as "Do-Know-Go".
Of course you need to identify the precise user intent. For instance, the keyword "Makita 2350 Power Drill" is most likely to have a user intent to purchase that item (Do), while "Makita 2350 Power Drill spec" would be looking for the technical specifications of the product (Know), and "Makita" would probably be looking for the manufacturer's website (Go).
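If you want to bucket a whole keyword list this way in one pass, a toy rule-based classifier is enough to get started. This is only a sketch: the trigger-word lists below are my own guesses, not anything published by Google.

```python
# Toy rule-based classifier for sorting keywords into Do / Know / Go.
# The trigger-word sets are illustrative assumptions, not Google data.

DO_WORDS = {"buy", "price", "compare", "download", "order", "cheap"}
KNOW_WORDS = {"how", "what", "why", "spec", "specs", "review", "vs"}

def classify_intent(query):
    words = set(query.lower().split())
    if words & DO_WORDS:
        return "Do"
    if words & KNOW_WORDS:
        return "Know"
    # Very short brand-style queries often signal navigation (Go)
    if len(words) <= 2:
        return "Go"
    return "Know"  # default: treat ambiguous queries as informational

for q in ["buy makita 2350 power drill",
          "makita 2350 power drill specs",
          "makita"]:
    print(q, "->", classify_intent(q))
```

It will misfire on plenty of real queries, but it is a quick way to eyeball whether your keyword list skews Do, Know or Go before you build pages for it.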
These "intents" are quite specific. So let's take a typical example of the mistake webmasters make with this (you may recognise it). Your keyword "Makita 2350 Power Drill" is for a web page that gives an in-depth description of the technical specifications of the product. You even offer some great information on how to use it, safety tips, and an explanation of many of the add-ons and attachments that are available. Lots of pictures of the product and even a cool video of it in action. The page may have taken you days to complete. You've added a couple of AdSense blocks to the page, nothing too intrusive.
Is this a high quality page? Quite simply: NO.
It does not fulfil the user intent. The user (of your keyword) wanted to buy a drill, NOT learn all about a drill. It is a classic error of sending a "Do" query to a "Know" page (or any other combination). Not fulfilling the user intent means all your good work on that page has been wasted. Google will not rank a page for a keyword that does not fulfil the user intent.
Next, think about web page "Purpose". This is NOT the same as user intent when assessing your web pages. To assess "content quality" you need to ignore user intent. Look at a page in isolation and decide what the purpose of the page is, e.g. to sell something, to calculate currency exchange, to show the latest news items, etc.
When you understand what the page purpose is, you then need to ensure it fits with the overall objectives of the site: what is published in "about us" and what you can see on the home page. This is why "niching" your site is a good idea, and why sites with "random" subject posts or content do not do well.
Looking at "content" quality specifically, you need to identify the main and supplementary content of the page: the main content is that which achieves the purpose of the page, while supplementary content helps and adds value to the page purpose. One common mistake is to have no supplementary content, or supplementary content that is unhelpful.
Of course you also need to look at all the usual things you would think of as "quality content": spelling and grammar, does it read well, is it interesting, do you believe it, would you trust your credit card to it?
There is an awful lot more to assessing the quality of a web page (in Google's eyes) and I will cover some more in my next email. However, I cannot possibly cover everything in emails, so I have produced an in-depth tutorial of over 100 pages (not for the faint-hearted) plus self-assessment checklists/forms you can use on your sites, or even your clients'. It is the very basis for all good websites. If you intend to be successful with SEO then you must understand and apply this information. There will still be work to be done after this, but without it you could literally be a non-starter. It may actually be the very reason why many of you just do not see success with your websites.
You may or may not be aware, but there is a new round of major Google updates coming. The first is a new Panda update that will take effect tomorrow (Friday 15th March), or maybe Monday, according to Matt Cutts.
There is also talk from Matt of a new Penguin update and a frontal attack on a link network in the pipeline, but no dates as yet.
There have been many Panda and Penguin updates since those algorithm changes were first announced, but the next round is likely to be much harder hitting than anything we have seen recently.
It is aimed at poor quality web pages and so-called "web spam" (anything that Google thinks you are doing to manipulate the search results).
Most of you will already know that great content is now the name of the SEO game. The best way to succeed and benefit from Google's massive search engine is to create great sites for real people. First and foremost that means quality content.
The biggest challenge I have seen is judging quality content. It may seem odd, but I find most people just do not take a realistic view of their own web pages. OK, there is bound to be some bias; after all it's your baby, all your hard work, and it's hard to be self-critical and brutally honest when assessing yourself. The truth is, though, most people do not actually know what "quality content" is. Don't take offence, I don't mean you're a bad judge or anything. I mean Google keeps their quality measures so close to their chest that all we can really do is "best guess" what they want.
Sure, we can test and analyse things. I run a lot of blogs, mainly for test purposes. I purposely give them poor content, bad link profiles, overdose them with ads, etc., to see what actually forces web pages or websites into Google's penalties (and how to get out of them!). Why would I do that? Well, the way I see it, it's fine adding good content and trying to follow the Google rules, but it only takes one mistake to get your site penalised and find it sitting at position 600+.
Even very minor mistakes can see your site drop from a good page one position to page 5 or 6 overnight.
These minor mistakes are what Google means by web spam: basically anything they do not want you to do. To be fair, they have never wanted you to do it; they are just getting much, much better at spotting it and penalising it.
Through Panda and Penguin, Google has been able to use human reviewers to "rate" web page quality in detail. This information gets fed into the robots, and that way they can implement automated penalties for low quality content or spam sites. In fact, I would be surprised if you know all the things that constitute web spam. It is certainly nothing to do with email spam.
Likewise, did you know that you cannot assess page quality just by looking at it? That's right! Users have something called intent. Web pages have a purpose. Fulfilment of intent and achievement of purpose are the first steps in assessing a page's quality.
Can you distinguish between main and supplementary content?
Do you know their purpose?
Do you know how one must help the other?
You see, we haven't even got to the actual text, graphics, videos and other actual content yet. Quite simply, if you do not fully understand the foundation judgements you must make before assessing quality, then you just won't get it right. That's proved by the shock and confusion of webmasters everywhere as all their sites plummet in the rankings after just one more Google update. Everyone feels the rules have changed, but the reality is the rules haven't changed; the enforcement simply got better. Why? Because all (and I mean all) SEO has been about manipulating gaps in Google's algorithm. Even things considered white hat generally aren't, really. They are just things Google has not caught up with yet.
In Google's own words:
"A web page created for the sole purpose of making money with little added value to the user is considered spam."
Do you know what Google considers enough added value? Or even what may constitute added value?
I've been working on this for months now, since late last year in fact, and it has been an extraordinarily enlightening journey, which culminated in Google releasing a cut-down version of their web page quality raters' guide a few days ago. Sadly it was woefully incomplete and cut to the bone. However, it has kind of opened a floodgate, because up until then any copies of or references to that document had been secret; in fact, a lot of legal threats and letters went around when anyone attempted to leak the document. There is a lot of stuff I was not able to share in the past. Now that it's in the open (a bit of it, at least), I can do so without the fear of Google's attorneys coming after me.
I've been talking about authority sites for some time now and it really is the way forward. Building authority sites is your only protection against future Google changes. Like I said, the rules haven't really changed, just how well they are enforced, and that will continue (and very soon). So if you have all the basics in place (quality, up-to-date content, great site structure, good keywords, good quality backlinks and a natural anchor text profile) then you will have the basis for a website that Google loves and will reward.
How you go about ensuring that your content has purpose, is of good quality, serves the user intent, is spam free and your site is reputable will be the subject of part two of this post in a few days.
First off, let's be very clear:
Google is killing off low content, low quality sites.
What they refer to as "Authority Sites" are the new way into Google's top rankings.
Check out any top 10 search these days and you will see a lot more authoritative sites and very few (if any) small sites.
I have put together an explanation of authority sites and what you need to be doing to keep up with Google right here in this post.
So how do you create authoritative sites?
Great content, Great content & More great content!
But where do you start? As always, you start with keyword research.
It is just as important today as it has ever been, but you need to work smarter.
Authority Sites are the way forward for SEO
Authority Sites will stand the test of time and stay ranked
Authority sites will rank new pages fast
Authority sites will rank "hard to rank" keywords much more easily
Authority sites will rank for keywords you never even targeted
Authority sites are money in the bank
So you need to start thinking differently.
Forget fast rankings
Forget targeting one keyword per page,
Forget only five posts on a site
To some degree forget about keyword competition.
Start to build authority sites, or start making your current sites into authority sites. Focus on adding interesting and useful content to your sites. Focus on content quality and visitor experience. I know the old idea was to get them to your page and get them to click an ad. Well, those days have gone. The new way is to get them to your site, engage them with content, and keep them interested and on your site. Sure, sell or advertise on your site; after all, you are here to make money, right?
Oh, and while they're interested, get your visitors to sign up to your list: give away some cool stuff, offer related, interesting and valuable free products. Get them on your list.
Like I said above, some of the old keyword research ideas can go out the window. Authority sites bring in most of their traffic from long-tail keywords. So, for instance, you may target "Valentines Day" as a keyword. What are your chances of ranking in the top ten for that? Well, with authority sites you just don't care. Very few people search for "Valentines Day" anyway; more likely "what is a great present for my girlfriend on valentines day". And that's the kicker. As your site grows and gains authority it will rank for all sorts of long-tail keyword phrases.
So you can happily target high competition keywords and still win. Even better, long term you may even rank for those very high competition keywords.
That also means you can target more than one keyword on a page (related keywords, of course). Think about this: if you have more keywords on a page, your backlink anchor text can be more diverse. Another winner in the Google stable, helping you avoid those Penguin and Panda penalties.
The next step in authority sites is LSI (Latent Semantic Indexing). Actually, this expression is commonly used incorrectly: for current search results Google probably does not use true LSI.
It is likely that it would massively increase the required processing power and storage needed.
However, what is abundantly clear is that Google most certainly does use some kind of "semantically related words" measure. That means there is an expectation that certain "semantically related" words are likely to appear in content on a particular subject. Realistically, if you read an article on David Beckham you would expect to find words like football and Victoria. These semantically related words will improve how your content looks in the eyes of Google, so articles with these words will score higher for SEO than those without.
Understanding which words are considered "semantically related" can be a challenge. Some things are obvious (you'd think), but how do you know what Google actually thinks (in SEO terms)?
The only real way to understand that would be, as an example, to do a Google search for "David Beckham". Take a look at the top 10 results, go to those web pages and read them. Make a note of all the keywords that appear on those pages and see which ones occur regularly. As Google has placed these pages in its first-page results, it is effectively endorsing the content: whatever words are being used in the text of these top-ranking pages are what Google is looking for. So, back to your authority site. The very keywords that appear in the top ten results for any specific search are the same words you should be adding to your own content that targets those keywords.
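That manual process can be sketched in a few lines of code. Fetching and parsing live pages is stubbed out here (in practice you might pull page text with requests and BeautifulSoup); the sample page text and the tiny stop-word list are invented for illustration.

```python
# Count which words appear across most of the top-ranking pages for a
# keyword. Pages are passed in as plain text; scraping is out of scope.

from collections import Counter

STOPWORDS = {"the", "a", "and", "of", "to", "in", "is", "for", "on"}

def common_terms(pages, min_pages=2):
    """Return words that appear on at least `min_pages` of the pages."""
    seen = Counter()
    for text in pages:
        # one vote per page per word, stripped of basic punctuation
        words = {w.strip(".,!?").lower() for w in text.split()}
        seen.update(words - STOPWORDS)
    return sorted(w for w, n in seen.items() if n >= min_pages)

# Stand-in for the scraped text of three top-ranking pages
pages = [
    "David Beckham played football for Manchester United",
    "Beckham and Victoria attended a football match",
    "Victoria Beckham spoke about football and fashion",
]
print(common_terms(pages))  # ['beckham', 'football', 'victoria']
```

The words that survive the cut are your candidate "semantically related" terms to weave into your own article.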
Following and repeating this process is the way forward to create sites with authority. These sites will need a constant source of content while continually gaining authority. At some point the authority itself will rank even more keywords, rank tougher keywords, new pages will rank quickly, and your visitor numbers will grow.
These sites are much more stable as well. Newcomers don’t knock you off the top with one backlink blast.
There are other advantages too. Exact Match Domains are just not required; in fact, not having one may be an advantage. Create a cool brand rather than a keyword. Target your keywords in the extended URL rather than the domain name.
Good SEO practices are still required; however, mass backlinking etc. is just not needed, or even desired, so the amount of work to do is actually reduced for the same results.
Finally, a good structure is important. Try to tier your website, keeping things in "related" tiers so that Google can easily understand what your site is about and which parts are connected. For instance, if your site is about jewellery, structure it in tiers of Necklaces, Bracelets, Anklets and Earrings. Under each main tier heading add Silver Mount and Gold Mount, and under those add Diamond, Emerald, Ruby and Pearl. You can then add articles to each one.
So your URL structure will look something like:
JenkinsJewellery.com/earrings/silver_mount/emerald/how_to_select_the_best_cut_for_your_face_shape
A very clear indication to Google as to exactly what should be on that page.
Then add compelling content and a few semantically related keywords for the perfect post.
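A tiny helper shows how those tiers translate into descriptive URL slugs. The jewellery site and domain are made up for illustration, as above.

```python
# Build a tiered, descriptive URL path from category names and a title.

def make_url(domain, *tiers):
    slug = "/".join(t.lower().replace(" ", "_") for t in tiers)
    return f"{domain}/{slug}"

url = make_url("jenkinsjewellery.com",
               "Earrings", "Silver Mount", "Emerald",
               "How to select the best cut for your face shape")
print(url)
# jenkinsjewellery.com/earrings/silver_mount/emerald/how_to_select_the_best_cut_for_your_face_shape
```

Each path segment tells Google exactly where the article sits in the site's hierarchy.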
I have created a cool new plugin, the "Authority Site Keyword Intelligence Tool", that allows you to find all the authority keywords you need in minutes. I have also put together one of the best and biggest bonuses I have ever offered.
TO CLAIM YOUR BONUS OPEN A TICKET AT http://tonymarriott.info
Tell me your ClickBank Purchase number. It’s as easy as that.
SEE BONUSES BELOW VIDEO
YOU MUST buy IncomeV through the link on this page to be entitled to the following HUGE BONUS.
(1) Lifetime Access To My Ultimate Training Program DEATH OF A DAY JOB
Over 40 hours of video training showing you exactly how to build a real online business.
Step by step guides, templates, tutorials, case studies, Affiliate marketing, Product inception and creation, Product Launch, JV partners, List building, Email management + MUCH MORE
How do the successful internet marketers make their living? – JUST LIKE THIS!!!
(2) PLUS PLUS PLUS this huge package of 20 Internet marketing ebooks
If you want quality, you won't get better than the full-blown training in Death Of A Day Job. If you are someone who likes plenty of quantity in their bonus packages, then you have come to the right place: 20 ebook tutorials covering all popular areas of internet marketing.
Click the link below to check out or buy IncomeV. You must use this link to be able to claim your bonus.
Google Penguin hit on 24th April and has had one of the biggest impacts on webmasters just about ever. If you build websites and have been following the current SEO advice then it is very likely that you will have been hit by Penguin.
It was a change by Google clearly designed to hit people (sites) who “do SEO”.
I won't get into the Google/SEO argument here, but you can find more of my views on Google's thinking in other posts here on the blog.
What I want to do in this post is explain what Penguin actually is and, importantly, what is not. Also to demystify Penguin v Blog Network.
First off, you need to decide whether you were hit by Penguin, and that is pretty easy. The only real measure is whether your site took a dive in the rankings on or just after 24th April.
Penguin is a site wide penalty so if you are hit you will likely lose ranking for all your keywords.
So what is the Penguin Penalty?
After looking at a lot of sites and talking with a number of other top marketers who have been making the same analysis we have come to a clear conclusion.
Firstly Penguin is an algorithm change so it measures sites dynamically. So the good news is that if you fix your problems you should see an automatic return of your site.
Penguin is about two things, or at least there are two factors that seem to dominate the results we have seen. Penguin is all about your site's link profile. Importantly, it is NOT about site content: there are no changes on your site that will help you get back from a Penguin hit.
Panda is all about site “content quality” so although content is very important that is related to Panda issues and NOT Penguin.
The two things that are affected by Penguin are:
1. Backlink anchor text ratio
2. Backlink relevancy ratio
I’ll explain that a little more.
A sure sign that SEO is being done on a site is that a lot of the backlinks will have the same anchor text. It is standard practice that if you want to rank for “diet pills” then you will use “diet pills” in your anchor text.
This approach is still very valid. In fact all previous SEO methods are still absolutely valid but they need modifying to meet the new demands from Penguin. So backlinks and anchor text are still the most powerful ways of ranking a website.
However you need to make sure that your backlinks have much more variation.
Let's talk specifically about anchor text. Looking at my own sites, network sites and reports from other marketers, it is very clear that a high percentage of "same" anchor text has had a detrimental effect. One report I read showed that (in the sample) NO sites with less than 50% "keyword in anchor text" were hit. Don't take that to mean you can protect yourself with just this one step, but it shows how powerful this measure is.
Remember, of course, that, like Panda and other earlier changes, Penguin will get some tweaks, so you need to look at the "spirit" as well as the "letter" of the change.
Just to clarify, you should count any term in your "keyword" when looking at this measure. So if you are targeting "diet pills", then both "diet" and "pills" would be considered keywords. Also, if you use an exact match domain and target "diet pills", then www.dietpills.com would still be keyword-focused anchor text. So you need 50% of your backlinks to have no anchor text, or anchor text unrelated to your keyword: "click here", no anchor, unrelated keywords, or a URL if it does not contain the keyword.
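Here is a quick sketch of how you might check that ratio over your own backlink list. The anchor data below is invented; in practice it would come from whatever backlink tool you use.

```python
# What fraction of backlinks use a keyword term (or a keyword-based
# domain) as anchor text? Aim for under 50% per the rule above.

def keyword_anchor_ratio(anchors, keyword):
    terms = keyword.lower().split()
    def is_keyword_anchor(anchor):
        # strip dots/hyphens so www.dietpills.com counts as keyword-based
        cleaned = anchor.lower().replace("-", "").replace(".", "")
        return any(t in cleaned for t in terms)
    hits = sum(1 for a in anchors if is_keyword_anchor(a))
    return hits / len(anchors)

anchors = ["diet pills", "click here", "www.dietpills.com",
           "best diet pills", "this site", "read more"]
ratio = keyword_anchor_ratio(anchors, "diet pills")
print(f"{ratio:.0%} keyword anchors")  # 50% keyword anchors
```

At exactly 50% this profile is right on the borderline, so you would want to add more non-keyword links before anything else.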
The second factor is backlink relevancy. By that I mean: is the site or page that your backlink comes from relevant to your web page? Clearly, if your page is about "diet pills" and your backlink comes from "New York Attorney", then that would not be relevant.
However, the measure of relevancy is likely very wide. It is just not possible to be exact in this kind of measure, either for us or for Google. Some things are obvious, some less so. For instance, if the "New York Attorney" page was about suing a pharmaceutical company or a diet company, then I guess that would be relevant. As would, likely, a link from, say, a site about anti-ageing cream to your page about nail care: they both sit in the beauty niche. Of course, relevancy is subjective. So, looking at my stats and, again, listening to other marketers who have a large number of sites, you need to make sure that at least 15% (better, 20%) of your backlinks come from relevant sources.
There is a third factor you should take into account. I don't have hard facts to support this but, looking at the two factors already mentioned, it makes logical sense that this is, or will be, a factor:
Ratio of do-follow/no-follow links.
Again this fits right in with diverse backlink portfolio, something I have always recommended.
There is certainly some indication that this will help. So make sure you have at least 10% of your links no-follow.
If you achieve the above link ratios then it is highly likely that you will get out of the Penguin penalty and see your sites return to the rankings. Of course, SEO is not an exact science and no one knows exactly what Google's algorithm does, so nothing is guaranteed. What I can say for certain, though, is that if you have been hit by Penguin and you DO NOT follow these guidelines, you will have zero chance of recovery.
So what should you do?
You need to act right now. First, decide whether your site was hit by Penguin. If you don't track your keyword rankings then you need to start now; that is usually the only way you will know if you get hit by Google's changes (or at least which one, if you do get hit). Next, you need to look at your backlinks. Unfortunately, most backlink data sources are now paid, but some, like MajesticSEO, give you your total links. Whatever tool you use to find your total links, stick with it. You can't use one tool one week and another one the next week; their results vary wildly. Don't worry about which is the most accurate, just pick one and stick with it. This is only for comparison, not an exact measure. If you have tools that tell you exactly what links you have, whether they are do/no-follow, your anchor text and where they come from, then that's fantastic. If you don't, then this is the plan to follow.
Now you need to correct the link ratios. Do not do this by trying to alter or remove existing links. Do it by adding new links.
When you know how many links you have, you can then estimate how many you need to add of each type. It is likely that you have added links based on the old recommendations, i.e. do-follow, keyword in anchor. It's also likely that you have grabbed links from wherever you could get them. So to start out, just assume that all the links you have are do-follow, keyword-in-anchor-text and from unrelated sources. Let's say you have a total of 100 links. Then, to correct your ratios, you need to do the following:
1. Add 50 links with no or unrelated anchor text
2. Add 15-20 links from related sources
3. Add 10 links that are no-follow
If you work cleverly you can get the related-source links and the no-follow links within your 50 no-anchor-text links. So in practice you would only need to add 50 links.
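The same arithmetic can be wrapped in a small helper, mirroring the rule-of-thumb ratios above (applied to the existing link count, exactly as in the 100-link worked example):

```python
# How many links of each type to ADD, given an existing link count,
# assuming the worst case: all existing links are do-follow,
# keyword-anchored and from unrelated sources.

def links_to_add(existing):
    return {
        "non_keyword_anchor": existing // 2,       # 50% rule
        "related_source": round(existing * 0.15),  # 15-20% rule (low end)
        "no_follow": existing // 10,               # 10% rule
    }

plan = links_to_add(100)
print(plan)
# {'non_keyword_anchor': 50, 'related_source': 15, 'no_follow': 10}
```

And remember the overlap trick: the related-source and no-follow links can be drawn from inside the 50 non-keyword links, so the real total to build is still 50.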
Of course, if you have sites that are ranking well, or are building new sites, then this is even more important. Make sure that you keep your link portfolio varied. There are no specific stats to give any particular ratios, but in general it is just a good idea to use as many different sources as possible: blog comments, video, free blogs, Squidoo lenses, HubPages, Pinterest, article directories, social bookmarks, etc.
If you need some help finding backlinks then try my Link Finder Pro link-finding tool. It's great for finding backlinks from relevant sources and it comes with a detailed ebook about links and link building. You can see it in action at the link below.
For those of you who are using the Backlink Network I will be making some changes to better suit Penguin. Free (Gold) backlinks will now be distributed with anchor text only 50% of the time. This will help all sites both new and existing.
There will be no change to Platinum or Exclusive links they will have the same anchor text that you set.
A final note about backlink networks: you should not use one as your only source of links. Like all link sources, it should only be one of many. Diversity, diversity, diversity.
Tell us about your experiences with Penguin by adding a comment below (and getting a backlink, of course)!
Anyone who builds websites will be following Google's continued algorithm changes. This last year seems to have had a huge impact on people and their sites. The move to "measure quality content" and the more recent attack on blog networks shows that Google is ramping up its battle against what it sees as poor quality websites, or those that don't obediently follow Google's own "rules".
The latest changes are firmly set in the same vein and are aimed at targeting those that don’t comply with the Webmaster Quality Guidelines. The Guidelines have been around for a long time now but presumably this is a better way to identify the so called malpractices.
You can see the full details here
but let's just look at a few quotes from Matt Cutts' post.
The opposite of "white hat" SEO is something called "black hat webspam" (we say "webspam" to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don't benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.
So again Google are referring to over optimization and anything that they would consider to be unnatural linking. From that you can take heart that backlinks must still be a major measure of site importance and will continue to be so in Google’s eyes. Else why would they be on such a major offensive?
Matt explains some changes they have already made. The fact that these are specifically mentioned likely means they are of high importance. Panda (for most of you) will be obvious but you may not be aware or have considered the importance of “less ads/more content above the fold”.
To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”
So then onto the latest onslaught
In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.
As I said earlier, it has always been a requirement to adhere to the Webmaster Quality Guidelines. It is even part of the AdSense T&Cs that you do so. I guess this just means they are now better at spotting offenders and will deal with them more aggressively.
our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.
This final comment is one that you should remember, and it is something I have mentioned before. Google's goal is to have no SEO at all. In a perfect world you would simply create a site with great content, and the Google search algorithms would do all the work and decide how good and how relevant your site is. Unfortunately for Google, that is still a long way off, and they need some level of SEO to help them make sense of website content.
We want people doing white hat search engine optimization (or even no search engine optimization at all).
So what do we conclude from all this?
Firstly, Google are aggressively attacking all methods of SEO (or webspam, or any number of other names depending on where you position yourself in the SEO camp) that they feel mess with their algorithm.
Secondly, Google seem to have made a quantum leap in their ability to robotically detect poor quality content, bad link practices, over-optimization and so on. Expect this to grow exponentially. My guess is that the Panda idea of having humans assess site content quality, then feeding all that data into a program and tweaking it until it gets similar results, has been a programmatic success for Google. Using the same techniques (AI?) for other areas of website SEO is a logical step.
Thirdly, I think this is one of those defining moments: a point in time where SEO techniques need to change dramatically for a lot of people. Creating websites needs to be about quality and useful content. On-site SEO still needs to exist, but it needs to be simpler and more general. By that I mean NOT making sure you have a certain percentage of keyword density, and NOT focusing on one keyword to the detriment of all others. It still means creating focused content, of course. Your website should have a subject and all the content should be structured and relevant, but you simply need to focus on the quality of the content, the readability of the text and relevant, descriptive titles. Googlebot is clever enough to work out which search expressions your pages should appear in the results for.
Those of you who have specifically been creating mini-sites, one page sites, MFA sites, sniper sites or other highly targeted (and highly successful) small websites need to be making changes now. That includes 20 Minute Blogs.
Review your content for over-optimized keywords (0.5-1% density max, I’d say).
Review your content for quality. Does it read well? Is it grammatically correct, relevant, unique and useful? Edit it, update it, or even scrap it if it’s not up to scratch.
Add fresh content: a new post every 2-4 weeks, perhaps.
Get a few good quality backlinks from good sources.
Make sure you also backlink the inner pages of your site.
Vary the anchor text of your backlinks.
Work over time to make your site into an authority site.
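As a rough illustration of the first point in that list, keyword density is easy to sanity-check with a short script. This is just a sketch (the phrase-matching approach and the sample text are my own, and the 0.5-1% figure is my rule of thumb, not a number from Google):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword phrase's occurrences as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # count non-overlapping-agnostic phrase matches at every word position
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # each hit accounts for n of the page's words
    return 100.0 * hits * n / len(words)

sample = "Dog training tips. Good dog training starts early. Train your dog daily."
print(round(keyword_density(sample, "dog training"), 1))  # 33.3
```

On a real post you would paste in the article body and check the result sits comfortably under your target percentage.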
The days of the small, highly SEO’d sites are numbered. You cannot put up a site with a couple of posts, do nothing more to it, and expect it to stay up in the rankings.
That doesn’t mean you can’t build good small sites; it just means you must continue to build on the successful ones to turn them into long-term, profitable and authoritative sites.
I’d love to hear your thoughts and experiences, so please leave a comment below.
I know many people still find it difficult to discover good niches to target.
There is lots of help about but all the different ideas and teachings can sometimes just muddy the waters.
You may also hear that there are no niches left! It’s all tapped out!
Sure, it can take a bit more looking these days, but there are still many more micro-niches left to make money from.
So for those unbelievers, and anyone who needs a helping hand, I have recently come across a niche that is truly untapped. I found loads of high value keywords with plenty of monthly local searches, and even lots of exact match domains available.
So you are welcome to try this niche yourself, or just use it as an example to see that there are still untapped niches, or niches with plenty of opportunity to make money in, even for newbies.
I won’t be doing this niche myself, so you won’t have any competition from me, but the quicker you get up and running the more likely you are to make a success of it.
The niche is unusual but that is one of the secrets of finding profitable niches. I see 100s of new blogs every month and almost all are in the same old niches. So forget diet and weight loss, acne, make money, fitness, getting your ex back, tinnitus, and many of the less savoury subjects like piles and warts. Try to think a little outside the box and go where others have not. That is where you will find the little gems that can make you some cash.
Don’t worry though: if you are really stuck, or just need something to get you to take action, I have done the research for you. I will give you loads of valuable keywords and lots of exact match domains.
I’ll even give you your own niche product with Resell Rights so you can get up and running in no time flat. All you need to do is put up a WordPress blog with a few interesting articles, and I’ll even give you some to get your writing juices going.
Best of all, everything is absolutely free with no sign-up required. Just my gift to Online Income Club visitors.
Google seem to have shot themselves in the foot a bit with their link based ranking search algorithm.
In its day it was the best by such a huge margin that it shot Google to the top of the pile and made it firmly the world’s number one search engine.
In fact I doubt that any other company in the world has such a big market share of their particular business as Google does.
The simple premise was that other websites would link to any website they thought was good. So each link was a “vote” or a “like” for that website. The more votes or likes it had, the more popular it was, and so the higher up the search results it should be.
It had one major flaw though. The nearer the top of the search rankings a website gets the more people will see it. The more people that see it the more likely they are to link to it.
This means that established sites get an advantage over new sites as new sites will have no links and therefore will not rank highly in the search engines.
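The link-as-vote premise described above is essentially PageRank. Here is a minimal, illustrative power-iteration sketch (the site names are made up, and of course Google’s real algorithm layers many extra signals on top of this idea):

```python
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # every page gets a small "teleport" share regardless of links
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:  # a page's rank is split as "votes" among its outlinks
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# An established page with inbound links outranks a new page with none.
web = {"old": ["a"], "a": ["old"], "b": ["old"], "new": ["old"]}
ranks = pagerank(web)
print(ranks["old"] > ranks["new"])  # True
```

The toy example shows exactly the flaw described above: “new” links out but has no inbound votes, so it is stuck at the teleport floor while “old” accumulates rank.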
Now, Google have done some work to help alleviate this, like giving new sites a temporary higher ranking for additional exposure. However, as webmasters became more savvy it was obvious that links were the way to get top rankings. Modern SEO was born, and suddenly Google’s algorithm was becoming ineffective as tens of thousands of webmasters were able to rank any webpage for any keyword just by firing links at it. The natural selection process that Google envisaged was already drowning.
So move a few years down the line and Google are wishing they could rank websites by the quality and relevance of their content (back around in one big circle!).
So Panda was born and that brings us basically up to date.
Over the last 12 months we have seen sweeping changes in Google’s approach to ranking websites.
Panda has put the focus firmly on higher quality content.
Regular Panda updates are still ongoing and have caused a lot of sites to drop dramatically in the search rankings. Most of you will already have heard of Panda, but I will go into it a little deeper in a minute just to be sure you are clear on how it affects your website.
Panda may be “old news” but I need to talk about it to put the other changes into context.
For instance, did you know that there was a Panda update in March, and that they will be ongoing for some time?
OK, so Google is sending a message: we want unique, quality, useful and relevant content on websites.
Does that mean backlinking is dead?
Not by a long shot! Technology does not yet allow machines to judge the quality of written content well enough for it to work as the major factor in search rankings. In fact, try searching Google and you will be surprised at how little information there is on the subject. The info that exists is mainly theoretical and a long way from practical application.
But never say never. It will come; it’s just a matter of time. For now, though, backlinks are still hugely important.
How do I know that?
Well, let’s look at Google’s recent attack on blog networks. A number of well-known blog networks have seen mass de-indexing, and some have closed their doors.
The type of networks that have been hit are those that mass-distribute articles loaded with backlinks to their blogs, so in practice they are basically huge networks of autoblogs with thousands of posts and tens of thousands of outgoing backlinks.
These networks were very popular simply because they worked. Why else would Google bother to put so much effort into killing them off if they were not effective enough to “manipulate” Google’s results?
So even Google (indirectly) is telling us that links are still important.
Now let’s look at Google’s most recently publicized changes from March, as published on Google’s own blog. You can see the full listing at
There are 50 published changes, but I will focus only on the few that I think are important to this post.
It’s important to understand that Google do not tell us what their algorithm does, how it does it, or any detail on the effect of the changes, so any interpretations are my own.
Better indexing of profile pages. [launch codename “Prof-2”] This change improves the comprehensiveness of public profile pages in our index from more than two-hundred social sites.
This sounds like they are indexing more information on more public profile pages from social sites like Facebook and Twitter. Social standing, or visibility on social networks, is one of the big future measures for ranking. Clearly, in the same vein as the “votes” cast by backlinking, the Diggs, Likes and Tweets etc. are also votes for the web page content. The profile of the source (which may not only supply a backlink) may be judged, in a similar way to PR, to decide the “authority” of those votes.
High-quality sites algorithm data update and freshness improvements. [launch codename “mm”, project codename “Panda”] Like many of the changes we make, aspects of our high-quality sites algorithm depend on processing that’s done offline and pushed on a periodic cycle. In the past month, we’ve pushed updated data for “Panda,” as we mentioned in a recent tweet. We’ve also made improvements to keep our database fresher overall.
So, the old Panda again. More changes of course, but it is worth understanding how it works. I like to think of Panda as more of a penalty than a measure, i.e. you don’t really get rewarded for good content, but you do get penalized for bad content. Although all pages of a site are checked for “quality”, the whole site is penalized even if only one page falls foul of the Panda filters.
The penalty is set against your site and is only changed every few months, so improving your site will not have an immediate positive effect. It won’t change until the next “Panda refresh”.
These checks are clearly “robotic”, so Google has started to be able to measure content quality and will only get better at it.
Improvements to freshness. [launch codename “Abacus”, project codename “Freshness”] We launched an improvement to freshness late last year that was very helpful, but it cost significant machine resources. At the time we decided to roll out the change only for news-related traffic. This month we rolled it out for all queries.
It makes clear sense for news items to be fresh. Hey, who wants to hear old news? Now you need to look at your own content. It has long been accepted that regularly adding content to your site is a good thing. It is now likely to be a bigger factor in your rankings.
We don’t know how Google measure and use this. Maybe a new post to your blog will give all your pages a boost, or, more likely, a lack of fresh content will give you a site-wide penalty. The question is: should you update old posts?
Interestingly, blog and forum post date detection is one of the recent updates (below).
Improvements in date detection for blog/forum pages. [launch codename “fibyen”, project codename “Dates”] This change improves the algorithm that determines dates for blog and forum pages.
Tweaks to handling of anchor text. [launch codename “PC”] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust.
Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website.
There are two updates relating to anchor text. I find this interesting both as more evidence that backlinks are far from redundant, and in light of the recent discussions around an “over optimization” penalty (quote by Matt Cutts below).
There are obvious “over optimizations” like keyword stuffing and mass backlinking, but I see things like variation in anchor text and content relevance becoming more important.
For instance it would be “unnatural” to expect everyone to link to your site with your chosen keyword. It would be much more natural to link with a variety of anchor texts including apparently unrelated ones.
Content or contextual relevance may also be important. Again, not for every single link, as that would be unnatural, but for the majority. This may be stepped up from “it’s better to be diverse” to “penalties for not enough diversity”.
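To illustrate what a “natural” anchor profile check might look like, here is a small sketch that summarizes how often the exact-match keyword appears in a list of backlink anchors. The data is made up and any threshold you pick is purely illustrative; Google publish no numbers:

```python
from collections import Counter

def anchor_profile(anchors, target_keyword):
    """Summarize a backlink anchor-text list: exact-match share and variety."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    exact = counts.get(target_keyword.lower(), 0)
    return {
        "exact_match_pct": 100.0 * exact / total,  # share of exact-match anchors
        "distinct_anchors": len(counts),           # how varied the anchors are
    }

anchors = ["blue widgets", "blue widgets", "click here", "this site",
           "example.com", "best blue widgets", "homepage"]
profile = anchor_profile(anchors, "blue widgets")
print(profile)  # exact-match share ~28.6%, 6 distinct anchors
```

If the exact-match share came back at 90%+ with only one or two distinct anchors, that would be the sort of uniform pattern the “unnatural linking” discussion above is about.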
“And the idea is basically to try and level the playing ground a little bit. So all those people who have sort of been doing, for lack of a better word, “over optimization” or “overly” doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little bit more level.
And so that’s the sort of thing where we try to make the web site, uh Google Bot smarter, we try to make our relevance more adaptive so that people don’t do SEO, we handle that, and then we also start to look at the people who sort of abuse it, whether they throw too many keywords on the page, or whether they exchange way too many links, or whatever they are doing to sort of go beyond what a normal person would expect in a particular area. So that is something where we continue to pay attention and we continue to work on it, and it is an active area where we’ve got several engineers on my team working on that right now.” – Matt Cutts
Remove deprecated signal from site relevance signals. [launch codename “Freedom”] We’ve removed a deprecated product-focused signal from a site-understanding algorithm.
This was an interesting one, and although not strictly in the same vein as the above I thought I would highlight it anyway. I have no idea exactly what it means, but you should take note if you have e-commerce sites, Amazon sites or any sites that are “product” based (and that may even include mini-sites with the product name in the domain).
Check your rankings and income levels. Have they changed in the last few weeks? If they have, and you can see no other reason, then this may have affected you.
My thoughts are that e-commerce sites, for instance, would need a different measure (of SEO) than an informational site, especially for content. I guess that Google uses measures to decide when a site is “product focused” and then applies a slightly different ranking algorithm.
For instance normally duplicate content would be filtered from the search results but for e-commerce sites duplicate content would equate to competition which would equate to lower customer prices. Not something Google would want to filter out.
So, in summary, what do you need to be doing?
To be fair to Google, it’s pretty much what they have been saying for a long time, but now they are better at detecting and penalizing those that do not step up to the mark.
- Create good quality, unique, useful and relevant content.
- Add fresh content regularly building authority and age.
- Focus your site pages to one subject and make it clear to Google what your page is about.
- Do not over optimize, i.e. keyword stuff or mass link with poor quality links.
- Make your backlinks and anchors varied and “natural”. Include inner pages not just home pages.
- Ensure social exposure across many networks – but naturally of course!
- Don’t overload your site with ads, especially above the fold.
It’s all about quality and variety. Quality content, natural links and a great visitor experience.