Comment on WordPress SEO Tutorial by SEO Dave.

Listened to the podcast; let’s see if I can link it directly in a comment for others to listen to:

Information from the podcast:

All those people doing, for lack of a better word, over-optimization or overly SEO – versus those making great content and a great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links, or go well beyond what you normally expect. We have several engineers on my team working on this right now.

Source file if the above fails: http://audio.sxsw.com/2012/podcasts/10-RAD-Dear_Google_and_Bing_Help_Me_Rank_Better.mp3

Matt Cutts suggested they are going to roll out an over-optimization (over-SEO’d) algorithm update within weeks, but listen to what he actually said (quoted above).

“Like too many keywords on a page” – AKA keyword lists, spammy use of keywords, etc. I always write articles with SEO in mind, but I never check keyword density (a waste of time): if it reads OK I keep it; if it reads SPAMMY I change it.

There’s also the very basic question: what counts as “too many keywords”?

I think we can agree that under almost all circumstances a 90% keyword density for a single keyword is way too high, but what about 20%, 10%, 5%, 2%?

What about a page with a large amount of text – would it be spammy to keep a density of 5% on a 1,000-word article, keyword used 50 times?

What about a 100-word article, keyword used 5 times – also 5%?

Absolute keyword density is irrelevant; I very much doubt any major search engine looks at keyword density per se.
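For what it’s worth, the arithmetic is trivial to check – keyword density is just occurrences divided by total word count. A minimal Python sketch (purely illustrative; this function is my own, not anything a search engine runs):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Single-word keyword occurrences as a percentage of total words."""
    words = re.findall(r"[\w']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# The two examples above work out identically:
#   1,000-word article, keyword used 50 times -> 50/1000 = 5%
#   100-word article,   keyword used 5 times  ->  5/100  = 5%
```

Both example articles come out at exactly 5%, which is rather the point: the raw number can’t tell you whether the page reads naturally or reads SPAMMY.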

If you create poorly SEO’d content, then check the keyword density of the pages ranking in the SERPs you are targeting and add keywords to match whatever density you think is perfect, you run the risk of going SPAMMY – or of under-SEOing if you push the density down on a low-content page.

Take this comment: I’ve given it the title “Matt Cutts – Google Over SEO Optimized Ranking Algorithm”, which, if this were a standalone article, would suggest I’m targeting the keywords/SERPs:

Matt Cutts, Google, Over SEO Optimized, Ranking Algorithm

And various derivatives. In reality I’m not targeting anything – I couldn’t think of a good title that would generate traffic on the Stallion SEO Super Comments page :-)

Ignoring the few lines above, I’ve used the phrase “Matt Cutts” only once; if I were serious about gaining Matt Cutts SERPs I’d ideally need to do more. If this comment truly were about Matt Cutts (it isn’t really) I’d ideally use his name a heck of a lot more than once in the title and once in the body content. When writing with SEO in mind this isn’t difficult: where I wrote “use his name” above, changing it to “use Matt Cutts’ name” would add another instance of the phrase without being SPAMMY. However, most webmasters don’t write this way, and those who are lazy will use grey/blackhat SEO techniques like adding keyword lists at the base of the article. If I added a keyword list of Matt Cutts related phrases like this:

Matt Cutts duplicate content, Matt Cutts 30 days, Matt Cutts sopa etc...

This would clearly be SEO SPAM, and if I overdid it enough to push a poorly SEO’d article’s keyword density for “Matt Cutts” relatively high, I’m confident it wouldn’t pass a manual review. Currently I don’t think Google has a ranking algorithm targeting this, but maybe that’s what they are working on (it would make sense).
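To make the “wouldn’t pass a manual review” point concrete, here’s a toy check of the kind a reviewer (or an algorithm) could automate. It’s entirely my own illustration – the 50-word tail and both thresholds are arbitrary assumptions, not anything Google has described – but it captures the tell-tale sign: a pasted keyword list makes a phrase far denser at the tail of a page than across the page as a whole.

```python
import re

def looks_like_keyword_list(text: str, phrase: str, tail_words: int = 50) -> bool:
    """Toy heuristic: flag pages where a phrase is far denser in the
    last `tail_words` words (where pasted keyword lists usually live)
    than it is across the whole page."""
    words = re.findall(r"[\w']+", text.lower())
    target = phrase.lower().split()
    n = len(target)

    def density(chunk):
        spots = len(chunk) - n + 1
        if spots < 1:
            return 0.0
        hits = sum(chunk[i:i + n] == target for i in range(spots))
        return hits / spots

    tail, whole = density(words[-tail_words:]), density(words)
    # Dense at the end AND way denser than the page average.
    return tail > 0.05 and tail > 5 * whole
```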

Note how I used “Ranking” above instead of just “Algorithm”, giving us “Ranking Algorithm” – a match with the title without having to use blackhat SEO techniques.

Stallion 6.2 won’t be hit by this sort of Google algorithm change; Stallion doesn’t do keyword lists or anything like that.

“Exchange way too many links” – AKA excessive link building and buying links. For years Google has been improving how it deals with SEO link SPAM, and it’s difficult to determine exactly where they are right now. What’s clear is that excessive link building – like adding thousands of links with the same keyword-rich anchor text – and buying links are very risky; where that SEO grey line is I don’t know, so I err on the side of caution.

For a domain you don’t want to risk, never buy links that pass link benefit, and never use link SPAM like adding SPAMMY comments to blogs, forums etc… A SPAMMY link would be where a webmaster creates thousands of comments on blogs and forums just for the links; those links tend to have the same anchor text and be added in an automated way. Commenting on a blog and adding a link to your author name isn’t SPAM – not that it matters on most blogs, since they tend to be nofollow links.

The grey area is link exchanges and other ways to generate backlinks. I own 100 domains; it would be dead easy to add a link farm interlinking them all together, or I could add a sitewide link from all 100 domains to this site’s home page with the anchor text “WordPress SEO Theme”, and if that weren’t gaming the search engines it would probably push this site to number 1 in Google within a year of adding the links (links don’t pass full SEO benefit for about a year).

This domain is only one year old and is in a niche where it’s hard to generate natural links, so adding 100 domains’ worth of sitewide links would obviously be excessive link building and easy for Google to spot via its ranking algo. I tested this years ago and the domain was penalized, and since we can be sure the algo is better today, we have to be careful linking our own sites together. Instead I add links from my other relevant sites and a few others (less is more).
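As a back-of-envelope illustration of why those 100 sitewide links would be easy to spot – again my own toy sketch, with made-up domain names and an arbitrary threshold, obviously not Google’s real method – 100 referring domains all using identical keyword-rich anchor text is a footprint a one-line ratio test catches:

```python
from collections import Counter

# Hypothetical crawl data: (source_domain, target_domain, anchor_text).
links = [(f"site{i}.example", "this-site.example", "WordPress SEO Theme")
         for i in range(100)]

anchors = Counter(anchor for _src, target, anchor in links
                  if target == "this-site.example")
total = sum(anchors.values())
top_anchor, top_count = anchors.most_common(1)[0]

# 100 referring domains with 100% identical anchor text is nothing
# like a natural link profile, where anchors vary wildly.
if total >= 50 and top_count / total > 0.8:
    print(f"suspicious: {top_count}/{total} links share anchor '{top_anchor}'")
```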

Reciprocal links pretty much cancel themselves out, and I see no evidence Google is cracking down on them. Webmasters generally don’t trade PR3 links for PR7 links; it tends to be roughly the same PR. If I send you a PR4 link and you send me a PR4 link, overall there’s very little link benefit (PR) passed between the two sites. One domain will gain a little more PR than the other, but in the scheme of things it’s irrelevant. The benefit of having a link from another domain is important, but again, with reciprocal linking the amount passed by a single link is small. To make a real difference a site needs hundreds of links. I’ve been working as an SEO for around a decade and haven’t set up hundreds of reciprocal links – they are hard work! I bet you haven’t been able to set up 50 reciprocal links for one domain.
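A toy PageRank calculation makes the “cancel themselves out” point visible. This uses the textbook power-iteration PageRank formula on a made-up four-page graph – a sketch of the principle, not Google’s actual system:

```python
def pagerank(edges, d=0.85, iters=100):
    """Plain power-iteration PageRank over a dict of out-links."""
    nodes = list(edges)
    pr = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - d) / len(nodes) for n in nodes}
        for src, outs in edges.items():
            for dst in outs:
                new[dst] += d * pr[src] / len(outs)
        pr = new
    return pr

# Two equal two-page sites, before and after a reciprocal home-page link.
before = pagerank({"A_home": ["A_page"], "A_page": ["A_home"],
                   "B_home": ["B_page"], "B_page": ["B_home"]})
after = pagerank({"A_home": ["A_page", "B_home"], "A_page": ["A_home"],
                  "B_home": ["B_page", "A_home"], "B_page": ["B_home"]})

for pr in (before, after):
    print(round(pr["A_home"] + pr["A_page"], 3))  # 0.5 both times
```

The per-site PR totals don’t move – whatever flows out across the reciprocal link flows straight back – which is exactly why a reciprocal link adds so little, and why, as I say below, there’s no real need for a penalty.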

Google doesn’t have to penalize sites for reciprocal linking.

I expect when Google rolls out the new over-SEO’d ranking algorithm it will have little impact on my network of sites. I’ve never had a Google algorithm update have a major negative impact on my network: some sites go up, some go down, but nothing drastic in the down direction. A few algo updates have even increased traffic overall.

David