I’ve been building websites and generating masses of free Google search engine traffic for over 10 years. In that time I’ve built the best WordPress SEO theme available online and made tons of money, yet I’ve never generated or used a Google XML sitemap.
When I decided to write this part of the WordPress SEO tutorial, I was entrenched in my SEO view that Google XML sitemaps are not only not required, they are a hindrance to SEO analysis: I’ve never used an XML sitemap, yet I’ve generated tens of millions of search engine visits to my own sites.
After researching what others think about XML sitemaps, I’m not so sure.
Rand Fishkin, for example, advises using XML sitemaps. Rand Fishkin is the founder of SEOmoz (now just Moz), and he tends to be on the ball SEO-wise: there aren’t many like him; most so-called SEO experts don’t have a clue :-(
Why I Do NOT Generate a Google XML Sitemap
For as long as I can remember, Google has found and ranked web pages based on links. Generally speaking, if Google couldn’t find a web page via a backlink (internal or incoming), it was pointless submitting the URL to Google via its free submission service or via an XML sitemap, since it won’t generate traffic. The goal of good SEM isn’t to get a million pages spidered by Googlebot and indexed in Google; it’s to persuade Google to send a million visitors to your site. Indexed pages are not the same as search engine visitors.
All my current websites were either built manually by me or run under WordPress with the WordPress SEO theme (Stallion Responsive) I develop. I know precisely how Googlebot will find my sites and spider through each site’s link structure. There are no URLs on any of my sites that I want indexed in Google that Googlebot can’t easily find without a sitemap. So, for my sites, I don’t need to submit a Google XML sitemap for Google to FIND a particular URL.
That in itself is not a good reason for not creating and submitting an XML sitemap.
My good reason is this: I want to know whether an important part of a site isn’t indexed in Google because of a lack of link benefit flowing to that section (Google can find the URL through links, but there aren’t enough links for Googlebot to visit regularly) or for some other reason. Using sitemaps just to get a URL indexed could distract from seeing backlink problems.
Basically, if there aren’t enough backlinks to keep Googlebot spidering an important URL on a regular basis, I want to know sooner rather than later so I can do something about it. If an entire site is regularly spidered just because there’s a Google XML sitemap, I’ve lost an important diagnostic SEO tool.
In my SEO experience there’s no long-term benefit in Googlebot finding a URL quickly UNLESS the information is time-sensitive.
If I found that a URL I wanted indexed and ranked tended not to be spidered regularly, I could add more links. Adding an XML sitemap would mean I’d lose this valuable SEO data.
Do XML Sitemaps Increase Google Traffic?
This is an important SEO question and one I’ll now need to test :-) so right now I don’t know for sure. I have always seen XML sitemaps in the same category as search engine submissions: a waste of time. Yes, Google will send its spider to all pages of a site, but will posts Googlebot couldn’t find via links actually generate traffic?
I suspect not, but I’ll have to devise a fair SEO test, since none of my sites (over 130 domains’ worth) have inaccessible content I want indexed.
Google XML Sitemaps SEO Test
I’ve thought of an SEO test. First, some background on Stallion SEO Super Comments.
As I write this it’s 8th March 2014. Although I’ve owned this domain for well over a year, the first content was added in December 2013 (a couple of posts) and the bulk of the content was added in early February 2014, so most of this site went live over the past few weeks. Currently there are around 30 WordPress posts and a few static pages: a small site that will be easily spidered by Googlebot (it’s easy to check whether fewer than 50 pages are indexed).
Many of the posts on this site are not new; I moved them from other SEO-relevant sites I own (4 or 5 of the SEO Tutorial articles, for example) to consolidate WordPress SEO content on one domain (this one).
The moved posts came with over 300 comments, and since I use the Stallion SEO Super Comments feature, a lot of them have been spidered and indexed by Googlebot. You’ll note the discrepancy between ever crawled (over 1,000) and indexed (364); some of that is because I also moved comments between posts (using move-comments WordPress plugins), so for a time Google would have found the same comments on two posts (Google will figure out they’ve been moved eventually :-)).
Use the site:domain search in Google to see what’s indexed on a domain:
Right now Google has indexed 422 pages from this site. The number of pages isn’t always accurate (note the difference between a Google search and the Webmaster Tools Index Status); on some searches I see over 1,000 pages indexed, which is not true. Still, it’s a good guide when you don’t own the domain but want to see how much is indexed.
This means I have over 300 comments indexed in their own right as Stallion SEO Super Comments. Although it’s not a perfect way to view them all, you can get an idea with this Google site search:
This searches this domain for pages that include “?cid=”, which is part of the URL of every Stallion SEO Super Comment. You can use this search format to look for indexed pages covering specific keyphrases and so see how well you are covering a niche: try “Panda SEO” (88 results, covered well) or “Hummingbird SEO” (1 result, need to work on that niche :-)).
It’s not perfect, since this article will also show up (once indexed) because I’ve included “?cid=” in the body text.
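For reference, the site: searches described above look like this typed into Google (this site’s domain; the result counts quoted will of course change over time):

```text
site:stallion-theme.co.uk "?cid="
site:stallion-theme.co.uk "Panda SEO"
site:stallion-theme.co.uk "Hummingbird SEO"
```

The first finds indexed Super Comment pages; the other two check how well a keyphrase niche is covered on the domain.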
I use the Stallion SEO Super Comments feature to get content indexed that might not warrant a post. With a comment you can be much less formal (it’s a heck of a lot quicker to run off a comment than a post), and of course my visitors comment too. By changing the comment titles (another Stallion feature) I can target their comments at relevant SERPs. For example, in the screenshot above you can see a comment indexed with the title:
WordPress SEO Tutorial : Nofollow DELETES Link Benefit
Read the comment Nofollow DELETES Link Benefit and you can see it’s informal: a really quick comment (quick for me, anyway) that took me 5-10 minutes to write. I wouldn’t write an SEO tutorial article just to make that one point (the XML sitemap article will take a couple of hours to write, not including research), but by using the indexed comments I have long tail top 10 Google SERPs like:
Nofollow DELETES Link Benefit
Nofollow Link Benefit
I have hundreds of long tail SERPs like these for SEO Super Comments.
What Happened to the Google XML Sitemap SEO Test?
Now you know how the Stallion SEO Super Comments work, you’ll note only largish comments generate a link to the super comment. On this site I have it set to only add the link for comments with 400 or more characters in the comment body text; for shorter comments the link isn’t added. So we have some shorter comments that do generate an SEO Super Comment (the URL exists for all comments, even single-word comments), but they are NOT linked to from anywhere: I don’t want to waste link benefit on a “Great Article” comment.
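The link-gating rule can be sketched like this (a hypothetical illustration in Python, not the actual Stallion theme code; the function name and setting name are made up):

```python
# Hypothetical sketch of the Super Comment link rule; not actual Stallion code.
MIN_COMMENT_LENGTH = 400  # site setting: minimum characters before a link is added


def super_comment_link(post_url: str, comment_id: int, comment_text: str):
    """Return the Super Comment URL to link to, or None if the comment is too short.

    The Super Comment URL itself exists for every comment (even one-word
    comments); this rule only decides whether a link to it is published,
    so link benefit isn't wasted on "Great Article" comments.
    """
    if len(comment_text) >= MIN_COMMENT_LENGTH:
        return f"{post_url}?cid={comment_id}"
    return None
```

So a 400+ character comment gets a published link to its `?cid=` URL, while a short “Great Article” comment gets none, even though the URL still exists.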
My XML sitemap SEO test will involve finding a way to generate a Google XML sitemap that includes links to all SEO Super Comments, including the short ones (they’ll be perfect for this SEO test), creating some short comments targeting specific long tail keyword SERPs, and seeing whether having them found ONLY via an XML sitemap generates Google traffic.
That’s going to be fun to code :-(
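As a rough sketch of what that code needs to do (shown in Python purely for illustration; the real implementation is a WordPress plugin modification, and the URLs and data here are made up):

```python
# Rough illustration of building an XML sitemap that includes every
# Super Comment URL, even the unlinked short-comment ones.
# Not the real plugin code; URLs and data are invented for the sketch.
import xml.etree.ElementTree as ET


def build_sitemap(post_urls, comment_ids_by_post):
    """Build sitemap XML from post URLs plus all their comment IDs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for post_url in post_urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = post_url
        # Include every comment URL, regardless of comment length.
        for cid in comment_ids_by_post.get(post_url, []):
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = f"{post_url}?cid={cid}"
    return ET.tostring(urlset, encoding="unicode")


sitemap = build_sitemap(
    ["http://example.com/seo-tutorial/"],
    {"http://example.com/seo-tutorial/": [101, 102]},
)
print(sitemap)
```

The key point is in the inner loop: every comment ID produces a `<url>` entry, so URLs with no links anywhere on the site still end up in the sitemap for Googlebot to find.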
In my SEO opinion, if hard-to-find content (no backlinks, internal or otherwise) doesn’t generate search engine traffic, there’s no point in Google spidering and indexing it.
I’ll post the sitemap SEO test results when I have them (no time frame).
XML Sitemap SEO Test Update
It only took an hour or so to create a modified version of the Google XML Sitemaps WordPress plugin (I used v3.4) that outputs all Stallion SEO Super Comments URLs whether the comment is large or small. So on 8th March 2014 I submitted an XML sitemap to Google.
XML sitemap URL: http://stallion-theme.co.uk/sitemap.xml
Currently there are 402 URLs; the URLs including “?cid=” are the URLs of interest, though not all of them lack internal links, so only the smaller comments aren’t already indexed. It will be interesting to see whether the smaller comments rank for any of the SERPs their titles target: even though they are small comments, I had already given every comment on this site an SEO’d comment title.
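For context, the entries in a sitemap.xml like the one above follow the standard sitemap protocol, with the “?cid=” Super Comment URLs sitting alongside the normal post URLs. An illustrative fragment (example URLs, not the actual file contents):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- A normal post URL -->
  <url>
    <loc>http://stallion-theme.co.uk/example-post/</loc>
  </url>
  <!-- A Super Comment URL, included even when the comment is short -->
  <url>
    <loc>http://stallion-theme.co.uk/example-post/?cid=123</loc>
  </url>
</urlset>
```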