Comment on Cloak Affiliate Links Tutorial by SEO Dave.
Generally speaking, most entries in a robots.txt file are a waste of time. The default is to allow spidering, so telling a bot it can spider something it would have spidered anyway achieves nothing.
Most of the directories and files you added to your robots.txt file are a waste of time, and one entry is potentially damaging:
Do you use the default setup of storing images under /wp-content/? If so, you are blocking them from being spidered via that Disallow rule.
You allow Googlebot to spider the uploads folder, but what about Bing and the other search engines?
And if you ever add a stand-alone file with a .php extension, it won't be spidered.
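The effect described above can be sketched with Python's standard-library robots.txt parser. The rules below are an assumption standing in for the tutorial's file, which isn't quoted in this comment: a blanket Disallow on /wp-content/ with a Googlebot-only Allow for uploads. (The .php wildcard case is left out, since `urllib.robotparser` doesn't support wildcard patterns.)

```python
import urllib.robotparser

# Assumed rules for illustration, not the tutorial's actual file.
rules = """\
User-agent: Googlebot
Allow: /wp-content/uploads/
Disallow: /wp-content/

User-agent: *
Disallow: /wp-content/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot may fetch uploads, but theme images under /wp-content/ stay blocked.
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/uploads/pic.jpg"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/themes/logo.png"))  # False
# Bingbot has no entry of its own, falls through to "*", and is blocked even from uploads.
print(rp.can_fetch("bingbot", "https://example.com/wp-content/uploads/pic.jpg"))    # False
```

Note the real search engines apply longest-match semantics to Allow/Disallow, but the outcome for these three URLs is the same.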
Personally, as a user with about 70 WordPress sites, I don't make much use of a robots.txt file; mine includes this:
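(The snippet itself appears to have been stripped from this comment. Given the description that follows — it slows crawling for some bots and Googlebot ignores it — it was presumably a Crawl-delay rule along these lines; the exact delay value is a guess:)

```
User-agent: *
Crawl-delay: 5
```

Bots that honour Crawl-delay (Bing's, for example) wait roughly that many seconds between requests; Googlebot ignores the directive entirely.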
That directive slows down spidering for some bots (Googlebot ignores it). I've had problems with too many bots spidering the hell out of my sites and causing resource issues on the server; if a few spiders slow their crawl rate, it helps a bit. Only use it if you have an issue (I have a lot of links to my sites, which means spiders pretty much take up permanent residence on my servers).
That's the entire contents of the robots.txt files on 95% of my sites (100 domains).
If some admin files are being indexed, or you want other parts of a site kept out of the index without wasting link benefit, see the Stallion WordPress SEO Plugin.