Comment on Cloak Affiliate Links Tutorial by SEO Dave.

Link Cloaking PHP Script

The short answer is: cloak any links you don’t want to send SEO benefit to.

For me that includes all affiliate links.

It does not include links to my own sites (unless there’s a reason I don’t want two of my sites linked together).

And it does not include ordinary outbound links in general.
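To make the cloaking idea concrete: the page links to a local URL like /go/hosting, and a small script redirects the visitor to the real affiliate URL while telling bots not to index the redirect itself. The tutorial's script is PHP; this is only a minimal sketch in Python, and the slugs, URLs and function name are made-up examples, not the tutorial's actual code.

```python
# Minimal sketch of link cloaking: map a local slug to an affiliate URL
# and answer with a 302 redirect. Slugs and URLs are hypothetical examples.
AFFILIATE_LINKS = {
    "hosting": "https://example.com/affiliate?id=12345",
    "themes": "https://example.net/ref/67890",
}

def cloak_redirect(slug):
    """Return (status, headers) for a cloaked slug, or 404 if unknown."""
    target = AFFILIATE_LINKS.get(slug)
    if target is None:
        return 404, {}
    return 302, {
        "Location": target,
        # Belt and braces: ask search engines not to index or follow
        # the redirect URL itself.
        "X-Robots-Tag": "noindex, nofollow",
    }
```

On the page you'd then link to /go/hosting instead of the raw affiliate URL, so no SEO benefit flows to the merchant.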

I wouldn’t worry about Googlebot leaving your site. Even if your site had no links off the domain, once it has spidered enough pages it will leave anyway and come back at regular intervals: the more incoming links you have from other indexed pages, the more often you’ll be spidered.

How many pages a bot will spider at a particular time seems random, and the links it follows are random, so it won’t hit your home page and follow every link on the page until it runs out. Bots act like a random link clicker: hit page A, randomly click a link, and over time all links get clicked.

This is why there’s an SEO myth that Google won’t follow more than 100 links from a page. The reality is Google advises not having more than 100 links on a page because each link (on a page with 100 links) has roughly a 1 in 100 chance of being followed each time a spider hits the page.
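To put a number on that: under this random-clicker model, the chance a specific link has still not been followed after v bot visits is (1 − 1/100)^v. A quick sketch (my own illustration of the model above, not a published Google figure):

```python
def prob_never_followed(n_links, visits):
    """Chance a specific link is never picked, assuming each bot visit
    follows one link chosen uniformly at random from n_links links."""
    return (1 - 1 / n_links) ** visits

print(prob_never_followed(100, 1))    # 0.99 after a single visit
print(prob_never_followed(100, 100))  # ~0.366: still ~37% unfollowed after 100 visits
```

So even after 100 spider visits to a 100-link page, any given link has roughly a one in three chance of never having been followed.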

If you have a page with 1,000 links, it’s going to take a LONG time for all 1,000 links to be followed randomly. And if that page links to an important page you really want indexed, and that page has few or no links from anywhere else, it might never be indexed.
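Under the same random-clicker model, the average number of bot visits before every one of the n links has been followed at least once is the classic coupon-collector figure, n × (1 + 1/2 + … + 1/n), which grows quickly with n. A rough sketch (my own arithmetic illustrating the model, not a published crawl statistic):

```python
def expected_visits_to_follow_all(n_links):
    """Coupon-collector expectation: n * (1 + 1/2 + ... + 1/n) visits,
    assuming each visit follows one link chosen uniformly at random."""
    harmonic = sum(1.0 / k for k in range(1, n_links + 1))
    return n_links * harmonic

print(round(expected_visits_to_follow_all(100)))   # ~519 visits for 100 links
print(round(expected_visits_to_follow_all(1000)))  # ~7485 visits for 1,000 links
```

So going from 100 to 1,000 links on a page multiplies the expected wait for full coverage by well over ten, which is why a weakly-linked important page can sit unindexed for a long time.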

The best advice on the number of links per page is that it depends on the site and how many other sites link into it (and how popular the pages those incoming links come from are). The more incoming links you have, the more often a bot will find your site and the more links it will follow within your site.

You can submit an XML sitemap to Google, but I don’t see the point: if Google can’t find the pages of your site naturally via incoming links, it isn’t going to rank them well just because you’ve added an XML sitemap of everything.

I personally don’t want Google to find a page via an XML sitemap. If I think I’ve done everything right and a page isn’t indexed through random spidering, I want to know; having it indexed via an XML sitemap means I might not realise there’s a problem. If an important page isn’t being spidered regularly, it’s easily solved (once I know there’s an issue): add more links directly to that page. Remember, getting a page indexed is a waste of time if it doesn’t also generate traffic.

Well, that comment went all over the place :-)