02-22-2010, 11:40 AM
raymor
Confirmed User
 
Join Date: Oct 2002
Posts: 3,745
Quote:
Originally Posted by rowan
My site has 200 million pages so technically googlebot isn't fetching fast enough... at the rate of 120k fetches per day it would take 4 1/2 years to index everything. At this point the benefit of indexing 100% of the site (or at least as much as it's trying to) isn't worth the load it's placing on the server.
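For what it's worth, the 4 1/2 year figure in the quote holds up to a quick back-of-envelope check (a minimal sketch, assuming the quoted numbers of 200 million pages and roughly 120,000 fetches per day, with each page crawled exactly once):

Code:
# Back-of-envelope check of the crawl-time estimate quoted above.
# Assumed inputs come straight from the quote: 200 million pages,
# ~120,000 Googlebot fetches per day, one fetch per page.
TOTAL_PAGES = 200_000_000
FETCHES_PER_DAY = 120_000

days = TOTAL_PAGES / FETCHES_PER_DAY   # ~1,667 days
years = days / 365                     # ~4.6 years

print(f"{days:,.0f} days, roughly {years:.1f} years, to crawl every page once")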
200 MILLION pages? I'm curious, is that 200 million legitimate pages, or 200 million pieces of fake SE spam crap? If you tried to spam the crap out of Google by creating 200 million bogus pages, I'd say you got what you deserved, and really what you asked for. If you pretended to have 200 million pages so that Google would spider you 200 million times, that was your decision. You can't blame Google if you chose to create fake stuff for them to spider.

Note the repeated use of "IF" - I'm asking IF that's what you did.
__________________
For historical display only. This information is not current:
support@bettercgi.com ICQ 7208627
Strongbox - The next generation in site security
Throttlebox - The next generation in bandwidth control
Clonebox - Backup and disaster recovery on steroids