Quote:
Originally Posted by Tempest
I hear ya... Google is partly to blame... They could very easily flag those types of sites for additional crawling of external javascript files etc., but they choose not to for some reason.
Actually, there is a potential option, but it's not 100%. You set up a bot trap on your site. It needs to be a more advanced type, because these guys are getting more brazen and are setting up their crawlers to pretend to be Googlebot, Slurp, etc.
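One way to catch those impostors is the reverse/forward DNS check that Google itself documents for verifying Googlebot: reverse-resolve the client IP, check the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch (the function name is mine, and you'd want caching and a timeout in production):

```python
import socket

def is_real_googlebot(ip):
    """Return True only if `ip` passes Google's documented
    reverse-then-forward DNS verification for Googlebot."""
    # Step 1: reverse DNS lookup on the client IP
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    # Step 2: hostname must belong to Google's crawler domains
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False
    # Step 3: forward-confirm -- the hostname must resolve back to the same IP
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
    return ip in forward_ips
```

A scraper faking the Googlebot user-agent string fails at step 2, because its IP reverse-resolves (if at all) to its own hosting provider, not google.com. Yahoo's Slurp can be verified the same way against crawl.yahoo.net.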
Yeah I don't want to stop valid bots from crawling, just the scrapers... sigh.