I have a cool idea. Take it and run with it.
Use distributed computing to become the crawler.
For example, Google: rather than having their own servers do the crawling and declare themselves as Googlebot, they could use the Google Toolbar to tap computers that are sitting idle and let those machines do the crawling.
Google would get:
1) Unpredictable IPs. SERP spammers could no longer recognize Googlebot's IP addresses and serve up different content for them.
2) Well.. that's pretty much it. The new distributed Googlebot network could also lie about its browser version and referring URLs, forcing sites that cloak to always divulge their real content (a rough sketch of this is below).
As a result, the search results would be more relevant.
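
A minimal sketch of what one of those idle-machine workers might do, just to make the idea concrete: fetch a page while looking like an ordinary browser visit, with a rotating User-Agent and Referer. The URLs, header values, and function names here are all hypothetical, and a real system would obviously pull URLs from a central queue and report results back.

```python
# Hypothetical "toolbar" crawler worker: fetch a page the way a normal
# browser visit would look, so cloaking sites can't single it out.
import random
import urllib.request

# Plausible browser User-Agents to rotate through (illustrative values).
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Gecko/20100101 Firefox/115.0",
]

# Referring URLs a real visitor might arrive from (illustrative values).
REFERRERS = [
    "https://www.google.com/search?q=example",
    "https://news.example.com/",
]

def crawl(url: str) -> bytes:
    """Fetch a page with randomized browser-like headers."""
    request = urllib.request.Request(
        url,
        headers={
            "User-Agent": random.choice(USER_AGENTS),
            "Referer": random.choice(REFERRERS),
        },
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read()

if __name__ == "__main__":
    # A real worker would receive URLs from Google; here we just fetch one.
    html = crawl("https://example.com/")
    print(len(html), "bytes fetched")
```

Because each fetch comes from a different home machine with its own IP and ordinary-looking headers, there is nothing for a cloaking site to fingerprint.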
Just a thought.. stupid? Too naive?