05-16-2006, 09:13 PM
Quick Buck
Confirmed User
 
Join Date: Feb 2006
Location: Free-Trials.......... Weekly-Payouts..... 100+Sites
Posts: 1,026
I have a cool idea. Take it and run with it.

Use distributed computing to become the crawler.

For example, Google: rather than having their own servers do the crawling and declare themselves as Googlebot, they should use the Google Toolbar to take computers that are sitting idle and let those machines do the crawling.

Google would get:
1) Unpredictable IPs. SERP spammers could no longer predict Googlebot's IPs and serve up different content for them.
2) Well.. that's pretty much it. The new distributed Googlebot network could also lie about its browser version and referring URLs, forcing sites that cloak to always divulge their real content.

As a result, search results would be more relevant.
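
To make it concrete, here's a rough sketch (Python, purely illustrative; the user agent string and the queue/indexer helpers in the comments are made up, not anything Google actually runs) of what one idle toolbar client might do:

```python
import urllib.request

def crawl_as_regular_browser(url, referrer="https://www.google.com/"):
    """Fetch a page the way an ordinary visitor would, so cloaking scripts
    that key off the Googlebot user agent or known crawler IPs serve the
    same content everyone else sees."""
    request = urllib.request.Request(
        url,
        headers={
            # An ordinary desktop browser user agent instead of "Googlebot/2.1"
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
            # A believable referring URL, per the "lie about referring URLs" point
            "Referer": referrer,
        },
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read()

# An idle toolbar client on a home connection would then loop, roughly:
#   job = fetch_next_job_from_central_queue()    # hypothetical API
#   html = crawl_as_regular_browser(job.url)
#   report_back_to_indexer(job.url, html)        # hypothetical API
```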

Just a thought.. stupid? Too naive?
__________________
$50 FREE TRIALS! Every Day til 2008!!!
Only at QuickBuck
