I have a cool idea. Take it and run with it.
Use distributed computing to become the crawler.
For example, Google: rather than having their servers do the crawling and declare themselves as Googlebot, they should use the Google Toolbar to take computers that are not in use and let them do the crawling. Google would get: 1) Unpredictable IPs. No SERP spammers could predict Googlebot's IPs and serve up different content for them. 2) Well... that's pretty much it. The new distributed Googlebot network could also lie about its browser version and referring URLs, forcing sites that cloak to always divulge their real content. As a result, results would be more relevant. Just a thought... stupid? too naive? |
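The anti-cloaking part of the idea above can be sketched in a few lines. This is purely illustrative and not Google's actual design: a distributed node would fetch pages with a rotating, browser-like User-Agent and no bot-identifying headers, so a cloaking site can't single the crawler out. The `BROWSER_AGENTS` pool and `build_stealth_request` helper are hypothetical names made up for this sketch.

```python
# Illustrative sketch only (assumed design, not Google's): a distributed
# crawler node disguises each fetch as an ordinary browser visit.
import random
import urllib.request

# Hypothetical pool of ordinary browser User-Agent strings.
BROWSER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/115.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 Safari/605.1.15",
]

def build_stealth_request(url: str) -> urllib.request.Request:
    """Build a request that looks like an ordinary browser visit."""
    req = urllib.request.Request(url)
    # Pick a random browser identity instead of declaring "Googlebot".
    req.add_header("User-Agent", random.choice(BROWSER_AGENTS))
    # Deliberately send no Referer header and nothing bot-identifying.
    return req

req = build_stealth_request("http://example.com/")
print(req.get_header("User-agent"))
```

Because each toolbar node would also have its own residential IP address, the site being crawled would have no reliable signal (IP, User-Agent, or referrer) to cloak against.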
I believe I read on the Google blog that this, or something like it, is in the works.
|
Well, be fucked if I'm going to let a company make a profit off my power bill and my computer resources.
If they wanted to use my cycles for such a thing, they would have to pay me. |
Alexa uses their toolbar to track hits to sites. That is a more passive use.
I don't think Google is gonna pick up on this one... |
I feel like I'm running with my hands empty... ;(
|
By the way, this fucking skin sucks, only because it's broken.
|
I am too drunk to understand this thread :1orglaugh
|
The first application is already live: http://toolbar.google.com/dc/offerdc.html
A distributed system for harvesting idle CPU time via the toolbar is already set up; they just have to point it at crawling. As an SEO, I for one hope this never happens. WG |
Quote:
With my permission or without, I still wouldn't donate my cycles to a business that makes a huge-ass profit every year. |
It's called the Google Toolbar, and despite denials, they have been using it for years.
Alex |
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc