10-13-2010, 11:55 AM
RycEric
Confirmed User
Join Date: Apr 2009
Posts: 1,313
Quote:
Originally Posted by mkx
I am looking to protect my website's data and images against spiders, bots, data extraction, etc. I figure this can be done by limiting the number of requests and/or the amount of data an IP can transfer in a certain amount of time. However, I do not want to block Google or other major search engines from spidering my site, which I guess makes things more complicated. Does anyone know of such a script or method?
This can be circumvented by bots using proxy databases to rotate IPs. Just an FYI.
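For anyone finding this thread later, here is a minimal sketch of the per-IP throttling mkx describes, plus a reverse-DNS check so verified crawlers like Googlebot don't get blocked. The thresholds, function names, and crawler domain list are my own assumptions, not a ready-made script:

Code:
import socket
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60            # assumed sliding window length
MAX_REQUESTS = 120             # assumed per-IP cap inside the window

_recent_hits = defaultdict(deque)   # ip -> timestamps of recent requests

def is_verified_search_bot(ip):
    # Reverse-DNS the IP, check it resolves under a known crawler domain,
    # then forward-confirm so a spoofed User-Agent alone is not enough.
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com", ".search.msn.com")):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False

def allow_request(ip):
    # True if the IP is under the cap, or is a verified crawler.
    if is_verified_search_bot(ip):
        return True
    now = time.time()
    hits = _recent_hits[ip]
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()          # drop timestamps outside the window
    if len(hits) >= MAX_REQUESTS:
        return False            # serve a 429 or a captcha instead
    hits.append(now)
    return True

Note the counter is keyed on the source IP, so a scraper rotating through a proxy list gets a fresh allowance on every address, which is exactly the weakness pointed out above.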