How Do You Stop Fucking Baidu Indexing Sites? It ignores robots.txt - anyone know what IPs it uses?

What do you have against Scott Baio?

Don't you like all that CN traffic?

fucking racist!! :disgust :1orglaugh :1orglaugh :1orglaugh

Just have your host blacklist China traffic.

robots.txt is like asking your neighbor to keep his dog off your lawn; .htaccess is like installing an electric fence :2 cents:

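For anyone finding this later, the "asking nicely" half looks like this - a robots.txt in the site root using Baidu's documented spider token (which, as the OP says, it doesn't always honor):

```
User-agent: Baiduspider
Disallow: /
```
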
Quote:
My own solution to the problem is to firewall any IP that presents a Baidu user-agent.

I would never turn down free traffic. If you don't want China leeching your resources, redirect it somewhere useful :2 cents:

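A rough sketch of the firewall-the-user-agent approach quoted above: scan your access log for Baiduspider hits and emit iptables DROP rules. The combined log format and the iptables invocation are assumptions - adjust for your own setup, and eyeball the list before running anything as root.

```python
import re

# Apache "combined" log format (an assumption -- adapt the regex to your
# own logs): client IP is the first field, user-agent the last quoted one.
LOG_RE = re.compile(r'^(\S+) .*"([^"]*)"$')

def baidu_ips(log_lines):
    """Collect unique client IPs whose user-agent mentions Baiduspider."""
    ips = set()
    for line in log_lines:
        m = LOG_RE.match(line.strip())
        if m and 'Baiduspider' in m.group(2):
            ips.add(m.group(1))
    return sorted(ips)

def iptables_rules(ips):
    """Emit one DROP command per offending IP; review before running as root."""
    return ['iptables -A INPUT -s %s -j DROP' % ip for ip in ips]

sample = [
    '220.181.108.75 - - [01/Jan/2011:00:00:00 -0700] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"',
    '10.0.0.1 - - [01/Jan/2011:00:00:01 -0700] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
for rule in iptables_rules(baidu_ips(sample)):
    print(rule)  # iptables -A INPUT -s 220.181.108.75 -j DROP
```

Blocking by IP instead of just returning 403 saves the TCP connections too, but the spider's IP pool changes, so you'd re-run this against fresh logs.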
Quote:
If your site has a decent number of pages and/or it is dynamically generated then Baidu really is just wasting resources.

I might be interested in taking all Chinese traffic.

If Baidu spider traffic actually matters for your server performance, maybe it's time to upgrade.

Just put this in the .htaccess file in the root of each domain:

# Block bad spiders
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Sosospider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Sogou
RewriteRule ^.* - [F,L]

You can add as many as you want; just make sure each line ends with [NC,OR] and the last one with nothing. Been using it for years and it's well tested...

Quote:
The main reason is that it inflates my traffic to sponsors so much it's hard to see what the real conversions are - if you are as into stats as me, it is a real pain in the ass... I am moving towards using GeoIP scripts and sorting traffic that way, but it takes time...
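One way the GeoIP sorting mentioned in that quote could work: tally clicks per country so the spider-heavy CN traffic can be split out of sponsor stats. The lookup table below is a made-up stand-in for illustration; in practice you'd resolve IPs with something like the geoip2 library plus a GeoLite2 country database.

```python
from collections import Counter

# Stand-in country lookup -- values are made up for illustration only.
FAKE_GEOIP = {
    '220.181.108.75': 'CN',
    '66.249.66.1': 'US',
    '81.2.69.142': 'GB',
}

def clicks_by_country(ips, lookup=FAKE_GEOIP.get):
    """Tally clicks per country so CN bot hits can be separated
    from the traffic that might actually convert."""
    return Counter(lookup(ip, '??') for ip in ips)

clicks = ['220.181.108.75', '66.249.66.1', '220.181.108.75', '81.2.69.142']
stats = clicks_by_country(clicks)
print(stats['CN'])  # 2 -- the inflated chunk you may want to exclude
```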
All times are GMT -7. The time now is 07:59 AM.
	Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
	