Why does Google decide Robots.txt no longer works?
On a shared server, none of my sites have a robots.txt. Google has always been happy with that, then one day it decides it's no longer happy with any of the sites on that server and claims it can't access the robots.txt for them. I gather Google expects a 404 if there's no robots.txt, but my server returns a 500 error, and according to forums that causes a problem.
Has anyone else had this type of problem? It should be easy enough to fix, but does anyone know what likely caused the change? A change of settings on the shared server? Cheers :thumbsup
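To check what the server is actually handing back, here's a minimal Python sketch (example.com is just a placeholder for your own domain):

import urllib.request
import urllib.error

# Fetch robots.txt and report the HTTP status code.
# Replace example.com with your own domain.
url = "http://example.com/robots.txt"
try:
    with urllib.request.urlopen(url) as resp:
        print(resp.status)  # 200: Google reads the file normally
except urllib.error.HTTPError as e:
    # 404 means "no robots.txt, crawl everything";
    # a 5xx makes Googlebot back off because it can't tell what it may crawl.
    print(e.code)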
When did robots.txt become a server issue? Just add one to your public_html folder.
Quote:
If a script you're running generates a 500 error on a not-found robots.txt, Google won't spider the site.
I've had this happen to me before.
Googlebot is probably assuming (reasonably) that if a fetch attempt returns a 500, your site has problems.
I'd be more concerned about why your server is generating a 500 internal server error for a simple 404 file not found.
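One way to narrow that down is to compare robots.txt against some other path that definitely doesn't exist; if both come back as 500, the server is mishandling every missing file, not just robots.txt. A rough sketch (example.com and the junk path are made-up placeholders):

import urllib.request
import urllib.error

def status(url):
    # Return the HTTP status code for a GET, whether success or error.
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# If both paths return 500, a rewrite rule or custom error script is likely
# breaking 404 handling across the whole site, not just for robots.txt.
for path in ("/robots.txt", "/definitely-not-here-12345"):
    print(path, "->", status("http://example.com" + path))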
Quote:
Quote:
never question GOD!

Blame the Illuminati.
The 500 is your issue, you have to solve it: either give it a 404 or a 200.
If Google is asking for it, then give it one. Create a blank robots.txt, I've done this before, or the bot keeps hitting it and floods the fucking error logs.
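If you want to script it, a minimal sketch (assuming a typical cPanel-style layout where the web root is ~/public_html; adjust the path to your own setup):

from pathlib import Path

# Write a permissive robots.txt into the web root so requests for it
# return 200 instead of a 500. The path assumes ~/public_html is the
# document root - change it if yours differs.
robots = Path.home() / "public_html" / "robots.txt"

# "Disallow:" with no value means nothing is disallowed, i.e. crawl
# everything - the same effect as having no robots.txt, but with a clean 200.
robots.write_text("User-agent: *\nDisallow:\n")
print("Wrote", robots)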
Quote:
Added a blank robots.txt, will see what Mr G thinks :2 cents:
google fucking sucks.
Quote:
Your thread title is backwards.