Question for SEO Gurus out there
Hey all... question for the SEO Gurus:
If you have a dynamic link that does not exist all the time, how do you stop engines from indexing it or validators from flagging it as invalid? For example, I have a link that looks like this:

/Directory/SubDirectory/ProgramName.cgi/content

Sometimes this content is online and sometimes it is not. I have tried inserting rel="nofollow" into the link, but validators still see it as a 404 error. What am I missing?
Use .htaccess to omit it.
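One way to do that in .htaccess is an X-Robots-Tag response header, which tells crawlers not to index anything served from that directory. A minimal sketch, assuming Apache 2.4 with mod_headers enabled and placed in /Directory/SubDirectory/.htaccess (the path is the one from the question):

```apache
# Sketch, not a drop-in config: assumes Apache 2.4 + mod_headers.
# Ask search engines not to index or follow anything under this directory.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Unlike a meta tag, the header also covers non-HTML responses from the CGI program.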
Put this into your robots.txt:

User-agent: *
Disallow: /Directory/SubDirectory/

or:

User-agent: *
Disallow: /Directory/SubDirectory/ProgramName.cgi
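A quick way to sanity-check those robots.txt rules before deploying them is Python's standard-library robot parser. A small sketch using the paths from this thread:

```python
from urllib import robotparser

# Parse the suggested robots.txt rules directly from a list of lines.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /Directory/SubDirectory/",
])

# The dynamic page is blocked for all well-behaved crawlers...
print(rp.can_fetch("*", "/Directory/SubDirectory/ProgramName.cgi/content"))  # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("*", "/index.html"))  # True
```

Note this only verifies what compliant bots will skip; it does nothing for page validators, which ignore robots.txt.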
Hi Muad'Dib,

Thanks for the info. For now, page validators keep showing the pages I don't want indexed anyway, so I'm thinking that only .htaccess will work in this case. I'll use your suggestion too, just to be sure.

I already have rel="nofollow" in the links to those pages. Do you think I should use rel="noindex" too?
You are basically handing the address straight to page validators, so they go and check it. But when SE bots come to your site, they read the robots.txt instructions first and don't crawl those directories, because it's forbidden.
SEO Guru, hit me up sometime!
You want this between the <head> tags on that page:

<meta name="robots" content="noindex">
rel="nofollow" has nothing to do with what you're trying to do.
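Since the content comes and goes, the CGI program itself can emit the noindex tag only while the content is offline. A minimal Python sketch (render_head and the content_online flag are hypothetical names for illustration, not from the thread):

```python
def render_head(content_online: bool) -> str:
    """Build the <head> block, adding noindex while the content is offline."""
    # Hypothetical helper: when the dynamic content is unavailable,
    # ask search engines to drop the page from their index.
    robots = "" if content_online else '<meta name="robots" content="noindex">\n'
    return "<head>\n" + robots + "<title>ProgramName</title>\n</head>"

print(render_head(False))
```

Returning a 404 or 410 status alongside the tag when the content is missing would also satisfy the validators, since the response then honestly reports the page's state.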
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.