noindex meta tag - delayed effect?
Hello,
Some time ago I decided to remove my tag and pagination (category) pages from the SERPs: Code:
<meta name="robots" content="noindex,follow" /> Any idea why it is taking so long?
It depends. I have seen about a week's delay when changing titles and the like, but if you want Google to remove pages from the SERPs, it may never happen at all. For example, I accidentally let Google index a duplicate test site; Google started sending visitors there too, and I had a real mess to deal with. All this despite the fact that the site was only open to Google for a day or so before I forbade indexing, yet Google didn't listen. So I did what I should have done in the first place: I put the whole site behind a password.
Long story short: Google may never honor your request. It is, after all, only a request.
Quote:
Advice for others: use noindex on tag / category pagination pages right from the start. Those pages don't have unique content anyway, so they will be treated as duplicate content.
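One thing worth doing before waiting weeks on Google is confirming that the pages actually serve the noindex tag. A minimal sketch in Python (stdlib only; the helper names are my own invention, not from any SEO tool) that checks a page's HTML for a robots meta directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collect the content attribute of every <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "robots":
                self.robots.append(d.get("content") or "")


def has_noindex(html_text):
    """Return True if the HTML contains a robots meta tag with noindex."""
    parser = RobotsMetaParser()
    parser.feed(html_text)
    return any("noindex" in content.lower() for content in parser.robots)
```

You would feed it the HTML of a tag or pagination page (fetched with urllib, curl, or your browser's view-source). If this returns False, the delay isn't Google's fault; the tag isn't being served.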