|
Hmm, I'd guess that it wouldn't have a positive effect on SERP rankings, but you might also consider just adding a line to your robots.txt to disallow Googlebot from accessing the folder that's going to be deleted. If you do that a few weeks before the pages should be delisted, it shouldn't cause any more trouble.
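For example, assuming the folder is named /old-folder/ (a placeholder — substitute your actual path), the robots.txt entry could look something like:

```
User-agent: Googlebot
Disallow: /old-folder/
```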
Or set up a custom 404 page that returns a 200 status code (a "soft 404"), but that could trigger duplicate content filters.
Just an idea...
|