Quote:
Originally posted by Dravyk
I have a site that uses RSS feeds generated via JavaScript and other embedded executables. As these pages are generated "within browser" -- much like an SSI does -- a look through a few search engine simulators suggests that SE spiders would find mostly blank pages there, while a browser would see constantly-updating content.
Anyone know of a fix or a script that will turn these into "static" pages that SEs will spider?
Here's an example:
http://www.submitcorner.com/
The feeds on the right side of the screen under "latest headlines" come from Moreover. The way I do it is: I get their data through an XML feed and have a script create the HTML code that will be used on the right panel. The HTML code is then saved into an SSI include and loaded at runtime. This way, browsers and bots see the same text, and nobody needs to know it's an XML feed.
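In case it helps, here's a rough sketch of that approach in Python. The XML element names (`article`, `headline_text`, `url`) and the sample feed are made up for illustration -- a real Moreover feed will have its own schema, and you'd fetch it over HTTP instead of using an inline string:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample feed; a real provider's element names will differ.
SAMPLE_XML = """<headlines>
  <article>
    <headline_text>Example headline one</headline_text>
    <url>http://example.com/story1</url>
  </article>
  <article>
    <headline_text>Example headline two</headline_text>
    <url>http://example.com/story2</url>
  </article>
</headlines>"""

def feed_to_html(xml_text):
    """Turn the XML feed into a plain HTML list of headline links."""
    root = ET.fromstring(xml_text)
    items = []
    for art in root.findall("article"):
        title = art.findtext("headline_text", "")
        link = art.findtext("url", "#")
        items.append('<li><a href="%s">%s</a></li>' % (link, title))
    return "<ul>\n%s\n</ul>" % "\n".join(items)

# Save the generated markup as a static include file. The page then
# pulls it in with an SSI directive like:
#   <!--#include virtual="/includes/headlines.html" -->
with open("headlines.html", "w") as f:
    f.write(feed_to_html(SAMPLE_XML))
```

Run that on a cron schedule and the include file stays fresh, while spiders just see ordinary static HTML in the served page.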
WG