GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   SE experts, need advice on pages with feeds (https://gfy.com/showthread.php?t=316724)

Dravyk 06-23-2004 12:03 PM

SE experts, need advice on pages with feeds
 
I have a site that uses RSS feeds generated via JavaScript and other embedded executables. Because these pages are generated "within the browser" -- much like an SSI does -- a look through a few search engine simulators suggests that SE spiders would find mostly blank pages there, while a browser would see constantly updating content.

Anyone know of a fix or a script that will turn these into "static" pages that SEs will spider?

WiredGuy 06-23-2004 12:05 PM

Quote:

Originally posted by Dravyk
I have a site that uses RSS feeds generated via JavaScript and other embedded executables. Because these pages are generated "within the browser" -- much like an SSI does -- a look through a few search engine simulators suggests that SE spiders would find mostly blank pages there, while a browser would see constantly updating content.

Anyone know of a fix or a script that will turn these into "static" pages that SEs will spider?


Here's an example: http://www.submitcorner.com/
The feeds on the right side of the screen under latest headlines come from Moreover. The way I do it, is I get their data through an XML feed and have a script create the HTML code that will be used on the right panel. The html code is then saved into an SSI include and loaded on runtime. This way, browsers and bots will see the same text and nobody needs to know its an xml feed.

WG
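WiredGuy doesn't post his script, but the approach he describes -- fetch the XML feed server-side, render the headlines to HTML, and save the result as a static file that SSI pulls in -- can be sketched roughly like this. The feed URL and output filename here are placeholders, not details from the thread:

```python
# Sketch of the feed-to-static-include approach described above.
# FEED_URL and OUT_FILE are hypothetical; substitute your own.
import html
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.com/headlines.rss"  # placeholder feed URL
OUT_FILE = "headlines.inc"                     # pulled in via an SSI include

def render_headlines(xml_text, limit=10):
    """Turn RSS <item> titles/links into a static HTML list fragment."""
    root = ET.fromstring(xml_text)
    items = []
    for item in list(root.iter("item"))[:limit]:
        title = html.escape(item.findtext("title", ""))
        link = html.escape(item.findtext("link", ""))
        items.append('<li><a href="%s">%s</a></li>' % (link, title))
    return "<ul>\n" + "\n".join(items) + "\n</ul>\n"

def update_include(feed_url=FEED_URL, out_file=OUT_FILE):
    """Fetch the feed once and write the rendered HTML to disk."""
    with urllib.request.urlopen(feed_url) as resp:
        xml_text = resp.read()
    with open(out_file, "w") as f:
        f.write(render_headlines(xml_text))
```

Because the page only ever includes the pre-rendered file, spiders and browsers see identical static HTML.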

Basic_man 06-23-2004 12:20 PM

Quote:

Originally posted by WiredGuy
Here's an example: http://www.submitcorner.com/
The feeds on the right side of the screen under latest headlines come from Moreover. The way I do it is: I pull their data through an XML feed and have a script create the HTML code that will be used on the right panel. The HTML code is then saved into an SSI include and loaded at runtime. This way, browsers and bots see the same text and nobody needs to know it's an XML feed.

WG

Wow, nice explanation!

Dravyk 06-23-2004 12:30 PM

Very nice, WG!!

I had an RSS converter built for me a while back, as well as a site cacher for when those feeds hit Net congestion problems and hold up the page loads.

Looks like I'll also need to get another one made up to do the static thing, which is pretty much what I figured. Glad to see it's possible and it exists. Again, many thanks. :)

WiredGuy 06-23-2004 12:34 PM

Quote:

Originally posted by Dravyk
Very nice, WG!!

I had an RSS converter built for me a while back, as well as a site cacher for when those feeds hit Net congestion problems and hold up the page loads.

Looks like I'll also need to get another one made up to do the static thing, which is pretty much what I figured. Glad to see it's possible and it exists. Again, many thanks. :)

This also cuts back on load time and bandwidth significantly. If you're using XML feeds on the fly, you download the data, parse it, and spit out the results in real time. Using the method above, the load time is constant since you're just opening a file, and you eliminate constantly re-requesting the feed. I just refresh the data every 4 hours -- so from a couple thousand feed requests per day down to 6.

WG
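The refresh schedule WiredGuy mentions (every 4 hours, i.e. 6 fetches per day instead of one per visitor) could run from cron, or be checked lazily before each regeneration. A minimal staleness check, continuing the sketch above and assuming the same hypothetical include file:

```python
# Regenerate the static include only when it is older than the refresh
# interval, so the feed provider is hit a few times a day, not per surfer.
import os
import time

REFRESH_SECONDS = 4 * 60 * 60  # refresh every 4 hours, per the post

def is_stale(path, max_age=REFRESH_SECONDS, now=None):
    """True if the cached include is missing or older than max_age seconds."""
    if not os.path.exists(path):
        return True
    now = time.time() if now is None else now
    return (now - os.path.getmtime(path)) > max_age
```

A cron-style wrapper would then do `if is_stale("headlines.inc"): update_include()`, where `update_include` is the hypothetical fetch-and-write step from the earlier sketch.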

Namzo 06-25-2004 09:24 PM

Quote:

Originally posted by WiredGuy
Here's an example: http://www.submitcorner.com/
The feeds on the right side of the screen under latest headlines come from Moreover. The way I do it is: I pull their data through an XML feed and have a script create the HTML code that will be used on the right panel. The HTML code is then saved into an SSI include and loaded at runtime. This way, browsers and bots see the same text and nobody needs to know it's an XML feed.

WG

Is this technically considered leeching content by some feed providers?

Webby 06-25-2004 09:30 PM

WiredGuy

Nice idea!! :thumbsup

WiredGuy 06-25-2004 09:38 PM

Quote:

Originally posted by Namzo
Is this technically considered leeching content by some feed providers?
If you're licensing the content, it's perfectly fine. In fact, with the implementation I suggested above, you'd be using extremely little bandwidth from the feed provider, since you only request the feed a couple of times per day instead of once per surfer.

WG

smack 06-25-2004 10:40 PM

thanks for the tip WG. :thumbsup

*bump for some good info*

Abyss_Vee 06-25-2004 10:44 PM

good stuff wg :thumbsup

Namzo 06-25-2004 11:09 PM

Quote:

Originally posted by WiredGuy
If you're licensing the content, it's perfectly fine. In fact, with the implementation I suggested above, you'd be using extremely little bandwidth from the feed provider, since you only request the feed a couple of times per day instead of once per surfer.

WG

So do you think doing it with a free feed is kind of a gray area and possibly OK -- especially if you aren't using noticeable bandwidth from them, depending on each provider's TOS -- or a definite no?

Thanks :)



Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.