Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.

Old 06-23-2004, 12:03 PM   #1
Dravyk
Confirmed User
 
Join Date: Jan 2001
Location: Yo, Philly!
Posts: 1,202
SE experts, need advice on pages with feeds

I have a site that uses RSS feeds rendered via JavaScript and other embedded executables. Since these pages are generated "within browser" -- much like an SSI does -- a look through a few search engine simulators suggests that SE spiders would find mostly blank pages there, while a browser would see constantly updating content.

Anyone know of a fix or a script that will turn these into "static" pages that SEs will spider?
__________________
All Of 'Em
Old 06-23-2004, 12:05 PM   #2
WiredGuy
Pounding Googlebot
 
Industry Role:
Join Date: Aug 2002
Location: Canada
Posts: 34,482
Quote:
Originally posted by Dravyk
I have a site that uses RSS feeds rendered via JavaScript and other embedded executables. Since these pages are generated "within browser" -- much like an SSI does -- a look through a few search engine simulators suggests that SE spiders would find mostly blank pages there, while a browser would see constantly updating content.

Anyone know of a fix or a script that will turn these into "static" pages that SEs will spider?

Here's an example: http://www.submitcorner.com/
The feeds on the right side of the screen under latest headlines come from Moreover. The way I do it is: I get their data through an XML feed and have a script create the HTML code that will be used on the right panel. The HTML code is then saved into an SSI include and loaded at runtime. This way, browsers and bots see the same text and nobody needs to know it's an XML feed.

WG
__________________
I play with Google.
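For anyone who wants to try this, here is a rough sketch of the kind of script WiredGuy is describing -- not his actual code. It fetches an RSS/XML feed, renders the items into a plain HTML list, and writes the result out as a static include file. The feed URL and file names are placeholders.

```python
# Hypothetical sketch: fetch an XML feed server-side, render it to an
# HTML fragment, and save that fragment as a static include file.
# FEED_URL and INCLUDE_FILE are made-up placeholders.
import urllib.request
import xml.etree.ElementTree as ET
from html import escape

FEED_URL = "http://example.com/headlines.xml"  # placeholder feed
INCLUDE_FILE = "headlines.inc"                 # pulled in via SSI

def build_include(xml_text):
    """Turn RSS <item> entries into a plain HTML list."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        title = escape(item.findtext("title", default=""))
        link = escape(item.findtext("link", default=""), quote=True)
        items.append('<li><a href="%s">%s</a></li>' % (link, title))
    return "<ul>\n" + "\n".join(items) + "\n</ul>\n"

def refresh():
    """Fetch the feed and rewrite the static include file."""
    with urllib.request.urlopen(FEED_URL) as resp:
        xml_text = resp.read().decode("utf-8", errors="replace")
    with open(INCLUDE_FILE, "w", encoding="utf-8") as f:
        f.write(build_include(xml_text))
```

The page itself then pulls the fragment in with an ordinary SSI directive, e.g. `<!--#include file="headlines.inc" -->`, so spiders and surfers both get the same static HTML.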
Old 06-23-2004, 12:20 PM   #3
Basic_man
Programming King Pin
 
Basic_man's Avatar
 
Industry Role:
Join Date: Oct 2003
Location: Montreal
Posts: 27,360
Quote:
Originally posted by WiredGuy
Here's an example: http://www.submitcorner.com/
The feeds on the right side of the screen under latest headlines come from Moreover. The way I do it is: I get their data through an XML feed and have a script create the HTML code that will be used on the right panel. The HTML code is then saved into an SSI include and loaded at runtime. This way, browsers and bots see the same text and nobody needs to know it's an XML feed.

WG
Wow, nice explanation!
__________________
UUGallery Builder - automated photo/video gallery plugin for Wordpress!
Stop looking! Checkout Naked Hosting, online since 1999 !
Old 06-23-2004, 12:30 PM   #4
Dravyk
Confirmed User
 
Join Date: Jan 2001
Location: Yo, Philly!
Posts: 1,202
Very nice, WG!!

I had both an RSS converter and a site cacher built for me a while back -- the cacher for when those feeds hit Net congestion problems and hold up the page loads.

Looks like (as I pretty much figured) I'll also need to have another one made to do the static thing. Glad to see it's possible and it exists. Again, many thanks.
__________________
All Of 'Em
Old 06-23-2004, 12:34 PM   #5
WiredGuy
Pounding Googlebot
 
Industry Role:
Join Date: Aug 2002
Location: Canada
Posts: 34,482
Quote:
Originally posted by Dravyk
Very nice, WG!!

I had both an RSS converter and a site cacher built for me a while back -- the cacher for when those feeds hit Net congestion problems and hold up the page loads.

Looks like (as I pretty much figured) I'll also need to have another one made to do the static thing. Glad to see it's possible and it exists. Again, many thanks.
This also cuts back on load time and bandwidth significantly. If you're using XML feeds on the fly, you download the data, parse it and spit out the results in realtime. Using the method above, the load time is static since you're just opening a file, and you eliminate constantly requesting the feed. I just refresh the data every 4 hours -- so from a couple thousand requests per day down to six data loads.

WG
__________________
I play with Google.
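The "refresh every 4 hours" part could look something like this -- a hypothetical sketch, assuming the rendered fragment is cached in a file whose modification time tells you how old it is. The file name and function names here are made up.

```python
# Hypothetical staleness check for a cached include file: only re-fetch
# the feed when the file is missing or older than four hours, so the
# feed provider sees a handful of requests per day instead of one per surfer.
import os
import time

MAX_AGE = 4 * 60 * 60  # four hours, in seconds

def is_stale(path, now=None):
    """True if the cached include is missing or older than MAX_AGE."""
    if not os.path.exists(path):
        return True
    age = (now if now is not None else time.time()) - os.path.getmtime(path)
    return age > MAX_AGE
```

Typical use would be a cron job (or the page handler itself) that calls whatever script regenerates the include whenever `is_stale("headlines.inc")` is true.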
Old 06-25-2004, 09:24 PM   #6
Namzo
Registered User
 
Join Date: Jun 2004
Location: Vegas baby
Posts: 83
Quote:
Originally posted by WiredGuy
Here's an example: http://www.submitcorner.com/
The feeds on the right side of the screen under latest headlines come from Moreover. The way I do it is: I get their data through an XML feed and have a script create the HTML code that will be used on the right panel. The HTML code is then saved into an SSI include and loaded at runtime. This way, browsers and bots see the same text and nobody needs to know it's an XML feed.

WG
Is this technically considered leeching content by some feed providers?
Old 06-25-2004, 09:30 PM   #7
Webby
Too lazy to set a custom title
 
Join Date: Oct 2002
Location: Far far away - as possible
Posts: 14,956
WiredGuy

Nice idea!!
Old 06-25-2004, 09:38 PM   #8
WiredGuy
Pounding Googlebot
 
Industry Role:
Join Date: Aug 2002
Location: Canada
Posts: 34,482
Quote:
Originally posted by Namzo
Is this technically considered leeching content by some feed providers?
If you're licensing the content, it's perfectly fine. In fact, if you're doing the implementation I suggested above, you'd be using extremely little bandwidth from the feed provider, since you only request the feed a couple of times per day versus once for every surfer.

WG
__________________
I play with Google.
Old 06-25-2004, 10:40 PM   #9
smack
Push Porn Like Weight.
 
smack's Avatar
 
Industry Role:
Join Date: Mar 2002
Location: Inside .NET
Posts: 10,652
thanks for the tip WG.

*bump for some good info*
__________________
Cry havoc and let slip the dogs of war.
Old 06-25-2004, 10:44 PM   #10
Abyss_Vee
Confirmed User
 
Abyss_Vee's Avatar
 
Join Date: Sep 2003
Location: Los Angeles
Posts: 5,208
good stuff wg
Old 06-25-2004, 11:09 PM   #11
Namzo
Registered User
 
Join Date: Jun 2004
Location: Vegas baby
Posts: 83
Quote:
Originally posted by WiredGuy
If you're licensing the content, it's perfectly fine. In fact, if you're doing the implementation I suggested above, you'd be using extremely little bandwidth from the feed provider, since you only request the feed a couple of times per day versus once for every surfer.

WG
So do you think doing it with a free feed is kind of a gray area and possibly OK, especially if you aren't using noticeable bandwidth from them (depending on each provider's TOS), or a definite no?

Thanks