GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Sitemap submission for large sites? (https://gfy.com/showthread.php?t=995506)

Angry Jew Cat - Banned for Life 11-01-2010 09:37 PM

Sitemap submission for large sites?
 
So I'm working on a project right now, and it's looking like the end product will have around 90,000 pages. Obviously I want as many of these indexed as possible. Up until now I've never built anything out to more than 4,000-5,000 pages, have honestly never seen the need to submit a Sitemap to Google, and have always felt my navigation gave the spiders ample opportunity to crawl everything. But I can see how a site this large could make it harder to reach every page through navigation alone.

For those who have built out large sites like this, I'm curious what your opinions are. Is submitting a Sitemap the recommended course of action in this situation? Or should I rely on the usability of my linking structure and let the spiders work naturally?

comeplay 11-01-2010 09:49 PM

Curious about this as well - any thoughts?

jonnydoe 11-01-2010 10:08 PM

try a sitemap index for that scale

Angry Jew Cat - Banned for Life 11-02-2010 03:34 AM

bumping!

hdkiller 11-02-2010 03:53 AM

I know people who have managed to submit 100K+ links via sitemaps. You just have to split it up and make a sitemap index. Google has no problem with it.

Tempest 11-02-2010 01:30 PM

Quote:

Originally Posted by hdkiller (Post 17662313)
I know people who have managed to submit 100K+ links via sitemaps. You just have to split it up and make a sitemap index. Google has no problem with it.

And have a compressed version available.

anexsia 11-02-2010 01:48 PM

I see no harm in doing so. I submit sitemaps to Google for all of the websites I create and they always seem to get indexed well.

_Richard_ 11-02-2010 03:17 PM

bump for answers

Rankings 11-02-2010 04:30 PM

I suggest breaking your sitemap up into 2,000-URL XML files:

site.com/sitemap1.xml
site.com/sitemap2.xml

Submit each sitemap and add them all in Webmaster Tools. There's no practical limit on how many you can add for one site.

Example below:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/siteindex.xsd">
  <sitemap>
    <loc>http://www.YOURSITE.com/sitemap1.xml</loc>
    <lastmod>2010-02-22T17:20:55+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.YOURSITE.com/sitemap2.xml</loc>
    <lastmod>2010-02-22T17:20:55+00:00</lastmod>
  </sitemap>
</sitemapindex>
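
For illustration, here's a rough Python sketch of that split-plus-index approach. The example.com base URL, the sitemap1.xml / sitemap_index.xml filenames, the 2,000-URL chunk size, and the build() helper are all placeholders for this thread's example, not anything Google-specified.

# Rough sketch: split a big list of URLs into 2,000-URL sitemap files
# plus a sitemap index that points at them. Adjust BASE and filenames
# for your own site.
import datetime

BASE = "http://www.example.com"   # placeholder site root
CHUNK = 2000                      # URLs per sitemap file

def write_sitemap(filename, urls):
    with open(filename, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write("  <url><loc>%s</loc></url>\n" % url)
        f.write("</urlset>\n")

def write_index(filename, sitemap_names):
    now = datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%S+00:00")
    with open(filename, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_names:
            f.write("  <sitemap>\n")
            f.write("    <loc>%s/%s</loc>\n" % (BASE, name))
            f.write("    <lastmod>%s</lastmod>\n" % now)
            f.write("  </sitemap>\n")
        f.write("</sitemapindex>\n")

def build(urls):
    # Write sitemap1.xml, sitemap2.xml, ... plus sitemap_index.xml
    names = []
    for i in range(0, len(urls), CHUNK):
        name = "sitemap%d.xml" % (i // CHUNK + 1)
        write_sitemap(name, urls[i:i + CHUNK])
        names.append(name)
    write_index("sitemap_index.xml", names)

# Example: 90,000 placeholder URLs -> 45 sitemap files plus 1 index
# build(["%s/page/%d" % (BASE, n) for n in range(1, 90001)])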

NoWhErE 11-02-2010 04:54 PM

Take all the links from the sitemap, dump them into an RSS feed creator and submit to rss feeds.

That'll get all your shit indexed fast.
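
If you'd rather not rely on a third-party feed creator, a basic RSS 2.0 feed is easy enough to roll yourself. A minimal Python sketch - the channel title, link, and feed1.xml filename are made up for the example, and in practice you'd split 90k links across several feeds:

# Minimal sketch: build an RSS 2.0 feed from a list of URLs.
# Channel title/link are placeholders for your own site.
def build_rss(urls, title="New pages", link="http://www.example.com"):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<rss version="2.0">',
             '<channel>',
             '  <title>%s</title>' % title,
             '  <link>%s</link>' % link,
             '  <description>Recently added pages</description>']
    for u in urls:
        lines.append('  <item><title>%s</title><link>%s</link></item>' % (u, u))
    lines.append('</channel>')
    lines.append('</rss>')
    return "\n".join(lines) + "\n"

# open("feed1.xml", "w").write(build_rss(["http://www.example.com/page/1"]))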

AzteK 11-02-2010 04:59 PM

tar ball or split

MrRob 11-02-2010 05:04 PM

90,000 pages ey? Sounds like a fine product! :thumb:

"How can I add more than 1,000,000 URLs to a Sitemap?"
https://youtube.com/watch?v=OshuVtzh14I

PAR 11-02-2010 06:32 PM

According to the Google Sitemap FAQ, a sitemap can contain up to 50,000 URLs or reach a file size of 10MB (uncompressed!). However, I would recommend splitting such large sitemaps into several smaller ones, which lets Google regularly re-fetch only the ones that have changed. This will save you a lot of traffic.

Angry Jew Cat - Banned for Life 11-02-2010 09:59 PM

Quote:

Originally Posted by NoWhErE (Post 17665325)
Take all the links from the sitemap, dump them into an RSS feed creator and submit to rss feeds.

That'll get all your shit indexed fast.

That's not a bad-sounding idea. What would you recommend as an RSS feed builder?

Davy 11-03-2010 01:31 AM

If everything is linked, there is no need for a sitemap.
And since pages that only appear in a sitemap but aren't linked have a high likelihood of dropping out of the index anyway, there is not much reason to create one.

If you want, though, I can give you a Python script that automatically parses the files on your site.
Or google for it - it was mentioned by Google themselves, though I don't know if they still offer it.
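
For anyone who'd rather not hunt down that script, here's a hypothetical stripped-down version of the same idea - walk the document root on disk and map the files to URLs. DOCROOT and BASE are placeholders; the result can be fed into a splitter like the sketch a few posts up.

# Sketch: discover pages by walking the web root on disk and turning
# .html files into URLs. DOCROOT and BASE are assumptions for the example.
import os

DOCROOT = "/var/www/mysite"           # placeholder local document root
BASE = "http://www.example.com"       # placeholder public base URL

def collect_urls():
    urls = []
    for dirpath, dirnames, filenames in os.walk(DOCROOT):
        for name in filenames:
            if name.endswith((".html", ".htm")):
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, DOCROOT).replace(os.sep, "/")
                urls.append("%s/%s" % (BASE, rel))
    return sorted(urls)

# Feed the result into the split-plus-index sketch above:
# build(collect_urls())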

Angry Jew Cat - Banned for Life 11-03-2010 01:44 AM

Quote:

Originally Posted by Davy (Post 17666239)
If everything is linked, there is no need for a sitemap.
And since pages that only appear in a sitemap but aren't linked have a high likelihood of dropping out of the index anyway, there is not much reason to create one.

If you want, though, I can give you a Python script that automatically parses the files on your site.
Or google for it - it was mentioned by Google themselves, though I don't know if they still offer it.

That's the mentality I've always ridden things out with, and I've always seemed to get things indexed. If I have a good navigational structure and do my share of deeplinking, everything should be found just fine. Categories are well sorted, posts and pages are tagged, everything has a good flow to it. I'm just curious whether giving Google a Sitemap ahead of time will accelerate or improve the process. I have a feeling I'll generate all these pages, populate them, and wind up with only a third of them indexed if I don't help things along. I've just never dealt with a project spanning this many pages, so I don't really know.

HomerSimpson 11-03-2010 11:41 AM

First you make sitemaps split by category, date, or some other grouping (up to 50k links each), and gzip them (quick sketch of the gzip step below).

Then you make the main sitemap index that points at all of them.

If you need any assistance hit me up
(ICQ in sig)
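
For what it's worth, the gzip step is a couple of lines with Python's standard gzip module; the sitemap filenames here are just the placeholders used earlier in the thread.

# Sketch: gzip each split sitemap so Google can fetch sitemap1.xml.gz etc.
# Filenames are placeholders carried over from the examples above.
import gzip
import shutil

def gzip_sitemap(filename):
    with open(filename, "rb") as src, gzip.open(filename + ".gz", "wb") as dst:
        shutil.copyfileobj(src, dst)

# for n in range(1, 46):
#     gzip_sitemap("sitemap%d.xml" % n)

If you serve the compressed versions, remember the <loc> entries in the index need to point at the .xml.gz names.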

