Sitemap submission for large sites?
So I'm working on a project right now, and it's looking like the end product will have around 90,000 pages. Obviously I want as many of these as possible to get indexed. Up until now I've never built anything out to more than 4-5,000 pages, and I've honestly never seen the need to submit a sitemap to Google; I felt my navigation was adequate enough to give the spiders ample opportunity to crawl everything. I can see how a site this large could pose some issues for easy navigation to all pages.

I'm curious what those of you who have built out large sites like this think. Is submitting a sitemap in this situation the recommended course of action? Or should I rely on the usability of my linking structure and let the spiders work naturally?
|
Curious about any thoughts on this as well.
|
Try a sitemap index at that scale.
|
bumping!
|
I know people who have managed to submit 100K+ links via sitemaps. You just have to split it up and make a sitemap index. Google has no problem with it.
|
I see no harm in doing so. I submit sitemaps to Google for all of the websites I create and they always seem to get indexed well.
|
bump for answers
|
I suggest breaking your sitemap up into 2,000-page XML files:

site.com/sitemap1.xml
site.com/sitemap2.xml

Submit each sitemap and add them in your Webmaster Tools. There is no limit to the number you can apply to your site. Example below:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/siteindex.xsd">
  <sitemap>
    <loc>http://www.YOUSITE.com/sitemap1.xml</loc>
    <lastmod>2010-02-22T17:20:55+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.YOUSITE.com/sitemap2.xml</loc>
    <lastmod>2010-02-22T17:20:55+00:00</lastmod>
  </sitemap>
</sitemapindex> |
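For anyone scripting the split-and-index approach described in that post, here is a minimal Python sketch. It assumes you already have a flat list of page URLs; the 2,000-URL chunk size, the file names, and the example.com domain are placeholders, not anything specified in the thread.

```python
# Split a URL list into fixed-size sitemap files and write a sitemap index.
from datetime import datetime, timezone
from xml.sax.saxutils import escape

CHUNK = 2000  # placeholder chunk size (the post above suggests 2k pages per file)
urls = ["http://www.example.com/page-%d.html" % i for i in range(90000)]  # placeholder

lastmod = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")
index_entries = []

for n, start in enumerate(range(0, len(urls), CHUNK), 1):
    name = "sitemap%d.xml" % n
    with open(name, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls[start:start + CHUNK]:
            f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        f.write("</urlset>\n")
    index_entries.append(name)

# Write the sitemap index that points at every chunk; submit this one file.
with open("sitemap_index.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in index_entries:
        f.write("  <sitemap><loc>http://www.example.com/%s</loc>"
                "<lastmod>%s</lastmod></sitemap>\n" % (name, lastmod))
    f.write("</sitemapindex>\n")
```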
Take all the links from the sitemap, dump them into an RSS feed creator, and submit them to RSS feeds.
That'll get all your shit indexed fast. |
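If you'd rather script that step than use a feed creator, a rough sketch of turning a URL list into an RSS 2.0 feed might look like the following; the feed title, channel link, and URL list are placeholders.

```python
# Write a minimal RSS 2.0 feed from a list of page URLs.
from datetime import datetime, timezone
from xml.sax.saxutils import escape

urls = ["http://www.example.com/page-1.html",
        "http://www.example.com/page-2.html"]  # placeholder list

pub_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S +0000")

with open("rss.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<rss version="2.0"><channel>\n')
    f.write('  <title>New pages</title>\n')
    f.write('  <link>http://www.example.com/</link>\n')
    f.write('  <description>Recently added pages</description>\n')
    for url in urls:
        f.write('  <item><title>%s</title><link>%s</link><pubDate>%s</pubDate></item>\n'
                % (escape(url), escape(url), pub_date))
    f.write('</channel></rss>\n')
```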
Tarball it or split it.
|
90,000 pages, eh? Sounds like a fine product! :thumb:
"How can I add more than 1,000,000 URLs to a Sitemap?" https://youtube.com/watch?v=OshuVtzh14I |
According to the Google Sitemap FAQ, a sitemap can contain up to 50,000 URLs or reach a file size of 10 MB (uncompressed!). However, I would recommend splitting such large sitemaps into several smaller ones, which allows Google to regularly retrieve only the most recently changed ones. This will save you a lot of traffic.
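If you want to check an existing file against those limits (50,000 URLs or 10 MB uncompressed) before submitting, a quick sanity check along these lines works; the file name is a placeholder.

```python
# Count URLs and measure the uncompressed size of a single sitemap file.
import os
import xml.etree.ElementTree as ET

SITEMAP = "sitemap1.xml"  # placeholder path
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

size_mb = os.path.getsize(SITEMAP) / (1024 * 1024)
url_count = sum(1 for _ in ET.parse(SITEMAP).getroot().iter(NS + "url"))

print("%s: %d URLs, %.1f MB uncompressed" % (SITEMAP, url_count, size_mb))
if url_count > 50000 or size_mb > 10:
    print("Over the limit - split this file before submitting.")
```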
|
If everything is linked, there is no need for a sitemap.
And since pages that are not linked have a high likelihood of dropping out of the index anyway, there is not much reason to create one. If you want, though, I can give you a Python script that automatically parses the files on your site. Or google for it - it was mentioned by Google themselves, though I don't know if they still offer it. |
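The script being referred to is presumably Google's old sitemap generator; as a stand-in, a minimal sketch that walks a local document root and writes the HTML files it finds into a sitemap could look like this. The document root and base URL are assumptions.

```python
# Walk a document root, map HTML files to URLs, and emit one sitemap file.
import os
from xml.sax.saxutils import escape

DOC_ROOT = "/var/www/html"            # placeholder document root
BASE_URL = "http://www.example.com"   # placeholder base URL

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for dirpath, _dirs, files in os.walk(DOC_ROOT):
        for name in files:
            if not name.endswith((".html", ".htm")):
                continue
            rel = os.path.relpath(os.path.join(dirpath, name), DOC_ROOT)
            f.write("  <url><loc>%s/%s</loc></url>\n"
                    % (BASE_URL, escape(rel.replace(os.sep, "/"))))
    f.write("</urlset>\n")
```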
First you make sitemaps split by category, date, or some other criterion (up to 50k links each), and gzip them.
Then you make the main sitemap index. If you need any assistance, hit me up (ICQ in sig). |
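The gzip step is easy to script as well. A small sketch, assuming the split files match a sitemap*.xml pattern (that glob is an assumption); the resulting .xml.gz files can be referenced from the sitemap index.

```python
# Gzip each split sitemap file alongside the original.
import glob
import gzip
import shutil

for path in glob.glob("sitemap*.xml"):
    with open(path, "rb") as src, gzip.open(path + ".gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
    print("wrote", path + ".gz")
```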