Check this out:
https://www.google.com/webmasters/si.../protocol.html
Quote:
The Sitemap Protocol allows you to inform search engine crawlers about URLs on your Web sites that are available for crawling. A Sitemap consists of a list of URLs and may also contain additional information about those URLs, such as when they were last modified, how frequently they change, etc.
Sitemaps are particularly beneficial when users cannot reach all areas of a Web site through a browseable interface, i.e. users are unable to reach certain pages or regions of a site by following links. For example, any site where certain pages are only accessible via a search form would benefit from creating a Sitemap and submitting it to search engines.
This document describes the formats for Sitemap files and also explains where you should post your Sitemap files so that search engines can retrieve them.
Please note that the Sitemap Protocol supplements, but does not replace, the crawl-based mechanisms that search engines already use to discover URLs. By submitting a Sitemap (or Sitemaps) to a search engine, you will help that engine's crawlers to do a better job of crawling your site.
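For reference, a Sitemap is just an XML file: a `<urlset>` in the sitemaps.org namespace containing one `<url>` entry per page, each with a `<loc>` and optional `<lastmod>` and `<changefreq>` tags. Here's a minimal sketch of generating one with Python's standard library (the example URLs and values are made up):

```python
# Minimal sketch: generate a Sitemap XML document per the sitemaps.org
# protocol using only the Python standard library.
import xml.etree.ElementTree as ET

# Official namespace for the Sitemap 0.9 protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a Sitemap from a list of dicts.

    Each dict needs a 'loc' (the page URL) and may also carry
    'lastmod' and 'changefreq' as described in the protocol.
    """
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for entry in entries:
        url = ET.SubElement(urlset, "url")
        # Emit only the tags the entry actually provides.
        for tag in ("loc", "lastmod", "changefreq"):
            if tag in entry:
                ET.SubElement(url, tag).text = entry[tag]
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example entry, not a real site.
sitemap_xml = build_sitemap([
    {"loc": "http://www.example.com/",
     "lastmod": "2005-06-04",
     "changefreq": "monthly"},
])
print(sitemap_xml)
```

You'd save the output as something like `sitemap.xml` at the site root and submit its URL to the search engine.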
Excellent, a new wave of optimization begins. I'll exploit this to the fullest early on; I just wonder how far it will get me. At the very least it will make it easier to feed Google thousands of false directories created from a single PHP script.
