So the general consensus is that we should create some form of sitemap, either XML or TXT, and then add a line in robots.txt so the bots can find and crawl the sitemap.
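Something like this is what I'm picturing for robots.txt (the domain and sitemap filename are just placeholders, swap in your own):

    Sitemap: https://www.example.com/sitemap.xml

    User-agent: *
    Disallow:

The Sitemap line takes a full absolute URL, and the empty Disallow just means everything is open to well-behaved crawlers.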
Can anyone add anything on keeping the bad ones out, or is that done mainly via the .htaccess file? Either way, is there a good example for either one that shows what should be included?
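For robots.txt, my understanding is you can only politely ask a bot to stay away, e.g.:

    User-agent: SomeBot
    Disallow: /

but bad bots just ignore that, so actual blocking would have to happen in .htaccess. A minimal sketch of what I've seen suggested, using Apache's mod_rewrite (the bot names here are made up for illustration, not a real blocklist):

    # Return 403 Forbidden to requests whose User-Agent matches the pattern
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (badbot|evilscraper) [NC]
    RewriteRule ^ - [F]

The [NC] makes the match case-insensitive and [F] sends a 403. Happy to be corrected if there's a better approach.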
