Robots.txt SE Question
Hey, we're using directories to list the sample images for each site we list, usually around 8 sample images per site, but some have up to 16.
http://www.pornsumer.com/sites/8th-s...pics/picture4/ is an example. So we have picture1, 2, 3, up to 8 or so per site, and we expect to list thousands of sites using this system. Is that too many URLs for Google? My feeling is that we should block them in robots.txt, but I'm not sure what syntax we'd use given the dynamic directory structure. Or maybe we should leave them out of robots.txt for some other reason? We could still leave this page in. Here's an extreme example with 19 pics = 19 directories: http://www.pornsumer.com/sites/8th-s...nas/free-pics/ Thoughts? Comments? Syntax? :)
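If you did decide to block the picture directories, a minimal sketch of the robots.txt could look like the lines below. This assumes the layout shown in your example URLs (everything under /sites/<site-name>/free-pics/pictureN/), and it relies on the * wildcard, which Googlebot honors but which isn't part of the original robots.txt standard:

User-agent: *
# Hypothetical pattern based on the example URLs above:
# blocks the individual pictureN directories but leaves the
# /free-pics/ landing pages themselves crawlable.
Disallow: /sites/*/free-pics/picture

That way the landing page with all the thumbnails can still rank, while the thousands of near-empty pictureN pages don't get crawled.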
Some tips: add some text. You won't get good rankings with pics alone.
You should add them to the sitemap, not to robots.txt. The site needs some serious work done... hit me up on ICQ if interested.
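For reference, a sitemap is just an XML file with one <url> entry per page you want indexed. A minimal sketch, using a hypothetical site name in place of your real paths; you could list only the /free-pics/ landing pages and skip the individual pictureN directories:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical entry: one per site listing page -->
  <url>
    <loc>http://www.pornsumer.com/sites/example-site/free-pics/</loc>
  </url>
</urlset>

Submit it through Google Webmaster Tools so Google knows which of those thousands of URLs actually matter.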