BIZ - Here is an SEO tip for the non-SEO expert
Include a robots.txt file at the root of your site.
Open up a text editor and type User-agent: * on the first line, followed by an empty Disallow: line on the next. Save the file as robots.txt and upload it to the root directory of your domain. This tells the spiders that hit your site to crawl every page. Since the search engines analyze everything they index to figure out what your website is all about, it is a good idea to block folders and files that have nothing to do with the content you want analyzed. You can keep unrelated folders or files from being read by adding "Disallow: /folder_name/" or "Disallow: /filename.html". For example:

User-agent: *
Disallow: /cgi-bin/
Disallow: /img/

:2 cents:
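A slightly fuller sketch of the same file, with comments (the /temp/ and /old_page.html paths are made-up examples; substitute your own):

# Hypothetical robots.txt -- crawl everything except the listed paths
User-agent: *
Disallow: /cgi-bin/       # server scripts, nothing worth indexing
Disallow: /temp/          # a made-up scratch folder
Disallow: /old_page.html  # a single file can be blocked too
|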
Good tip. Some people who know about it still don't use it, myself included (I should, though). It is the first thing Googlebot looks for every time it visits a site.
|
Why would you tell the SEs to ignore your images? You'll lose out on all the image search traffic from Yahoo/Google.
WG |
There is no reason to use robots.txt on images. Use it only on folders you really don't want indexed, for example admin panels, user-submission forms, protected folders, and so on. By default, spiders will crawl the whole domain without problems, so there is no need to include a robots.txt at all unless you intend to block (or allow only) certain parts of your site.
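For instance, a minimal sketch along those lines (the folder names are hypothetical; use whatever your site actually calls them):

User-agent: *
Disallow: /admin/    # admin panel
Disallow: /submit/   # user-submission forms
Disallow: /private/  # protected folder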
|
thanks!!!!
and happy bday! |
Yeah, what he said.
|
Thank you for the advice. Much appreciated.
|