What is a Robots.txt file?
How do you use a robots.txt file, and what is it for?
Can anyone help me out?
It basically tells spiders (like Googlebot) which pages they can and cannot crawl. In other words, if you do not want certain parts of your website to be reachable via a Google search, you can specify that in your robots.txt and Google will exclude those pages from its search index.
You can Google the exact format. All you need to do is create a plain-text file named robots.txt and place it in the root directory of your site (e.g. example.com/robots.txt); crawlers only look for it there, not in subdirectories.
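For example, a minimal robots.txt might look like this (the `/private/` path is just a placeholder for whatever section you want to keep out of search results):

```
User-agent: *
Disallow: /private/
Allow: /
```

`User-agent: *` means the rules apply to all crawlers; you can add separate blocks for specific bots such as `Googlebot`.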
A robots.txt view is also available by default in Google Webmaster Tools. Check under Site Configuration -> Crawler access.
You can add specific rules using the Generate robots.txt tab and test them against different bots.
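You can also test rules yourself with Python's standard-library `urllib.robotparser`, which evaluates a robots.txt the same way a well-behaved crawler would. This sketch parses an inline ruleset (the rules and URLs are made up for illustration) and asks whether a given bot may fetch a given path:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)  # parse() accepts the file's lines directly, no network needed

# can_fetch(useragent, url) -> True if the agent is allowed to crawl the URL
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

For a live site you would instead call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()`.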
I wrote an article on "How a webpage appears to the GoogleBot".
This approach helps you identify the parts of your page that never get indexed by Googlebot, so you can decide whether to keep them as-is or modify them so that they start getting indexed.