A robots.txt is a plain-text file placed in the root directory of your web server that tells search engine crawlers not to crawl certain sections or pages of your site. You can use it to block crawling of the entire site, keep particular areas of your site out of search engines, or issue instructions to specific crawlers individually. Note that it is advisory: well-behaved crawlers honour it, but it is not a security mechanism.
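As a rough illustration of the format, here is a minimal robots.txt sketch; the directory name and bot name are invented examples, not recommendations for any particular site:

```
# Ask all crawlers to skip an unfinished section
User-agent: *
Disallow: /drafts/

# Ask one specific crawler (hypothetical name) to skip the whole site
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` line names a crawler (`*` means all of them), and the `Disallow` lines beneath it list the paths that crawler is asked not to fetch.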
There are a number of reasons that you may wish to stop search engines from indexing some or all of your site:
i) You are still building the site, or certain pages, and do not want the unfinished work to appear in search results.
ii) You have information that, while not sensitive enough to password protect, is of no interest to anyone but its intended audience, and you would prefer it did not appear in search results.
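To see how a crawler interprets these rules, you can test a robots.txt against specific URLs with Python's standard-library `urllib.robotparser`. The file content, domain, and paths below are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content blocking /private/ for all crawlers
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler would skip the first URL and fetch the second
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
```

This is also a handy sanity check before uploading a hand-written robots.txt, since a typo in a `Disallow` path can silently block more (or less) than you intended.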
Here is a quick online tool that builds your robots.txt file with just a few clicks of a web form: www.mcanerin.com/en/search-engine/robots-txt.asp