The robots.txt file is used to tell Google and other search engines which pages of the website they are allowed to crawl and which pages should be avoided. This file is placed in the root directory of the website. There is an alternative to the robots.txt file as well: you can use the robots meta tag on each page of the website and set its value to index or noindex. If you do not add a robots tag to a page, then by default the page is treated as allowed (index).
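As a sketch, a simple robots.txt might look like this (the paths shown, such as /admin/, are hypothetical examples, not values from any particular site):

```
# Rules apply to all crawlers
User-agent: *
# Block crawling of these example sections
Disallow: /admin/
Disallow: /cart/
# Everything else remains crawlable
Allow: /
```

Here `User-agent: *` means the rules apply to every crawler, and each `Disallow` line blocks one path from being crawled.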
The main practical difference between the robots.txt file and the robots tag is that the file is uploaded only once, while the tag needs to be added to each and every page of the website.
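The per-page robots tag mentioned above goes in the HTML head of each page. A minimal example, assuming you want a page kept out of search results:

```
<head>
  <!-- Tell crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

Changing `content` to `index` (or simply omitting the tag) leaves the page indexable, which is the default behavior.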