The robots.txt file is used to give instructions to crawlers/search bots, for example, which pages or directories of a website we want them to crawl, follow, index, etc.
Its format is:
User-agent: * (* means the rules apply to all search bots, such as Googlebot, Scooter, etc.)
Disallow: / (Disallow restricts search bots from crawling the listed path; "/" blocks the entire site, while a specific path blocks just that folder, e.g. /wp-admin/ or /wp-includes/)
Allow: / (Allow permits search bots to crawl the listed path; "/" allows all web pages)
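For example, a minimal robots.txt for a WordPress site might look like the sketch below (the Sitemap URL uses a placeholder domain):

User-agent: *
# Block the WordPress admin and core folders
Disallow: /wp-admin/
Disallow: /wp-includes/
# But still allow the AJAX endpoint many themes/plugins need
Allow: /wp-admin/admin-ajax.php
# Point crawlers to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml

Note that the file must be placed at the root of the domain (e.g. https://www.example.com/robots.txt), or crawlers will not find it.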