A robots.txt file is a plain text file that asks web crawler software, such as Googlebot, not to crawl certain pages of your site. The file is essentially a list of directives, such as Allow and Disallow, that tell crawlers which URLs they may or may not access.
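
As an illustration, a minimal robots.txt placed at the root of a site (for example at https://example.com/robots.txt) might look like the sketch below; the paths shown are hypothetical:

```
# Rules for all crawlers
User-agent: *
# Do not crawl anything under /private/
Disallow: /private/
# Except this one page, which crawling is allowed for
Allow: /private/overview.html

# Rules that apply only to Googlebot
User-agent: Googlebot
Disallow: /drafts/
```

Note that robots.txt is advisory: well-behaved crawlers honor these directives, but the file does not technically prevent access, so it should not be used to hide sensitive content.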