A spider is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "crawler" or a "bot."
A spider, sometimes known as a "crawler" or "robot," is a software program that search engines use to keep their indexes up to date with new content on the internet. It is constantly seeking out pages that have been added, modified, or removed.
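One common way a crawler detects changed or removed pages is with conditional HTTP requests. The sketch below is a minimal, hypothetical illustration (assuming the Python requests library and an ETag or Last-Modified value saved from a previous visit), not how any particular search engine implements re-crawling.

```python
import requests

def recheck_page(url, last_etag=None, last_modified=None):
    """Ask the server whether a previously crawled page has changed.

    Sends a conditional GET; a 304 response means the stored copy is
    still current, so the crawler can skip re-indexing it.
    """
    headers = {}
    if last_etag:
        headers["If-None-Match"] = last_etag
    if last_modified:
        headers["If-Modified-Since"] = last_modified

    response = requests.get(url, headers=headers, timeout=10)
    if response.status_code == 304:
        return None          # unchanged since the last visit
    if response.status_code == 404:
        return "removed"     # page no longer exists
    return response.text     # new or modified content to re-index
```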
Google has its own crawling bot, which is sent out to crawl billions of pages every day. Spider-friendly sites are those with relevant, high-quality links. Googlebot only follows links; it will not fill in login details, so if a page cannot be reached through a link, the bot will never see it, let alone crawl it.
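To make the link-following behaviour concrete, here is a minimal sketch of a breadth-first crawler. It is an illustrative assumption-laden example (it assumes the requests and BeautifulSoup libraries and a hypothetical max_pages limit), not a description of Googlebot itself; the key point is that only pages reachable through anchor links ever get visited.

```python
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=50):
    """Breadth-first crawl that only follows <a href> links.

    Pages behind logins, or pages not linked from anything already
    discovered, are never reached, mirroring how a spider works.
    """
    seen = {start_url}
    queue = deque([start_url])
    pages = {}

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that fail to load
        pages[url] = response.text

        # Extract every link on the page and queue unseen ones.
        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if link not in seen:
                seen.add(link)
                queue.append(link)

    return pages
```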