Canonical link : Duplicate content creates a problem for search engines and therefore for your rankings: when your site contains several copies of the same content, the search engine has a hard time deciding which page to rank for which search query. A canonical link (the rel="canonical" tag) tells the search engine which URL is the preferred version of a page, so ranking signals are consolidated on that one URL instead of being split across the duplicates.
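A minimal sketch of what this looks like in practice (the URL is a placeholder): each duplicate or variant page carries a single link element in its head pointing at the preferred URL.

    <head>
      <!-- On every duplicate/variant page, point at the preferred URL -->
      <link rel="canonical" href="https://www.example.com/product/blue-widget/" />
    </head>

All versions of the page, including the preferred one itself, typically carry the same canonical tag, so crawlers arriving at any copy are directed to the same URL.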
Robots.txt : The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. It specifies how to tell a robot which areas of the website should not be crawled or processed. Note that compliance is voluntary: well-behaved crawlers honor these rules, but robots.txt is not an access-control mechanism.
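A short illustrative robots.txt, served as a plain-text file at the site root (/robots.txt); the paths here are placeholders.

    # Rules for all crawlers
    User-agent: *
    # Keep crawlers out of these (hypothetical) areas
    Disallow: /admin/
    Disallow: /search/

    # Optional: point crawlers at the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml

Each User-agent group can target a specific crawler by name, and anything not matched by a Disallow rule remains crawlable by default.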