The robots.txt file is the main file that tells search engines how to process the pages of a site. It is used to specify the site's primary host name, the location of the sitemap (sitemap.xml), and which sections of the site are open or closed to indexing.
A robots.txt file can include the following directives:
In addition to the directives, robots.txt uses special characters:
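As an illustration, the wildcards most commonly supported in robots.txt are `*` (matches any sequence of characters) and `$` (anchors the end of a URL). The paths below are hypothetical examples, not taken from a real site:

```
User-agent: *
# "*" matches any character sequence:
Disallow: /*?sort=      # blocks any URL containing "?sort="
# "$" anchors the pattern to the end of the URL:
Disallow: /private$     # blocks exactly "/private", but not "/private/page"
```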
Compile robots.txt using the directives and special characters described above, following this principle:
If the site has no sections closed to indexing, robots.txt should contain at least four lines:
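A minimal four-line robots.txt might look like the sketch below. The domain `example.com` is a placeholder; note that the `Host` directive is a Yandex-specific extension for specifying the primary site name:

```
User-agent: *
Disallow:
Host: example.com
Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` value means no section of the site is closed to indexing.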
You can check your robots.txt file, and how it affects the indexing of the website, with the Yandex webmaster tools.