The Robots Exclusion Protocol is a method used by system administrators to tell visiting robots which directories of a site should not be crawled.

A robot (or bot) is a computer program that automatically traverses web pages in search of documents, in order to index them, validate them, or monitor content changes. To control the activity of these robots, webmasters can optionally create a file named robots.txt in the root directory of a web site. Robots.txt is a plain-text file (.txt) that acts as a "filter" for crawlers and search-engine robots, allowing or blocking access to parts or all of a particular site.
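As a minimal sketch, such a file might look like the following. The directives (User-agent, Disallow, Allow) are standard; the paths /private/ and /tmp/ are purely illustrative:

    # Rules for Google's crawler only
    User-agent: Googlebot
    Disallow: /private/

    # Rules for all other robots
    User-agent: *
    Disallow: /tmp/
    Allow: /

Well-behaved crawlers consult these rules before fetching a page. For instance, Python's standard-library urllib.robotparser can evaluate them; the code below parses the sample rules above directly (example.com is a placeholder host):

    from urllib.robotparser import RobotFileParser

    # The sample rules from above, parsed in place rather than
    # fetched over the network.
    rules = """\
    User-agent: Googlebot
    Disallow: /private/

    User-agent: *
    Disallow: /tmp/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Ask whether a given user-agent may fetch a given URL.
    print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
    print(rp.can_fetch("*", "https://example.com/index.html"))                 # True

Note that robots.txt is purely advisory: it relies on crawlers choosing to honor it and does not technically prevent access.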

Source: Wikipedia