This utility validates the syntax of a robots.txt file. A robots.txt file (not robot.txt, a common mistake) instructs spiders and crawlers which parts of your website may be crawled for search engine indexing. The robots.txt file must be placed in the root directory of a website; robots.txt files placed in subdirectories are not valid.
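For reference, a minimal robots.txt might look like this (the directory names and sitemap URL are placeholders, not values the validator expects):

```
User-agent: *
Disallow: /admin/
Allow: /public/
Sitemap: https://example.com/sitemap.xml
```

This file would live at https://example.com/robots.txt, not in any subdirectory.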
When the option "Deep check" is enabled, the validator also tries to verify that the values of the sitemap:, allow: and disallow: directives actually exist. This is handy for detecting misspelled, removed, or unintentionally unprotected directories on a website.
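The first half of such a deep check can be sketched as follows. This is an illustrative sketch, not the validator's actual code: it extracts the values of sitemap:, allow: and disallow: directives, after which a real tool would probe each value (e.g. with an HTTP HEAD request) to confirm it exists.

```python
def extract_checkable_values(robots_txt: str) -> dict:
    """Collect the values of sitemap:, allow: and disallow: lines
    so each one can later be checked for existence."""
    values = {"sitemap": [], "allow": [], "disallow": []}
    for raw_line in robots_txt.splitlines():
        line = raw_line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        # partition() splits at the first colon only, so URLs
        # like https://... in sitemap values stay intact
        field, _, value = line.partition(":")
        field = field.strip().lower()
        value = value.strip()
        if field in values and value:
            values[field].append(value)
    return values

sample = """
User-agent: *
Disallow: /admin/      # if this path does not exist, it may be misspelled
Allow: /public/
Sitemap: https://example.com/sitemap.xml
"""

print(extract_checkable_values(sample))
# → {'sitemap': ['https://example.com/sitemap.xml'],
#    'allow': ['/public/'], 'disallow': ['/admin/']}
```

A deep check would then iterate over these collected values and report any path or sitemap URL that returns a 404, flagging likely misspellings or stale entries.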