I have a custom robots.txt for my site. I want to disallow crawling of sitemap.xml, so my question is: is it enough to write the following?

    Disallow: /sitemap.xml
This robots.txt:

    User-agent: *
    Disallow: /sitemap.xml

allows all bots to crawl everything on your host except URLs whose paths start with /sitemap.xml. So these URLs, for example, should not get crawled:
- http://example.com/sitemap.xml
- http://example.com/sitemap.xml.tar.gz
- http://example.com/sitemap.xml/
- http://example.com/sitemap.xml/foo

So if your sitemap is at http://example.com/sitemap.xml, conforming bots won't crawl it.
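If you want to verify the matching yourself, here is a minimal sketch using Python's standard-library urllib.robotparser, which applies the same prefix matching that conforming bots use. The example.com URLs are the ones from above; the bot name "MyBot" is just a placeholder.

    from urllib import robotparser

    # The same rules as above, fed to the parser line by line.
    rules = [
        "User-agent: *",
        "Disallow: /sitemap.xml",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(rules)

    # Paths starting with /sitemap.xml are blocked; anything else stays crawlable.
    for url in [
        "http://example.com/sitemap.xml",
        "http://example.com/sitemap.xml.tar.gz",
        "http://example.com/sitemap.xml/foo",
        "http://example.com/other-page.html",
    ]:
        print(url, "->", "allowed" if rp.can_fetch("MyBot", url) else "disallowed")

The first three URLs come back disallowed and the last one allowed, matching the list above.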