html · web-crawler · seo · google-search · google-crawlers

Disallow only the homepage ( / ) and allow all other pages in robots.txt


I need to prevent Google's web crawler from crawling only my homepage, located at /.

But I need to allow all other pages to be crawled. How can I achieve that?

I tried doing:

User-agent: *
Disallow: /

User-agent: *
Disallow:

But it's not working.


Solution

  • You need to use the following for this:

    User-agent: *
    Disallow: /$
    

    The URL's path is compared against each Disallow directive as a prefix match, which is why your first rule, Disallow: /, blocked the entire site. The $ designates the end of the match pattern, so Disallow: /$ matches only https://example.com/ and not https://example.com/foo. Note that $ is an extension supported by Google's crawler (and now standardized in RFC 9309); very old or minimal parsers that follow only the original robots.txt draft may not recognize it.
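To see why the $ anchor makes the difference, here is a minimal sketch of Google-style rule matching. This is not Google's actual implementation, just an illustration of the prefix-match-plus-$ semantics described above (note that Python's built-in urllib.robotparser does not support the $ extension, so it can't be used to test this rule):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Simplified sketch of Google-style robots.txt rule matching.

    Rules are prefix matches; '*' matches any run of characters,
    and a trailing '$' anchors the rule to the end of the path.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape literal characters, turning '*' into a regex wildcard.
    regex = ".*".join(re.escape(part) for part in rule.split("*"))
    if anchored:
        regex += "$"
    # re.match anchors at the start of the path, giving prefix semantics.
    return re.match(regex, path) is not None

# 'Disallow: /' is a prefix match, so it blocks every path:
print(rule_matches("/", "/"))      # True
print(rule_matches("/", "/foo"))   # True

# 'Disallow: /$' only matches the homepage exactly:
print(rule_matches("/$", "/"))     # True
print(rule_matches("/$", "/foo"))  # False
```

Running this shows that /$ disallows only the homepage while leaving every other path crawlable.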