
How to disallow a particular subfolder in robots.txt?


Here is my example URL pattern:

www.example.com/folder/subfolder/*

I want to allow www.example.com/folder/* but specifically disallow

www.example.com/folder/subfolder/*

How can I implement this in the robots.txt file?


Solution

  • You can disallow a specific subfolder by adding these lines to robots.txt:

    User-agent: *
    Disallow: /folder/subfolder/

    Everything else under /folder/ remains allowed by default, since crawlers treat any path not matched by a Disallow rule as crawlable.
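You can sanity-check the rules before deploying them with Python's standard-library robots.txt parser. This is a minimal sketch; the URLs are the hypothetical examples from the question.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content matching the rule above
robots_txt = """User-agent: *
Disallow: /folder/subfolder/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages directly under /folder/ remain crawlable
print(parser.can_fetch("*", "https://www.example.com/folder/page.html"))            # True
# Pages under /folder/subfolder/ are blocked
print(parser.can_fetch("*", "https://www.example.com/folder/subfolder/page.html"))  # False
```

Note that robots.txt is advisory: well-behaved crawlers such as Googlebot honor it, but it does not enforce access control.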