robots.txt

Disallow PHP page including all URL parameters with robots.txt


How would I disallow all dynamic pages within my robots.txt?

E.g.

I would like page.php AND all possible dynamic versions to be disallowed.

At the moment I have

User-Agent: *
Disallow: /page.php

But this still allows e.g. page.php?hello=there


Solution

  • What you've already got should block all access to /page.php for all search engines which respect robots.txt, regardless of whether any query string parameters are provided. Disallow rules are matched as prefixes against the URL path, so Disallow: /page.php also covers /page.php?hello=there.

    Don't forget robots.txt is only for robots :-) If you're trying to block human users from accessing the page, you'll need to use .htaccess or similar.
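If blocking human visitors (not just crawlers) really is the goal, one option is a deny rule in Apache's .htaccess — a minimal sketch, assuming Apache 2.4 syntax and the page.php filename from the question:

```apache
# Deny all requests for page.php (with or without query string),
# since robots.txt is advisory and only honoured by well-behaved bots.
<Files "page.php">
    Require all denied
</Files>
```

On Apache 2.2 the equivalent would use Order deny,allow / Deny from all inside the same <Files> block. The query string is irrelevant here too, since <Files> matches on the filename alone.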