I have a subdomain "klient" for testing our clients' websites, and I don't want it to be indexed. I have set this in the robots.txt in the root of our main site:
User-agent: *
disallow: /subdom/klient/*
But I'm not sure it really works, because I have now found a testing site in Google results.
Where could the problem be, or how can we stop Google and other bots from indexing this folder?
Thank you
You can do that by placing a robots.txt in the root directory of your subdomain. So on klient.example.com, place a robots.txt with the following content:
User-agent: *
Disallow: /
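As a quick sanity check, here is a minimal sketch (assuming the subdomain is klient.example.com, which is just an illustrative name) that uses Python's standard urllib.robotparser to confirm the robots.txt served at the subdomain root really blocks crawlers from every path:

from urllib.robotparser import RobotFileParser

# robots.txt must live at the root of the subdomain, not the main domain
parser = RobotFileParser()
parser.set_url("https://klient.example.com/robots.txt")
parser.read()  # fetch and parse the file

# With "Disallow: /" in place, both checks should print False
print(parser.can_fetch("Googlebot", "https://klient.example.com/"))
print(parser.can_fetch("*", "https://klient.example.com/some/test-page.html"))

Keep in mind that robots.txt only stops crawling; pages that are already in the index still have to be removed separately, as described below.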
If you want to remove pages that are already indexed, add your subdomain as a new property in Google Search Console and request removal of the indexed pages there.