So, I have a development server running openSUSE with Apache. There are loads of projects on this server, some of which have an online/live counterpart.
Each project has a separate subdomain.
How do I rewrite all requests for robots.txt to one "default" file, server-wide?
My goal is to prevent the development subdomains from being indexed by search bots.
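For context, the generic robots.txt I want every subdomain to serve would just be a blanket disallow, something like:

    # Block all well-behaved crawlers from the entire site
    User-agent: *
    Disallow: /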
I believe there is no easier way than to set an Alias in every VirtualHost directive:
Alias /robots.txt /home/path/to/generic/robots.txt
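For example, a minimal sketch of one vhost (the ServerName, DocumentRoot, and the path to the shared file are placeholders; on Apache 2.4 you may also need to grant access to the directory holding that file):

    <VirtualHost *:80>
        ServerName project1.example.com
        DocumentRoot /srv/www/project1

        # Serve the shared robots.txt instead of any per-project copy
        Alias /robots.txt /home/path/to/generic/robots.txt

        # Apache 2.4: allow access to the directory containing the shared file
        <Directory /home/path/to/generic>
            Require all granted
        </Directory>
    </VirtualHost>

You would repeat the Alias (and, if needed, the Directory block) in each VirtualHost.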
I'm happy to stand corrected by a truly global solution, though.