Dynamic robots.txt – block search engines for all websites.

If you have multiple websites on a server, or wildcard websites, and you don't want to create 1000 robots.txt files but you still want to deny search engines access, you can use the following elegant solution:

Create a single robots.txt file somewhere on your server, e.g. at /path/to/robots.txt, with the content below. It tells every crawler (User-agent: *) that the entire site (Disallow: /) is off limits:

User-agent: *
Disallow: /
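For example, on a typical Linux server you could create the file like this (a minimal sketch; /path/to/robots.txt is a placeholder for whatever location you choose):

# Write the two-line robots.txt to the shared location
printf 'User-agent: *\nDisallow: /\n' > /path/to/robots.txt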

Add the following to your httpd.conf:

# Exclude all robots
<Location "/robots.txt">
    # Make sure no other handler (PHP, CGI, etc.) intercepts this URL
    SetHandler None
</Location>

# Serve the shared file for /robots.txt on every site
Alias /robots.txt /path/to/robots.txt
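Before restarting, it's a good idea to check the new configuration for syntax errors. A quick check, assuming the standard apachectl utility is available on your server:

# Validate the configuration without restarting
apachectl configtest

If it reports "Syntax OK", the configuration is safe to load.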

Restart your httpd server and you’re good to go!
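To verify the result, request robots.txt from a couple of your sites. A quick check with curl (example.com and example.org stand in for your own hostnames):

curl http://example.com/robots.txt
curl http://example.org/robots.txt

Both requests should return the two-line Disallow rule above.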
