If you host many websites on one server (or use wildcard virtual hosts) and you want to deny search engines access to all of them, you don’t need to create a thousand robots.txt files. There is a more elegant solution:
First, create a single robots.txt file somewhere on your server with the content below, which tells all crawlers to stay away:

User-agent: *
Disallow: /
Add the following to your httpd.conf:
# Exclude all robots
Alias /robots.txt /path/to/robots.txt
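Note that on Apache 2.4 and later, the directory containing the aliased file usually needs an explicit access grant as well, or requests for /robots.txt may return 403. A minimal sketch, assuming the file sits in the placeholder directory /path/to:

<Directory "/path/to">
    # Apache 2.4+ syntax: let clients read the aliased robots.txt
    Require all granted
</Directory>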
Restart your httpd server and you’re good to go!