Prevent search engines from indexing a development server with Nginx "add_header"

I’ve got a server set up for development purposes only. I created it with ee site create devserver.tk --wpfc; it’s an ordinary install plus a Let’s Encrypt update.

I’d like to prevent this development server from being indexed by search engines. Where exactly should I place the following directive to do that?

add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";

I created a new location / {} block containing the directive and placed it inside the main server block in /etc/nginx/sites-available/devserver.tk.conf.
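
The relevant part of that file now looks roughly like this (the server_name and the surrounding directives are just placeholders for whatever EasyEngine generated; the location / block with the header is the part I added):

server {
    server_name devserver.tk www.devserver.tk;

    # ... rest of the EasyEngine-generated configuration ...

    # block I added:
    location / {
        add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
    }
}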

Then I restarted nginx with no errors.

But when I make a curl request to devserver.tk, the newly added header doesn’t appear in the response.
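
The check itself is nothing fancy, just a header-only request along these lines (adjust the scheme if you test over plain HTTP):

curl -I https://devserver.tk/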

I’d really appreciate any help regarding this issue.

You can never force search engines not to index a site, but what you suggested should work. The easier way is to just use the option in WordPress (under Settings → Reading, “Discourage search engines from indexing this site”) to ask them not to index it. I have done this on many installs and so far all is good.

The easiest way to ensure search engines don’t index your dev site is to set up basic authentication.

Here’s a guide https://www.digitalocean.com/community/tutorials/how-to-set-up-password-authentication-with-nginx-on-ubuntu-14-04
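
If you want a quick idea of what’s involved before reading the guide, a minimal sketch looks like this (the username devuser and the password-file path are just examples; adapt them to your setup):

# create the password file (htpasswd comes from the apache2-utils package)
sudo htpasswd -c /etc/nginx/.htpasswd devuser

and then inside the dev site’s server block:

server {
    # ... existing EasyEngine-generated configuration ...

    location / {
        auth_basic           "Restricted dev site";
        auth_basic_user_file /etc/nginx/.htpasswd;
        # keep the existing try_files / caching directives here
    }
}

Reload Nginx afterwards (nginx -t && service nginx reload) and the browser will prompt for the credentials.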

Thanks for the input!

I’ve already enabled that option in WordPress. I’m confident the option alone will help, but I also use the WordPress SEO plugin (by Yoast), which offers a tool for editing the robots.txt file within the WP admin.

I ended up using the following rule in the robots.txt file; perhaps that will help too.

User-agent: *
Disallow: /
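
To double-check what’s actually being served, you can request the file directly (adjust the domain to your install):

curl https://devserver.tk/robots.txt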

Thanks for suggesting the article; it really looks promising. I’ll have a closer look at it.