Robots.txt with Nginx on NixOS

A robots.txt file can usually be served by your web framework of choice (in my case Servant or Yesod). But you can also handle this request directly in your reverse proxy, Nginx.
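
If you handle it in the application instead, a minimal Servant sketch might look like this (illustrative only, using servant-server and warp; port 8000 simply matches the proxyPass target used below):

  {-# LANGUAGE DataKinds #-}
  {-# LANGUAGE OverloadedStrings #-}
  {-# LANGUAGE TypeOperators #-}

  module Main where

  import Data.Proxy (Proxy (..))
  import Data.Text (Text)
  import Network.Wai.Handler.Warp (run)
  import Servant

  -- A single route that serves /robots.txt as plain text.
  type RobotsAPI = "robots.txt" :> Get '[PlainText] Text

  -- Disallow everything, mirroring the Nginx example below.
  server :: Server RobotsAPI
  server = pure "User-agent: *\nDisallow: /\n"

  main :: IO ()
  main = run 8000 (serve (Proxy :: Proxy RobotsAPI) server)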

To serve this response from Nginx instead, we just need to add a new entry to our configuration.nix, using the extraConfig option of a location:

  services.nginx = {
    virtualHosts."mydomain.com" = {
      enableACME = true;
      forceSSL = true;

      locations."/robots.txt" = {
        extraConfig = ''
          return 200 "User-agent: *\nDisallow: /";
        '';
      };

      locations."/" = {
        proxyPass = "http://localhost:8000";
      };
    };
  };

In this example I used a configuration that forbids bots any access to the domain (I usually do this for subdomains of my projects). After making your modifications, rebuild so the changes can be verified and applied:

sudo nixos-rebuild switch

And there you have it: your server will now respond to requests for /robots.txt with the content you defined. Super simple!

Thank you for reading this far! Let’s connect. You can @ me on X (@debilofant) with comments, or feel free to follow. Please like/share this article so that it reaches others as well.
