In this post we are going to look at multiple ways of adding a robots.txt file to your Django project. robots.txt, part of the Robots Exclusion Protocol, is a plain text file that tells search engine robots (or bots) which parts of your website they may crawl and which URLs they should stay away from. Before a search engine bot crawls the pages of your website, it first fetches the robots.txt file from your site root to check which URLs it is allowed to visit. There are several ways to add a robots.txt file to a Django project; the following are some of the quickest and easiest.

Serve robots.txt dynamically via a response in the URLconf

The quickest and easiest way to serve robots.txt from a Django project is to return a response directly from your urlpatterns for the URL that points to robots.txt. For example:
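A minimal sketch of what this could look like in urls.py is shown below; the rules themselves are placeholders, so substitute your own:

```python
# urls.py -- a minimal sketch; the rules below are placeholders.
from django.http import HttpResponse
from django.urls import path

ROBOTS_TXT = "User-agent: *\nDisallow: /admin/\n"

urlpatterns = [
    # Return the rules as a plain text response for /robots.txt
    path(
        "robots.txt",
        lambda request: HttpResponse(ROBOTS_TXT, content_type="text/plain"),
    ),
]
```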

Here we send a plain text response containing the rules directly for the robots.txt URL. This approach works well when you only have a simple rule or two. For multiple or more complex rules you may want to look into the other options below for serving robots.txt.

Serve robots.txt using TemplateView

If you have multiple rules defined for your robots.txt, you may want to use a TemplateView: add a robots.txt template to your templates directory and put the rules in it. For example:
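A sketch of the URL configuration could look like the following; the template name and its location in your templates directory are assumptions:

```python
# urls.py -- a sketch using TemplateView; assumes templates/robots.txt exists.
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # Render templates/robots.txt as plain text for /robots.txt
    path(
        "robots.txt",
        TemplateView.as_view(template_name="robots.txt", content_type="text/plain"),
    ),
]
```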

Once you have defined the above, simply add a robots.txt file to your templates directory and define your rules inside it, for example:
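The rules below are purely illustrative; replace the disallowed paths and the sitemap URL with your own:

```
User-agent: *
Disallow: /admin/
Disallow: /accounts/

Sitemap: https://www.example.com/sitemap.xml
```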

Serve robots.txt directly from your web server

Another option is to serve the robots.txt file directly through your web server. The following is a sample configuration for the Apache web server.

Add the following to your Apache configuration.
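The sketch below targets Apache 2.4, and the path /var/www/mysite is an assumption; point it at wherever your robots.txt actually lives:

```apache
# Sketch for Apache 2.4 -- /var/www/mysite is an assumed project path.
Alias /robots.txt /var/www/mysite/robots.txt

<Directory /var/www/mysite>
    <Files robots.txt>
        Require all granted
    </Files>
</Directory>
```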

Then move the robots.txt file to the root of your project with read permissions for the web server.