Create A Dynamic Robots.txt In Rails

Posted By Weston Ganger

I needed a dynamic robots.txt because I was serving two different sites from one Rails application. Another common use case is blocking robots in the staging environment. In this article we will set this up.

First we create the view template that will be rendered for robots.txt requests:

# app/views/pages/robots.html.erb
<% if Rails.env.production? %>
  User-Agent: *
  Allow: /
  Disallow: /admin
  Sitemap: http://www.mysite.com/sitemap.xml
<% else %>
  User-Agent: *
  Disallow: /
<% end %>
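The view alone is not enough; a route and a controller action are needed to actually serve it at /robots.txt. A minimal sketch, assuming a PagesController (the controller and action names are inferred from the app/views/pages/ path above):

# config/routes.rb
get "robots.txt" => "pages#robots"

# app/controllers/pages_controller.rb
class PagesController < ApplicationController
  def robots
    # Skip the application layout and serve plain text, so
    # robots.html.erb comes back as a bare robots.txt body.
    render layout: false, content_type: "text/plain"
  end
end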

Now robots will be disallowed from the entire app unless it is running in the production environment.
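You can exercise the same conditional outside Rails with the standard library's ERB engine; here is a minimal sketch, substituting a plain `production` flag for `Rails.env.production?`:

```ruby
require "erb"

# Same logic as the view, with a boolean in place of Rails.env.production?.
ROBOTS_TEMPLATE = <<~ERB
  <% if production %>
  User-Agent: *
  Allow: /
  Disallow: /admin
  Sitemap: http://www.mysite.com/sitemap.xml
  <% else %>
  User-Agent: *
  Disallow: /
  <% end %>
ERB

def robots_body(production)
  # trim_mode "<>" drops the blank lines the <% ... %> tags would leave behind.
  ERB.new(ROBOTS_TEMPLATE, trim_mode: "<>").result(binding)
end

puts robots_body(true)   # production rules
puts robots_body(false)  # blanket Disallow: /
```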

Article Topic: Software Development - Ruby / Rails

Date: July 28, 2016