Roboto is a Rails engine that lets you serve environment-specific robots.txt files in your Rails 4.2+ application.
Don't let crawlers access your staging environment: having it indexed is bad for SEO.
You can add it to your Gemfile with:
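A minimal entry, assuming the gem is published under the name `roboto`:

```ruby
gem 'roboto'
```

Then run `bundle install`.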
After installing the gem, run the generator:
```shell
rails generate roboto:install
```
If you already have a robots.txt, it will be kept for your production environment in config/robots/production.txt.
You can now specify environment-specific robots.txt files in config/robots/. By default, a robots.txt that disallows crawlers from accessing your site is generated for every environment.
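For instance, a file for a non-production environment (the path `config/robots/staging.txt` here is an illustrative example) would typically contain the standard disallow-all directives:

```
User-agent: *
Disallow: /
```

This tells all well-behaved crawlers to skip the entire site in that environment.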
- Fork it
- Create your feature branch (`git checkout -b my-new-feature`)
- Commit your changes (`git commit -am 'Added some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request