ruby-on-rails rollbar

How to prevent Rollbar from reporting SEO crawler activity?


I have set up Rollbar in my Rails application. It keeps reporting ActiveRecord::RecordNotFound errors caused by SEO crawlers (e.g. Googlebot, Baiduspider, findxbot, etc.) requesting deleted posts.

How can I prevent Rollbar from reporting SEO crawler activity?


Solution

  • Looks like you are using rollbar-gem, so you'd want to use Rollbar::Ignore to tell Rollbar to ignore errors that were caused by a spider:

    handler = proc do |options|
      raise Rollbar::Ignore if is_crawler_error(options)
    end

    Rollbar.configure do |config|
      config.before_process << handler
    end
    

    where is_crawler_error is a method you define that detects whether the request that led to the error came from a crawler.
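    A minimal sketch of such a helper, assuming the error came through Rollbar's Rack/Rails middleware so the request headers are available under options[:scope][:request] (the bot names in the pattern are examples; verify the exact payload keys against your own Rollbar data):

    ```ruby
    # User-Agent substrings for common crawlers; extend this list with
    # the bots you actually see in your Rollbar items.
    CRAWLER_UA_PATTERN = /googlebot|baiduspider|bingbot|yandex|findxbot/i

    def is_crawler_error(options)
      # before_process handlers receive an options hash; when the error
      # was captured by the Rack/Rails middleware, request data
      # (including headers) is nested under options[:scope][:request].
      scope   = options[:scope]   || {}
      request = scope[:request]   || {}
      headers = request[:headers] || {}
      ua = headers['User-Agent'].to_s
      !!(ua =~ CRAWLER_UA_PATTERN)
    end
    ```

    Matching on the User-Agent header is a heuristic: well-behaved crawlers identify themselves, but anything spoofing a browser UA will still be reported.
    
    
    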

    If you are using rollbar.js to catch errors in client-side JavaScript, you can use the checkIgnore option to filter out client-side errors caused by bots:

    var _rollbarConfig = {
      // current config...
      checkIgnore: function(isUncaught, args, payload) {
        if (window.navigator.userAgent && window.navigator.userAgent.indexOf('Baiduspider') !== -1) {
          // ignore Baidu spider
          return true;
        }
        // no other ignores
        return false;
      }
    };