
logstash-output-webhdfs Failed to flush outgoing items


I want to send data to HDFS through logstash-output-webhdfs. My Logstash config is:

input {
    file {
        path => "/root/20160315.txt"
    }
}

output {
    webhdfs {
        host => "x.x.x.x"
        path => "/user/logstash/dt=%{+YYYY-MM-dd}/logstash-%{+HH}.log"
        user => "logstash"
    }
    stdout {
        codec => rubydebug
    }
}

but I get the following error:

Failed to flush outgoing items {:outgoing_count=>1, :exception=>"WebHDFS::ServerError", :backtrace=>["/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:351:in `request'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:270:in `operate_requests'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:73:in `create'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-2.0.4/lib/logstash/outputs/webhdfs.rb:211:in `write_data'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-2.0.4/lib/logstash/outputs/webhdfs.rb:195:in `flush'", "org/jruby/RubyHash.java:1342:in `each'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-2.0.4/lib/logstash/outputs/webhdfs.rb:183:in `flush'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:159:in `buffer_receive'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-2.0.4/lib/logstash/outputs/webhdfs.rb:166:in `receive'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/outputs/base.rb:83:in `multi_receive'", "org/jruby/RubyArray.java:1613:in `each'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/outputs/base.rb:83:in `multi_receive'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:305:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:305:in `output_batch'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:236:in `worker_loop'", "/root/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:205:in `start_workers'"], :level=>:warn}

Has anyone ever met this problem?


Solution

  • It seems you should set the user option of logstash-output-webhdfs to the HDFS superuser, which is the user you used to start HDFS. For example, if you run start-dfs.sh as root, then the user option should be root (see the sketch after this list).

    In addition, you should edit /etc/hosts on the Logstash host and add entries for the HDFS cluster nodes. WebHDFS answers a write request with a redirect to a datanode's hostname, so the write fails if the Logstash host cannot resolve those hostnames (a hypothetical example follows below).
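
    A minimal sketch of the corrected output block, assuming HDFS was started as root (the host and path values are kept from the question):

    output {
        webhdfs {
            host => "x.x.x.x"
            path => "/user/logstash/dt=%{+YYYY-MM-dd}/logstash-%{+HH}.log"
            # assumption: HDFS was started as root, so write as that superuser
            user => "root"
        }
    }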
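
    And a hypothetical /etc/hosts addition; the hostnames and IPs below are placeholders, so substitute the actual ones from your cluster:

    # placeholder entries for the HDFS cluster nodes
    10.0.0.1    namenode1
    10.0.0.2    datanode1
    10.0.0.3    datanode2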