[2018-06-07T10:26:37,363][ERROR][logstash.outputs.webhdfs ] Max write retries reached. Events will be discarded. Exception: {"RemoteException":{"message":"Failed to APPEND_FILE /logstash/dt=2018-06-07/logstash-02.log for DFSClient_NONMAPREDUCE_-692957599_23 on 192.168.0.3 because this file lease is currently owned by DFSClient_NONMAPREDUCE_382768740_23 on 192.168.0.3","exception":"RemoteException","javaClassName":"org.apache.hadoop.ipc.RemoteException"}}
[2018-06-07T10:26:40,009][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x415fbb96 sleep>"}
For all general issues, please provide the following details for fast resolution:
Version: Logstash 6.2.4
Operating System: CentOS 6.5
Config File (if you have sensitive info, please remove it):
port => "14000"
use_httpfs => "true"
path => "/logstash/dt=%{+YYYY-MM-dd}/logstash-%{+HH}-%{+mm}-%{+ss}.log"
user => "hadoop"
Sample Data:
Steps to Reproduce:
The first write to a file succeeds, but the second write fails when the plugin tries to append to the file on HDFS.
Our HDFS setup does not support appending to files, so the plugin should call the file's close method after writing data to the HDFS file.