
Ticket Expired : Logstash + WebHdfs + Kerberos #32

Open
IbaiAlonso opened this issue Oct 19, 2017 · 3 comments
@IbaiAlonso
Hi,

We are trying to use this plugin, but we are facing a problem when Kerberos is enabled (when Kerberos is disabled, it works fine).

[2017-10-19T13:11:27,226][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/netent.template.json
[2017-10-19T13:11:27,249][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://std-elasticsearch-poc.nix.cydmodule.com:9200"]}
[2017-10-19T13:11:28,097][ERROR][logstash.outputs.webhdfs ] Webhdfs check request failed. (namenode: 10.72.19.17:50070, Exception: gss_init_sec_context did not return GSS_S_COMPLETE: Unspecified GSS failure. Minor code may provide more information
Ticket expired
)
[2017-10-19T13:11:28,109][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"#<LogStash::OutputDelegator:0x3dc3e8a @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x74150eec @Metric=#<LogStash::Instrument::Metric:0x5bd75dae @collector=#<LogStash::Instrument::Collector:0x494171d5 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x2b0b2e19 @store=#<Concurrent::Map:0x000000000688bc entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x77b537fa, @fast_lookup=#<Concurrent::Map:0x000000000688c0 entries=760 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :"37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177"]>, @Metric=#<LogStash::Instrument::NamespacedMetric:0x54632a4e @Metric=#<LogStash::Instrument::Metric:0x5bd75dae @collector=#<LogStash::Instrument::Collector:0x494171d5 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x2b0b2e19 @store=#<Concurrent::Map:0x000000000688bc entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x77b537fa, @fast_lookup=#<Concurrent::Map:0x000000000688c0 entries=760 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @logger=#<LogStash::Logging::Logger:0x783e44fd @logger=#Java::OrgApacheLoggingLog4jCore::Logger:0x27c13500>, @out_counter=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :"37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177", :events] key: out value: 0, @in_counter=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :"37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177", :events] key: in value: 0, @strategy=#<LogStash::OutputDelegatorStrategies::Legacy:0x214f33f0 @worker_count=1, @Workers=[<LogStash::Outputs::WebHdfs codec=><LogStash::Codecs::JSON id=>"json_9fd7e005-e137-4589-b17d-2fcfc9d3ba14", enable_metric=>true, charset=>"UTF-8">, host=>"10.72.19.17", port=>50070, 
path=>"/user/logstash/%{type}-%{site}-%{+YYYY.MM.dd}-%{[@metadata][thread_id]}.log", user=>"hdfs", flush_size=>500, compression=>"snappy", idle_flush_time=>10, retry_interval=>10, workers=>1, retry_times=>10, single_file_per_thread=>true, use_kerberos_auth=>true, kerberos_keytab=>"/etc/logstash/conf.d/hdfs/spnego.service.keytab", id=>"37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177", enable_metric=>true, standby_host=>false, standby_port=>50070, open_timeout=>30, read_timeout=>30, use_httpfs=>false, retry_known_errors=>true, snappy_bufsize=>32768, snappy_format=>"stream", use_ssl_auth=>false>], @worker_queue=#SizedQueue:0x2ca172cf>, @id="37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177", @time_metric=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :"37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177", :events] key: duration_in_millis value: 0, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x5c93193 @Metric=#<LogStash::Instrument::Metric:0x5bd75dae @collector=#<LogStash::Instrument::Collector:0x494171d5 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x2b0b2e19 @store=#<Concurrent::Map:0x000000000688bc entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x77b537fa, @fast_lookup=#<Concurrent::Map:0x000000000688c0 entries=760 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :"37f78fe6277c84bb2c35f6ec985bd6afbdfbb97c-177", :events]>, @output_class=LogStash::Outputs::WebHdfs>", :error=>"gss_init_sec_context did not return GSS_S_COMPLETE: Unspecified GSS failure. Minor code may provide more information\nTicket expired\n"}
[2017-10-19T13:11:30,184][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<WebHDFS::KerberosError: gss_init_sec_context did not return GSS_S_COMPLETE: Unspecified GSS failure. Minor code may provide more information
Ticket expired

output {
  webhdfs {
    codec => json
    host => "xxxxxxx"
    port => 50070
    path => "/user/logstash/%{type}-%{site}-%{+YYYY.MM.dd}-%{[@metadata][thread_id]}.log"
    user => "hdfs"
    flush_size => 500
    compression => "snappy"
    idle_flush_time => 10
    retry_interval => 10
    workers => 1
    retry_times => 10
    single_file_per_thread => true
    use_kerberos_auth => true
    kerberos_keytab => "/etc/logstash/conf.d/hdfs/hdfs.service.keytab"
  }
}

I have some questions regarding the ticket and the keytab. Do we need a dedicated logstash account as a Kerberos principal? What is the recommended approach for setting this up with Kerberos?
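For what it's worth, a keytab only lets a client authenticate non-interactively; any ticket obtained from it still has a finite lifetime, so a "Ticket expired" error usually means the credential cache was not refreshed. A common workaround is to verify the keytab and re-run kinit on a schedule. This is a minimal sketch, and the principal name `hdfs/[email protected]` and realm are placeholders we made up, not values taken from this issue:

```shell
# Verify the keytab actually contains the principal you expect:
klist -kt /etc/logstash/conf.d/hdfs/hdfs.service.keytab

# Obtain an initial ticket from the keytab (principal name is a placeholder):
kinit -kt /etc/logstash/conf.d/hdfs/hdfs.service.keytab hdfs/[email protected]

# Inspect the current ticket cache and its expiry time:
klist

# Refresh the ticket before it expires, e.g. every 6 hours via cron,
# running as the same user that runs Logstash:
#   0 */6 * * * kinit -kt /etc/logstash/conf.d/hdfs/hdfs.service.keytab hdfs/[email protected]
```

Whether the plugin reads the keytab directly or relies on the user's ticket cache may depend on the plugin version, so treat the cron-based renewal as one possible setup rather than the required one.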

OS: Red Hat Enterprise Linux Server release 7.4
Elastic version: 5.6.1

@sgalinma

Please, could you explain how you solved this problem?

@IbaiAlonso (Author)

Please email me: [email protected]

@kumaravel29

Facing the same issue.
Can you please share the solution?
