I am using the Okta module for Filebeat with the ECK operator. Logs are being fetched and shipped to Elasticsearch properly, but I am also trying to drop some fields before shipping them, and all the fields are still being sent. This could be a mistake in my config, which I am pasting below. Please help with this issue.
Filebeat modules use Elasticsearch Ingest Node to apply most of the transformations to the events. So it is likely that the field names you are referencing do not come into existence until the data passes through Elasticsearch. The pipelines it uses are here.
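To see what those transformations are, you can list the pipelines the module loaded from Kibana Dev Tools. The name pattern below is a guess at the usual filebeat-&lt;version&gt;-&lt;module&gt;-&lt;fileset&gt;-pipeline convention, so adjust it for your Filebeat version:

```
// List the ingest pipelines loaded for the Okta module. The wildcard is a
// guess at the usual filebeat-<version>-okta-<fileset>-pipeline naming.
GET _ingest/pipeline/filebeat-*okta*
```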
One option to customize the behavior would be to add a final_pipeline to your filebeat data stream that performs the additional transformations that you want.
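As a rough sketch (the pipeline name drop-okta-fields and the field names are only placeholders, not something the module defines), that could look like this in Kibana Dev Tools:

```
// A pipeline that removes the unwanted fields after the module pipeline has
// created them. The field names below are placeholders.
PUT _ingest/pipeline/drop-okta-fields
{
  "processors": [
    {
      "remove": {
        "field": ["okta.debug_context", "okta.request"],
        "ignore_missing": true
      }
    }
  ]
}

// Run it as the final pipeline on the Filebeat data stream. Setting
// index.final_pipeline in the index template is preferable so new backing
// indices inherit it; applying it to existing indices looks like this:
PUT filebeat-*/_settings
{
  "index.final_pipeline": "drop-okta-fields"
}
```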
Another option would be to modify the included pipelines to do what you want.
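Again only a sketch, with an illustrative pipeline name that will vary by Filebeat version: fetch the module's pipeline, append a remove processor to its processors array, and PUT the full definition back (only the appended processor is shown; the original processors from the GET response must be copied in where the comment is). Keep in mind Filebeat may reinstall the original pipeline on upgrade or when filebeat setup runs again.

```
// Fetch the pipeline the Okta module installed (name is illustrative).
GET _ingest/pipeline/filebeat-8.6.2-okta-system-pipeline

// PUT the same definition back with a remove processor appended. Copy all of
// the original processors from the GET response; only the addition is shown.
PUT _ingest/pipeline/filebeat-8.6.2-okta-system-pipeline
{
  "description": "Pipeline for parsing Okta system logs",
  "processors": [
    // ... original processors from the GET response go here ...
    { "remove": { "field": ["okta.debug_context"], "ignore_missing": true } }
  ]
}
```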