
Kape modules don't include hostname #357

Open
antmar904 opened this issue Feb 11, 2025 · 10 comments

@antmar904

Hi

When running the supported KAPE modules on an endpoint and outputting the supported SOF-ELK JSON, the output does not include the hostname. Is there a way we can add this?

@philhagen
Owner

Could you confirm which version you're running? If it's a version before 20241217, please upgrade to that version, as the KAPE parsers have been entirely overhauled (and aligned to ECS field naming). If you're on that version, can you please provide sample records that are not parsing as expected?

@philhagen philhagen self-assigned this Feb 11, 2025
@antmar904
Author

I am running 20241217

@antmar904
Author

After I run the KAPE modules, I then scp the JSON files to /logstash/kape.
Is that correct?
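For reference, this transfer step can be sketched as below. All paths and the VM address are placeholders, not details from this thread; the destination on a real SOF-ELK VM is the /logstash/kape/ ingest directory. The network copy is simulated locally with temporary directories so the steps are runnable as-is:

```shell
# Over the network this would be something like (placeholder user/host):
#   scp kape_output/*.json elk_user@sof-elk-vm:/logstash/kape/
# Local simulation of the same copy, using temp dirs as stand-ins:
SRC=$(mktemp -d)    # stands in for KAPE's module output directory
DEST=$(mktemp -d)   # stands in for /logstash/kape on the SOF-ELK VM
echo '{"sample":"record"}' > "$SRC/mft_output.json"

cp "$SRC"/*.json "$DEST"/
ls "$DEST"
```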

@philhagen
Owner

Thanks for confirming. Yes, .json files go into /logstash/kape/, and if they're being even partially parsed, then the pipeline is working. But I'd need at least a few sample records to know if/how the parser is failing and if/how to fix it.

@antmar904
Author

antmar904 commented Feb 13, 2025

OK, I tried something: I removed all files in /logstash/kape and ran sof-elk-clear.py. I then uploaded a new JSON file to /logstash/kape, but nothing is showing when trying to view the kape-* data view.

@philhagen
Owner

It's possible that files with the same filename will not be loaded properly, due to how filebeat tracks them. It's best to put them into subdirectories when running repeated loads (e.g., /logstash/kape/test_001/, etc.). However, I can't provide any meaningful assistance regarding the parsing without samples.
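The per-run subdirectory layout described above can be sketched as follows. The run names are placeholders, and a temporary directory stands in for /logstash/kape so the example is runnable anywhere; the point is that repeated loads with identical filenames land in distinct paths:

```shell
# Give each repeated load its own subdirectory so identically named
# files from different collections are seen as distinct paths.
BASE=$(mktemp -d)   # stands in for /logstash/kape in this illustration
for run in test_001 test_002; do
  mkdir -p "$BASE/$run"
  # Same filename in every run; the subdirectory keeps the paths unique.
  echo '{"run":"'"$run"'"}' > "$BASE/$run/results.json"
done
find "$BASE" -name '*.json'
```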

@antmar904
Author

Should I be making subdirectories named after the computer names?

For example:
/logstash/kape/computer1
/logstash/kape/computer2
/logstash/kape/computer3
/logstash/kape/computer4

@philhagen
Owner

The filebeat shipper will traverse an arbitrarily deep directory structure. It's up to you how that looks.

@antmar904
Author

I'm not using the filebeat agent. I am manually uploading the JSON output from KAPE to /logstash/kape. I ended up creating a new SOF-ELK VM.

@philhagen
Owner

SOF-ELK ships all locally-placed logs using filebeat.
