Use NDJSON for consistency with natively exported log data #110
Hi, there are some known differences in field names depending on the export method (GUI vs. Graph, beta vs. non-beta cmdlets, etc.). This is something Microsoft controls: we simply retrieve the logs and leave them in their default format, without modifying the field names. Attempting to specify and map all the fields would quickly become complex and messy, especially since we'd need to constantly stay up to date on any new fields Microsoft adds. Regarding the sign-in logs, we offer two methods for acquiring them: one uses Graph, and the other uses the Azure AD PowerShell module, as sketched below.
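For illustration, a minimal sketch of the two acquisition paths using the generic Microsoft cmdlets; these are illustrative assumptions, not necessarily the exact commands this suite wraps:

```powershell
# Illustrative sketch only; not the suite's own commands.

# Graph method: the raw API response uses camelCase field names, e.g. "createdDateTime".
Connect-MgGraph -Scopes "AuditLog.Read.All","Directory.Read.All"
Invoke-MgGraphRequest -Method GET `
    -Uri 'https://graph.microsoft.com/v1.0/auditLogs/signIns?$top=1' `
    -OutputType Json | Out-File graph-sample.json

# Azure AD PowerShell method (AzureADPreview): properties use an uppercase
# initial letter, e.g. "CreatedDateTime".
Connect-AzureAD
Get-AzureADAuditSignInLogs -Top 1 | ConvertTo-Json -Depth 10 | Out-File azuread-sample.json
```

Comparing the two output files makes the casing difference described below visible.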
When comparing the output of the two methods, Graph uses a lowercase initial letter for field names (e.g., createdDateTime), whereas the Azure AD module capitalizes them (e.g., CreatedDateTime). In both cases we do not modify the fields, so producing consistent output between the two would require altering the default output provided by Microsoft.

Regarding the NDJSON format, we need to discuss internally how we want to handle this. We must evaluate the potential impact of changing the output from JSON to NDJSON on the tools built around our output (for example, https://github.com/evild3ad/Microsoft-Analyzer-Suite) and on the workflows of everyone who loads the data. We have never experienced issues ourselves, nor have we heard from others that tools like Splunk, ELK, or Data Explorer have difficulty loading the default JSON output. However, I'm not too familiar with SOF-ELK and how it processes its data, so I will need to look into this further. We're currently working on a "big" update, and one of the new features will be …
Accepted a great pull request from @cirosec, adding an output option for Sof-elk to the Get-UAL scripts. I'll push it to the PowerShell Gallery with the next update. If you want to use it now, you can clone the GitHub repository.
Thank you @JoeyInvictus for quickly testing and merging my pull request. With the version currently on the main branch, it is now possible to obtain UAL logs in a format that sof-elk can parse by selecting the new output option.
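A hypothetical invocation (the cmdlet and parameter names are assumptions drawn from this discussion, not confirmed against the module's documentation):

```powershell
# Hypothetical invocation; cmdlet and parameter names are assumptions based on
# this thread, not confirmed against the module's documentation.
Connect-M365
Get-UALAll -Output SOF-ELK -OutputDir .\ual-sofelk
```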
I am currently working on implementing a similar fix for the … and hope to publish the required pull requests in the coming days.
I'd suggest changing the output name from Sof-elk to SOF-ELK. Could you provide a sample of this output (private via email is fine) so I can test it as well?
Hey @philhagen, I agree, it makes sense to rename it to SOF-ELK. The pull request (#115), which just got merged, should now also allow importing the logs from the … . I hope that I can supply you with some test data next week.
@0xffr Yes, I've already replaced everything with SOF-ELK for the next update. I'll add the same logic to the Entra Audit Logs via Graph as well. Not sure if there's a parser ready in SOF-ELK, but at least the data will be ready when there is. Working on V3.0.0; once it's live, I will push all changes to the PowerShell Gallery as well.
Hi, we just published the new update, so SOF-ELK should be supported from our side now via the new SOF-ELK output option.
I don't have permission to re-open this issue, but I would like to request that it be reopened, because two of the formats are not parsed. They are not consistent with the native Azure tool/console/API output that is used in the FOR509 course data (which I'm using as a reference point).
(This issue is a dependency for philhagen/sof-elk#274)
The current JSON output does not match the format that is natively exported from Azure. The native output is in NDJSON form: one JSON object per line, rather than a single JSON array.
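For illustration, a minimal sketch (not part of this suite; filenames are placeholders) of converting a JSON-array export into NDJSON:

```powershell
# Minimal sketch: convert a JSON-array export into NDJSON by emitting one
# compact JSON object per line. Filenames are placeholders.
$records = Get-Content .\ual-export.json -Raw | ConvertFrom-Json
$records |
    ForEach-Object { $_ | ConvertTo-Json -Depth 20 -Compress } |
    Set-Content .\ual-export.ndjson
```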
The field names should also match the case of those in the native output (initial letter lowercased, with camelCase in the remainder of the field name). There appear to be other differences between the output of this suite of tools and the natively exported format - for example, there is no "category" set on the SignInLogs sample I received. However, as long as the field names are consistent between both export processes, everything should be fine.
Output in this format will allow handling the UAL data in the same manner as natively exported logs. While this is a specifically noted blocker for SOF-ELK, it would benefit any tooling that parses UAL data produced by this suite.