Newer versions choke when connected to large amounts of live logs #619

Open
linker-err0r opened this issue Jan 23, 2025 · 5 comments

@linker-err0r

linker-err0r commented Jan 23, 2025

Description: Our project heavily uses DLT logs throughout and usually emits around 100k to 250k logs per second. However, the currently released versions of the app have turned out to be non-performant (at least on macOS). Developers newer to the project, like myself, have to check out and build an older commit hash (ab646701fc0cfe0d96ed68cc9e05df4121c65e74) to find a version that does not pinwheel continuously while listening to live logs. This makes the App Store and other versions of the app packaged on GitHub unusable for live logging, forcing us to rely on unsigned self-built binaries. Static log analysis works fine on these versions -- the issue only occurs when connected to a live ECU over TCP.

Prerequisites:

  • A logging source or device that produces upwards of 250k logs per second
  • A macOS machine running any version of the app newer than ab646701fc0cfe0d96ed68cc9e05df4121c65e74

Reproduction Steps:

  1. Connect to ECU source over TCP using default settings.
  2. Observe pinwheeling/choking behavior.

System Information:
MacBook Pro 16-inch Nov 2023
Chip: Apple M3 Max
Memory: 36 GB
macOS: Sequoia 15.2

@vifactor
Collaborator

TBH, so far I do not understand what is special about the next commit after ab646701fc0cfe0d96ed68cc9e05df4121c65e74. Are you sure that checking out just one or two commits after it already leads to the mentioned performance issues?

P.S. I have heard similar complaints for Windows, but I do not know if those are related.

@vifactor vifactor added the macOS label Jan 23, 2025
@alexmucde
Collaborator

Yes, a similar issue was also reported to me for Windows, but only with the latest version 2.27.0. I have no reports for 2.26.0 from mid-2024.

We need to find out exactly which commit causes the issue.
ab64670 seems very old; it dates from the commits of Jan 19, 2024.

@linker-err0r It would be good if you could help find the exact commit here; I know it is a lot of work.

Here I made a big change that could possibly cause the issue:

Commits on Nov 21, 2024
5e98bc6
Remove extra thread for filter index to improve performance.

Perhaps you can start by testing the version just before it:

Commits on Nov 20, 2024
e7cd437
[Merge pull request]
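
In case it helps, here is a minimal sketch of checking out and test-building that commit locally. The CMake commands are an assumption on my part; use whatever build steps you normally follow for this project:

git fetch --all
git checkout e7cd437
# Out-of-source build; the cmake invocation below is an assumed setup
mkdir -p build && cd build
cmake ..
cmake --build . -j

Then run the resulting binary against the live ECU and see whether the pinwheeling appears.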

@linker-err0r
Author

linker-err0r commented Jan 24, 2025 via email

@0penSrc

0penSrc commented Jan 30, 2025

Hi there,

I just wanted to provide a quick update on this thread.

I encountered some of the same issues that Linker reported earlier. To troubleshoot, I tried e7cd437, which is just before the major changes were made, but unfortunately the issue with the software failing to keep up with incoming logs persisted.

Furthermore, since I built locally, a terminal launched alongside the app to give me live status updates on what it was doing. The only thing I noticed was that the "CI:" line often showed a percentage under 100%. For example:

CI: 100 %
CI: 46 %
CI: 100 %
CI: 31 %

Afterward, I reverted to the hash Linker provided, and that resolved the issue.

Hoping this gets resolved at some point; let me know if I can provide any more context.

@vifactor
Collaborator

Hi @0penSrc, thanks for the info. Unfortunately, the range between the "good" commit (ab64670) and your "bad" commit (e7cd437) is huge. Moreover, I think we have some confidence that things went bad somewhere between ab64670 and v2.26 (aka 8a8305c).

The most straightforward way to find which commit introduced the regression is to use git bisect. I would do it myself, but unfortunately I have not yet had a chance to reproduce the issue. If you can give it a try, that would be super helpful; otherwise it would already help to know whether the issue exists for you in v2.26.
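
A minimal sketch of that bisect run, using the commits mentioned above (treating 8a8305c as the suspected-bad endpoint, which is itself an assumption; the build-and-test step at each iteration is whatever you use locally):

git bisect start
git bisect bad 8a8305c      # v2.26, suspected bad
git bisect good ab64670     # known-good commit from the original report
# git now checks out a midpoint commit; build it, connect to the live ECU,
# and judge whether it keeps up, then mark the commit:
git bisect good             # if there is no pinwheeling at this commit
git bisect bad              # if it pinwheels/chokes at this commit
# Repeat until git names the first bad commit, then clean up:
git bisect reset

Each round halves the remaining commit range, so even a large range needs only a handful of builds.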
