Data Reducer modifier #9
Loaded up the example workspace and confirmed the viewer is unusable with 8 of the samples loaded. The app is written in a single-threaded way: any operation takes over the UI thread, so, for example, while the program is reading a file from disk, the UI becomes unavailable.
More specifically, I'm seeing that the UI is frozen for 18 seconds ❗️
This issue would be more aptly named "Performance issues with multiple large samples". One solution may be a data reduction modifier, though it might not be the best one. There is currently almost zero effort at performance: every operation triggers a broad "render" function that re-calculates everything from scratch, whether it's needed or not. For example, it calculates "Front Face Force" even when it is not needed.
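As a separate avenue from data reduction, the broad-"render" problem could be addressed with per-stage caching, so only stages whose inputs changed get recomputed. A minimal sketch (all names like `Pipeline` and `front_face_force` are illustrative, not from the actual codebase):

```python
class Pipeline:
    """Hypothetical dirty-flag cache: recompute a stage only when
    its inputs changed, instead of re-running everything on render."""

    def __init__(self):
        self._cache = {}     # stage name -> cached result
        self._dirty = set()  # stages whose inputs changed

    def invalidate(self, stage):
        """Mark a stage stale, e.g. after the user edits a filter."""
        self._dirty.add(stage)
        self._cache.pop(stage, None)

    def get(self, stage, compute):
        """Return the cached result, recomputing only if needed."""
        if stage in self._dirty or stage not in self._cache:
            self._cache[stage] = compute()
            self._dirty.discard(stage)
        return self._cache[stage]


pipe = Pipeline()
calls = []

# First access computes; second access hits the cache.
pipe.get("front_face_force", lambda: calls.append(1) or 42)
pipe.get("front_face_force", lambda: calls.append(1) or 42)

# After invalidation, the stage is recomputed once.
pipe.invalidate("front_face_force")
pipe.get("front_face_force", lambda: calls.append(1) or 42)
```

With something like this, "Front Face Force" would only ever be computed when a view actually asks for it and its inputs have changed.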
These samples have about 15,000 data points in the trimmed data; the whole data file is 160,000. So if we reduced to 1,000 data points, we would expect a 15x speedup.
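The reduction itself could be as simple as keeping evenly spaced samples. A sketch (function name and parameters are illustrative; a real modifier might instead average each bin, or keep per-bin min/max so peaks in the force data are not dropped):

```python
def reduce_points(data, target=1000):
    """Return about `target` evenly spaced samples from `data`.

    Minimal decimation sketch: pick indices spaced evenly across
    the full range, always keeping the first and last points.
    """
    n = len(data)
    if n <= target:
        return list(data)
    step = (n - 1) / (target - 1)
    return [data[round(i * step)] for i in range(target)]


# Stand-in for the ~15,000-point trimmed dataset mentioned above.
trimmed = list(range(15000))
reduced = reduce_points(trimmed, 1000)
```

Downstream filters would then operate on 1,000 points instead of 15,000, which is where the expected 15x comes from.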
We need a data reduction modifier that will prevent large datasets from slowing down the program.
The modifier can be created in 2 places, similar to the lowpass filter:
The modifier can be specified in 2 ways:
My intuition is that this actually needs to be applied while reading the file. Suppose we have a 20 GB text file: can we handle that? The only way to really handle it is to keep only a fraction of the file in memory. If I understand correctly, that is not the current plan. The current plan is to keep everything in memory but run our algorithms (filters, etc.) only on a reduced dataset, since that is where the slowdown occurs.
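Applying the reduction at read time could look like streaming the file line by line and keeping every nth sample, so the full file never has to fit in memory. A sketch under those assumptions (the helper name and parameters are hypothetical):

```python
import os
import tempfile


def read_every_nth(path, n):
    """Stream a text file line by line, keeping every nth line.

    Only one line is held at a time while reading, so even a very
    large file is never fully loaded; memory use scales with the
    kept fraction (1/n), not the file size.
    """
    kept = []
    with open(path) as f:
        for i, line in enumerate(f):
            if i % n == 0:
                kept.append(line.rstrip("\n"))
    return kept


# Demo on a small synthetic file standing in for a huge data file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("\n".join(str(i) for i in range(100)))
    tmp_path = tmp.name

sampled = read_every_nth(tmp_path, 10)
os.unlink(tmp_path)
```

A fixed stride is the simplest policy; if the target count rather than the stride is known up front, reservoir sampling would be the usual alternative for a single pass over a file of unknown length.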