
Data Reducer modifier #9

Closed
markhalonen opened this issue Aug 12, 2018 · 5 comments


markhalonen commented Aug 12, 2018

We need a data reduction modifier that will prevent large datasets from slowing down the program.

The modifier can be created in 2 places, similar to the lowpass filter:

  1. In the Trim Data window in the Processor
    • The modifier should be applied before
  2. In the Viewer, next to the global filter

The modifier can be specified in 2 ways (a stride computation is sketched after this list):

  • Number of points to keep
  • Data collection frequency
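
Either way, the setting boils down to a stride N meaning "keep every Nth point". Here is a minimal sketch of that normalization, assuming the source frequency is known for the frequency case; the `DataReducerSpec` name and both methods are illustrative, not from the codebase:

```java
// Hypothetical helper: both user-facing specifications reduce to one stride.
final class DataReducerSpec {

    /** Stride when the user asks to keep a target number of points. */
    static int strideForPointCount(int totalPoints, int pointsToKeep) {
        if (pointsToKeep <= 0) throw new IllegalArgumentException("pointsToKeep must be > 0");
        return Math.max(1, totalPoints / pointsToKeep);
    }

    /** Stride when the user asks for a target collection frequency (Hz). */
    static int strideForFrequency(double sourceHz, double targetHz) {
        if (targetHz <= 0) throw new IllegalArgumentException("targetHz must be > 0");
        return Math.max(1, (int) Math.round(sourceHz / targetHz));
    }
}
```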

My intuition is that this actually needs to be applied while reading the file. Pretend we have a 20GB text file. Can we handle that? The only way to really handle it is to only keep a fraction of the file in memory. If I understand correctly, this is not the current plan. The current plan is to keep everything in memory, but only run our algorithms (filters, etc.) on a reduced dataset, as that's where the slowdown is occurring.
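
If we do go the read-time route, here is a minimal sketch of a decimating reader, assuming one numeric data point per line of the text file (the file layout and all names here are assumptions, not our actual reader):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Keeps only every Nth line while streaming, so a 20GB file
// never has to be held in memory in full.
final class ReducingReader {
    static List<Double> readDecimated(Path file, int stride) throws IOException {
        List<Double> kept = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            long index = 0;
            while ((line = reader.readLine()) != null) {
                if (index % stride == 0) { // keep every Nth point
                    kept.add(Double.parseDouble(line.trim()));
                }
                index++;
            }
        }
        return kept;
    }
}
```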

markhalonen commented:

Loaded up the example workspace and confirmed the viewer is unusable with 8 of the samples loaded. The app is written in a single-threaded way: every operation runs on the UI thread, so while the program is reading a file from disk, for example, the UI becomes unresponsive.
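
The conventional fix, assuming the UI is Swing (for JavaFX, javafx.concurrent.Task plays the same role), is to push the read onto a background thread and only touch the UI once it finishes. A sketch reusing the readDecimated idea from above; onFileLoaded and the stride of 16 are hypothetical:

```java
import java.nio.file.Path;
import java.util.List;
import javax.swing.SwingWorker;

final class LoadFileWorker extends SwingWorker<List<Double>, Void> {
    private final Path file;

    LoadFileWorker(Path file) { this.file = file; }

    @Override
    protected List<Double> doInBackground() throws Exception {
        // Runs on a worker thread; the UI stays responsive during the read.
        return ReducingReader.readDecimated(file, 16);
    }

    @Override
    protected void done() {
        // Runs back on the Event Dispatch Thread; safe to update the UI here.
        try {
            onFileLoaded(get());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void onFileLoaded(List<Double> points) {
        // Hypothetical hook: hand the reduced data to the viewer/charts.
    }
}
```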

markhalonen added a commit that referenced this issue Aug 14, 2018

markhalonen commented Aug 14, 2018

More specifically, I'm seeing that renderCharts() is the #1 culprit, followed by renderSampleResults():

Task
id=2
name=renderSampleResults
duration (s)=5.9
start=1534216451690
end=1534216457547

...

Task
id=10
name=renderCharts
duration (s)=18.1
start=1534216561634
end=1534216579750

The UI is frozen for 18 seconds ❗️
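
For context, instrumentation of roughly this shape would produce the log above; the TaskTimer name is illustrative, not necessarily what the app actually uses:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Times a named task and prints it in the format shown above.
final class TaskTimer {
    private static final AtomicInteger NEXT_ID = new AtomicInteger(1);

    static void timed(String name, Runnable task) {
        int id = NEXT_ID.getAndIncrement();
        long start = System.currentTimeMillis();
        task.run();
        long end = System.currentTimeMillis();
        System.out.printf("Task%nid=%d%nname=%s%nduration (s)=%.1f%nstart=%d%nend=%d%n",
                id, name, (end - start) / 1000.0, start, end);
    }
}

// Usage: TaskTimer.timed("renderCharts", () -> renderCharts());
```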

markhalonen commented:

This issue would be more aptly named "Performance issues with multiple large samples". A data reduction modifier is one possible solution, but it might not be the best one.

There has been almost zero effort put into performance so far. Every operation just calls a broad "render" function that recalculates everything from scratch, whether it's needed or not. For example, it calculates "Front Face Force" even when it is not needed.
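
One cheap alternative to the broad render is dirty-flag caching, so a derived quantity like "Front Face Force" is only recomputed when one of its inputs changes. A minimal sketch (all names illustrative):

```java
import java.util.function.Supplier;

// Caches a computed value and recomputes it only after invalidate() is called.
final class Cached<T> {
    private final Supplier<T> compute;
    private T value;
    private boolean dirty = true;

    Cached(Supplier<T> compute) { this.compute = compute; }

    /** Call when an input changes (e.g., trim bounds or filter settings). */
    void invalidate() { dirty = true; }

    /** Returns the cached value, recomputing only if an input changed. */
    T get() {
        if (dirty) {
            value = compute.get();
            dirty = false;
        }
        return value;
    }
}
```

With this, the render pass asks for only the values a view actually displays, and untouched quantities are never recalculated.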

markhalonen commented:

These samples have about 15,000 data points in the trimmed data; the whole data file has 160,000.

So if we reduced to 1,000 data points, we would expect roughly a 15x speedup (15,000 / 1,000 = 15), assuming render time scales linearly with point count.

markhalonen commented:

I'm using the VisualVM performance monitor and seeing this:

[VisualVM screenshot]

markhalonen added commits that referenced this issue on Aug 15 (×4), Aug 18 (×2), Aug 21, and Aug 29 (×2), 2018