I started an issue the other day about a memory leak in TerriaJS. Loaded data does not appear to be disposed of properly, to the point where the browser eventually runs out of memory and crashes if the page is not reloaded often enough. In that issue I point to a specific data set that accumulates 3-3.5 GB, usually resulting in an almost immediate crash. I should mention that we set isEnabled = false on CatalogItems to delete them. Neither that, nor forcibly clearing lists such as dataSources, gave any results.
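For concreteness, the cleanup we force looks roughly like the sketch below. It is a minimal illustration, not the exact code: `item.dataSource` is a hypothetical accessor for whatever Cesium data source the item owns, and I'm assuming `terria.dataSources` is a Cesium `DataSourceCollection`; `DataSourceCollection.remove(dataSource, destroy)` and `EntityCollection.removeAll()` are documented Cesium calls.

```ts
// Minimal sketch of the cleanup we force when deleting a catalog item.
// `item.dataSource` is a hypothetical accessor; the rest is standard Cesium.
function forceDispose(terria: any, item: any): void {
  // Disabling the item should trigger Terria's own teardown.
  item.isEnabled = false;

  // Belt and braces: drop every entity, then remove the data source
  // with destroy=true so Cesium releases its internal resources.
  const dataSource = item.dataSource; // hypothetical accessor
  if (dataSource) {
    dataSource.entities.removeAll();
    terria.dataSources.remove(dataSource, /* destroy */ true);
  }
}
```

Even with this, the heap keeps growing.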
However, I tested the "buses" data set under Live Data in the NSW Digital Twin project. That data set accumulates memory up to around 1.5 GB, after which it is disposed of and drops back down to about 400 MB. My question is: how is Terria able to clean up this data set while CSV data in the same release shows a memory leak?
I'd appreciate any input on how this live data works or how it disposes of its entities, and any insight into the memory problem as a whole.
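My working theory is that the live feed clears its previous batch of entities on every refresh, which would explain the sawtooth memory profile. Below is a hypothetical sketch of that pattern, not TerriaJS's actual implementation: the URL, payload shape, and 10-second interval are made up, while `CustomDataSource`, `EntityCollection.removeAll`, and `Cartesian3.fromDegrees` are standard Cesium APIs.

```ts
import CustomDataSource from "terriajs-cesium/Source/DataSources/CustomDataSource";
import Cartesian3 from "terriajs-cesium/Source/Core/Cartesian3";

// Hypothetical polling loop: each refresh clears the previous batch of
// entities before adding the new one, so memory is capped at one batch
// (plus whatever garbage the GC has not yet collected).
const liveBuses = new CustomDataSource("buses");

async function refresh(): Promise<void> {
  const response = await fetch("https://example.com/buses.json"); // placeholder URL
  const buses: Array<{ id: string; lon: number; lat: number }> = await response.json();

  liveBuses.entities.removeAll(); // previous batch becomes garbage here
  for (const bus of buses) {
    liveBuses.entities.add({
      id: bus.id,
      position: Cartesian3.fromDegrees(bus.lon, bus.lat),
    });
  }
}

setInterval(refresh, 10000); // poll every 10 s
```

If the CSV path holds on to references to old entities instead of clearing them like this, that would match the leak we see.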