I've hit this with Docker (explained to me by @avsm): the default limit on shared memory is quite restrictive for the datasets we work with. I've been chasing some random SIGBUS errors in shark (the process exits with status 135, i.e. 128 + SIGBUS), and I believe this is the same root cause.
This ticket is to remind me to look at making this configurable, or at least to have `run` use a higher bound. For now I can work around it by disabling parallelism in yirgacheffe.
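For reference, a minimal sketch of the failure mode. Python's `multiprocessing.shared_memory` is backed by `/dev/shm` on Linux, which Docker mounts as a 64MB tmpfs by default; touching pages beyond that limit is what raises SIGBUS rather than an ordinary Python exception. (This is an illustrative snippet, not yirgacheffe code; the 16MB size here is small enough to succeed outside a constrained container.)

```python
from multiprocessing import shared_memory

# shared_memory segments live in /dev/shm on Linux. In a default
# Docker container that mount is capped at 64MB, so a geospatial
# raster of a few hundred MB cannot fit: writes past the tmpfs
# limit fault with SIGBUS (exit status 135 = 128 + 7).
shm = shared_memory.SharedMemory(create=True, size=16 * 1024 * 1024)
try:
    shm.buf[:4] = b"test"
    written = bytes(shm.buf[:4])
finally:
    shm.close()
    shm.unlink()
```

Running the same allocation with a size above the container's `/dev/shm` limit reproduces the signal-135 exit seen in shark.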
Yeah, I believe Docker is the same, but 64MB is too small in a geospatial context. I added `parallel_save`, which shares data via shared memory, and in this context I really think we should set the limit to a large percentage of actual memory. But if Obuilder is merged we can easily make that change for our use case.
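Sizing the limit as a fraction of physical RAM could look something like the sketch below. This is a hypothetical helper, not part of yirgacheffe or Obuilder; it assumes Linux, where `os.sysconf` exposes page size and page count, and the resulting byte count could be passed to Docker's `--shm-size` flag.

```python
import os

def shm_size_bytes(fraction: float = 0.75) -> int:
    """Suggest a shared-memory cap as a fraction of physical RAM.

    Illustrative only: reads total memory via sysconf (Linux),
    then scales it. The 0.75 default is an arbitrary example.
    """
    total = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    return int(total * fraction)

# e.g. docker run --shm-size={shm_size_bytes()}b ...
suggested = shm_size_bytes()
```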