
How to estimate memory usage with large data size #59

Open
ajdemery93 opened this issue Jul 15, 2023 · 0 comments

Hello,

I am trying to run corHMM on a large dataset (4,518 species) with four traits (3 binary, 1 categorical with ~10 states). I've run through the tutorial vignettes successfully, and corHMM also ran without problems on a subset of my data (119 species, the same four traits, with the categorical trait having 4 states). On the full dataset, however, running on a machine with 24 cores, I get a "bus error (core dumped)" message.

My question is: how can I estimate memory usage from my full data size, so that I know how many cores and how much time I need to run the analysis successfully? I appreciate your help, and I'm happy to answer further questions.
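
For reference, here is the back-of-envelope arithmetic I have been trying (a rough Python sketch; the assumptions that the combined state space is the product of the per-trait state counts times the number of hidden rate categories, and that one transition-probability matrix is cached per branch, are mine, not anything documented by corHMM):

```python
# Rough memory estimate for an Mk-style model on a bifurcating tree.
# Assumptions (mine, not corHMM's documented internals):
#   - combined state space = product of per-trait state counts x hidden rate categories
#   - an (n_nodes x n_states) table of conditional likelihoods
#   - one n_states x n_states transition-probability matrix cached per branch
#   - everything stored as 8-byte doubles
def estimate_memory_mb(trait_states, n_tips, rate_cats=1, bytes_per_double=8):
    n_states = rate_cats
    for s in trait_states:
        n_states *= s
    n_nodes = 2 * n_tips - 1                  # tips + internal nodes of a fully bifurcating tree
    n_branches = n_nodes - 1
    cond_liks = n_nodes * n_states            # conditional-likelihood table
    q_matrix = n_states ** 2                  # rate matrix Q
    p_matrices = n_branches * n_states ** 2   # P = expm(Q * t), one per branch
    total_doubles = cond_liks + q_matrix + p_matrices
    return total_doubles * bytes_per_double / 1024 ** 2

# My full dataset: 4518 tips, three binary traits, one ~10-state trait (80 combined states)
print(estimate_memory_mb([2, 2, 2, 10], n_tips=4518))               # ~447 MB
print(estimate_memory_mb([2, 2, 2, 10], n_tips=4518, rate_cats=2))  # ~1.7 GB with 2 hidden rate categories
```

Even if these numbers are in the right ballpark, they ignore copies made during optimization and any duplication across parallel workers, so I'd welcome corrections on what actually dominates.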
