
Memory leak in PETSc Hypre PILUT preconditioner #2966

Open
mikekryjak opened this issue Aug 26, 2024 · 2 comments

Comments

@mikekryjak
Contributor

I'm running Hermes-3 in 2D and trying to optimise PETSc preconditioners, focusing on the ILU methods Euclid and PILUT. Setting aside the fact that both are now deprecated in Hypre, I've found that PILUT has a significant memory leak: memory usage grows steadily with the number of solver steps. A 10-hour simulation grew from 1 GB to 25 GB. I have tested Euclid and CVODE on the same case, and neither grows its memory usage.
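For what it's worth, the growth is easy to track from outside the solver. A minimal sketch (my own helper, not part of Hermes-3 or PETSc; assumes Linux `/proc` and that you know the solver's PID) that samples resident set size over a run:

```python
import time

def rss_kib(pid):
    """Read VmRSS (resident set size, in KiB) for a process from /proc."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                # Line looks like "VmRSS:    123456 kB"
                return int(line.split()[1])
    raise RuntimeError(f"VmRSS not found for pid {pid}")

def monitor(pid, interval_s=60.0, samples=10):
    """Collect (elapsed_seconds, rss_kib) pairs for later plotting."""
    t0 = time.monotonic()
    out = []
    for _ in range(samples):
        out.append((time.monotonic() - t0, rss_kib(pid)))
        time.sleep(interval_s)
    return out
```

Plotting the samples against solver output makes it easy to see whether memory rises per step or per Jacobian recalculation.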

Here is my input deck:

[solver]
type = beuler # Backward Euler steady-state solver
snes_type = newtonls # Nonlinear solver
ksp_type = gmres # Linear solver
max_nonlinear_iterations = 10
pc_type = hypre # Preconditioner type
pc_hypre_type = pilut # Hypre preconditioner type
lag_jacobian = 500 # Iterations between jacobian recalculations
atol = 1e-12 # Absolute tolerance
rtol = 1e-8 # Relative tolerance

[petsc]
pc_hypre_pilut_tol = 1e-7 # Default: 1e-4
options_view # All picked up options
options_left # Unused/unrecognised options
log_view # Performance info
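In case it helps isolate where the memory goes, PETSc also has built-in memory diagnostics that could be added to the same [petsc] section. The option names below are from PETSc's documentation (whether BOUT++ forwards them unchanged is an assumption on my part); note that malloc_dump only tracks PetscMalloc allocations, so a leak inside Hypre's own allocator would likely only show up in memory_view, which reports OS-level usage:

```
[petsc]
memory_view   # Report process memory usage at the end of the run
malloc_dump   # List PETSc allocations still unfreed at PetscFinalize
```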

@johnomotani
Contributor

Does the same thing happen with the logging disabled (i.e. remove log_view option)? Just trying to rule out one possible cause.

@mikekryjak
Contributor Author

@johnomotani Yes, the leak still occurs with log_view removed; I just ran that test to make sure.
