MPI Friendly GC #178

Open
jkozdon opened this issue Dec 7, 2021 · 4 comments

@jkozdon
Member

jkozdon commented Dec 7, 2021

We should consider implementing what GridapPETSc.jl has done for GC with MPI objects.

Basically, the Julia finalizer registers the object for destruction with PetscObjectRegisterDestroy; see, for example, PETScLinearSolverNS.
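
To make the pattern concrete, here is a minimal, hypothetical sketch of such a finalizer. This is not PETSc.jl's or GridapPETSc.jl's actual code; the library path, wrapper type, and function name are assumptions.

# Assumed path to the PETSc shared library.
const libpetsc = "libpetsc"
const PetscErrorCode = Cint

# Hypothetical wrapper around a raw PETSc object handle.
mutable struct PetscVec
  ptr::Ptr{Cvoid}
end

function register_lazy_destroy!(v::PetscVec)
  # Registration is a process-local operation (PETSc appends the handle to
  # an internal array), so it is safe to call from a GC finalizer; the
  # collective destroy is deferred until PetscObjectRegisterDestroyAll runs.
  finalizer(v) do obj
    if obj.ptr != C_NULL
      ccall((:PetscObjectRegisterDestroy, libpetsc), PetscErrorCode,
            (Ptr{Cvoid},), obj.ptr)
    end
  end
  return v
end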

Of course, this means the object is not destroyed until PETSc is finalized. If the user wants to destroy things sooner, they can call the function gridap_petsc_gc:

# In an MPI environment context,
# this function has global collective semantics.
function gridap_petsc_gc()
  GC.gc()
  @check_error_code PETSC.PetscObjectRegisterDestroyAll()
end

Calling GC.gc() first ensures that all unreachable objects are registered via PetscObjectRegisterDestroy, and the call to PetscObjectRegisterDestroyAll then actually destroys them.
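
A hypothetical usage showing the two phases (create_petsc_vec is an assumed wrapper constructor, not a real API):

v = create_petsc_vec(comm)  # its finalizer will call PetscObjectRegisterDestroy
v = nothing                 # drop the last Julia reference
# Phase 1: GC.gc() runs the finalizer, registering the object with PETSc.
# Phase 2: PetscObjectRegisterDestroyAll() actually destroys it.
gridap_petsc_gc()           # collective: every rank must reach this call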

The only change I would make is to still allow manual destruction of objects, in case this is desired for performance reasons (though I don't know whether this is ever really needed); see the sketch below.
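
Manual destruction can coexist with the lazy finalizer if the destroy routine nulls the handle, making the finalizer a no-op. A hedged sketch, reusing the hypothetical PetscVec above (VecDestroy is the real PETSc routine; the rest is illustrative):

function destroy!(v::PetscVec)
  if v.ptr != C_NULL
    # VecDestroy is collective over the object's communicator, so only
    # call this at a point that every rank reaches.
    ccall((:VecDestroy, libpetsc), PetscErrorCode,
          (Ptr{Ptr{Cvoid}},), Ref(v.ptr))
    v.ptr = C_NULL  # the finalizer sees C_NULL and does nothing
  end
  return nothing
end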

h/t: @amartinhuertas in #146 (comment)

@amartinhuertas
Contributor

@jkozdon Note there is a caveat here with the use of PetscObjectRegisterDestroy. PETSc holds a global data structure of objects registered for lazy destruction, and it has a maximum capacity: 256 by default (although it can be increased via the corresponding CPP macro at configuration time). If you exceed that capacity, an error is produced (see gridap/GridapPETSc.jl#42 for more details). Our workaround is to inject calls to gridap_petsc_gc() at strategic points within GridapPETSc.jl, as sketched below. I know it is far from ideal, but it was the best idea that came to mind given these constraints.
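
For illustration, the "strategic points" pattern amounts to something like the following sketch (solve_many! and solve! are made-up names; the real injection points live inside GridapPETSc.jl):

function solve_many!(problems)
  for p in problems
    solve!(p)  # may allocate PETSc objects whose finalizers register them
    # Flush the lazy-destroy registry before it can hit its capacity
    # (256 by default). Safe because every rank executes the same loop.
    gridap_petsc_gc()
  end
end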

@jkozdon
Member Author

jkozdon commented Dec 7, 2021

Good to know. Thanks!

@ViralBShah
Member

@jkozdon If you are still interested in this package, I am happy to give you commit access (if you don't already have it).

@jkozdon
Member Author

jkozdon commented Oct 5, 2024

Hi @ViralBShah. I have abandoned this project. I left academia, and my collaborators ended up going a different way with their work anyway.

I have admin privileges, but I am happy to hand them over to others who want to maintain and use this. I think that @boriskaus and I are technically the owners right now.
