MPI Friendly GC #178
Comments
@jkozdon Note there is a caveat here with the use of
Good to know. Thanks!
@jkozdon If you are still interested in this package, I am happy to give you commit access (if you don't already have it).
Hi @ViralBShah. I have abandoned this project. I left academia, and my collaborators ended up going a different way with their work anyway. I have admin privileges, but I am happy to hand them over to others who want to maintain and use this. I think that @boriskaus and I are technically the owners right now.
We should consider implementing what GridapPETSc.jl has done for GC with MPI objects. Basically, the Julia finalizer registers the object for destruction with `PetscObjectRegisterDestroy`; see for example `PETScLinearSolverNS`.

Of course, this means the object is not destroyed until PETSc is finalized. If the user wants to destroy things sooner, they can call a function `gridap_petsc_gc`: by first calling `GC.gc()`, all dead objects are properly registered via `PetscObjectRegisterDestroy`, and the subsequent call to `PetscObjectRegisterDestroyAll` actually destroys them.

The only change I would make is to still allow manual destruction of objects if this is desired for performance reasons (though I don't know if this is ever really needed).
h/t: @amartinhuertas in #146 (comment)
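A minimal sketch of the pattern described above. This is not the actual GridapPETSc.jl code: the wrapper type, `libpetsc` handle, and helper names (`register_for_destroy`, `petsc_gc`) are illustrative assumptions; only `PetscObjectRegisterDestroy` and `PetscObjectRegisterDestroyAll` are real PETSc C routines.

```julia
# Sketch of MPI-friendly GC via deferred destruction (assumed names).
# Requires a `libpetsc` library handle in scope, e.g. from Libdl.

mutable struct PetscVec
    ptr::Ptr{Cvoid}   # raw PETSc object handle
end

function register_for_destroy(obj::PetscVec)
    # A finalizer must NOT call the collective VecDestroy here: Julia's GC
    # runs at different times on different MPI ranks, so a collective call
    # inside a finalizer can deadlock. Instead, hand the object back to
    # PETSc for deferred destruction at PetscFinalize (or on demand below).
    if obj.ptr != C_NULL
        @ccall libpetsc.PetscObjectRegisterDestroy(obj.ptr::Ptr{Cvoid})::Cint
        obj.ptr = C_NULL
    end
    return nothing
end

function wrap_vec(ptr::Ptr{Cvoid})
    v = PetscVec(ptr)
    finalizer(register_for_destroy, v)  # safe: no collective MPI calls
    return v
end

# User-callable eager cleanup, analogous to gridap_petsc_gc():
function petsc_gc()
    GC.gc()  # run finalizers so every dead object gets registered first
    # Collective over MPI: every rank must call this together.
    @ccall libpetsc.PetscObjectRegisterDestroyAll()::Cint
    return nothing
end
```

The key design point is that the finalizer itself does no MPI communication; all collective destruction happens either at `PetscFinalize` or inside an explicitly collective call like `petsc_gc()` that all ranks invoke together.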