Labels
bug (Something isn't working)
Description
Current Behaviour
When trying to fit toys, I need to:
- Resample the data
- Recreate the likelihood with updated constraints
When I do that, the memory seems not to be released. Specifically, the likelihood recreation seems to be causing a memory leak:
```python
import tqdm

import zfit


def fun():
    obs = zfit.Space('x', limits=(0, 10))
    mu = zfit.Parameter('mu', 5, 0, 10)
    sg = zfit.Parameter('sg', 1, 0, 3)
    gauss = zfit.pdf.Gauss(obs=obs, mu=mu, sigma=sg)
    sampler = gauss.create_sampler(n=1000)
    nll = zfit.loss.UnbinnedNLL(model=gauss, data=sampler)
    minimizer = zfit.minimize.Minuit()
    for run_number in tqdm.trange(100, ascii=' -'):
        sampler.resample()
        nll_new = nll.create_new()
        minimizer.minimize(nll_new)


fun()
```

Expected Behaviour
`nll_new` goes out of scope after each iteration, which should make it eligible for garbage collection, but the memory does not appear to be freed.
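One way to check whether the loss objects are really being collected is to hold a `weakref` to each one: if the weak reference still resolves after the strong reference is dropped and `gc.collect()` has run, something is keeping the object alive. This is a minimal stdlib-only sketch; the `Loss` class is a hypothetical stand-in for the object returned by `nll.create_new()`.

```python
import gc
import weakref


class Loss:
    """Hypothetical stand-in for the zfit loss returned by nll.create_new()."""
    pass


ref = None
for _ in range(3):
    loss = Loss()            # stands in for nll_new = nll.create_new()
    ref = weakref.ref(loss)  # weak reference does not keep the object alive
    # ... the fit would happen here ...
    del loss                 # rebinding on the next iteration has the same effect
    gc.collect()             # force a collection pass
    # In a leak-free scenario the weak reference is now dead:
    assert ref() is None
```

Applied to the real reproducer (`ref = weakref.ref(nll_new)` inside the loop), a `ref()` that stays non-`None` after the iteration ends would confirm that some internal structure still holds a reference to the old loss.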
Context (Environment)
- zfit version: 0.27.1
- Python version: 3.12.11
- Are you using conda, pipenv, etc? : micromamba
- Operating System: almalinux9
- Tensorflow version: 2.19.1
Possible Solution/Implementation
I might be able to reuse the likelihood by parametrizing the constraints; I am still checking. I also tried running each fit in a new subprocess, but the likelihood does not seem to be picklable.
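If the leak cannot be fixed in place, one pattern that sidesteps both the leak and the pickling problem is to build the model, sampler, and loss entirely inside a fresh interpreter per toy, so only plain data (a seed in, fit results out) crosses the process boundary and all memory is returned to the OS when the child exits. A stdlib-only sketch, where the child script is a hypothetical stand-in for the real zfit sample-and-fit step:

```python
import json
import subprocess
import sys

# Hypothetical stand-in for one toy fit: in the real case this script would
# import zfit, build the model, resample, minimise, and print the results.
CHILD = """
import json, random, sys
seed = int(sys.argv[1])
random.seed(seed)
data = [random.gauss(5.0, 1.0) for _ in range(1000)]  # stands in for sampler.resample()
mu_hat = sum(data) / len(data)                        # stands in for minimizer.minimize(...)
print(json.dumps({"seed": seed, "mu_hat": mu_hat}))
"""


def run_toy(seed):
    # A fresh interpreter per toy: nothing zfit-related is ever pickled,
    # only the seed (argv) goes in and a small JSON record comes out.
    out = subprocess.run(
        [sys.executable, "-c", CHILD, str(seed)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)


results = [run_toy(seed) for seed in range(3)]
```

Spawning an interpreter per toy adds startup overhead (including the TensorFlow import in the real case), so batching several toys per child process may be worthwhile.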