Can no longer deepcopy LogNorm objects on master #18119
Comments
So norms now derive from scales (#16457), and scales have probably never been deep-copyable:

```python
import matplotlib.scale as mscale
import copy

sc = mscale.LogTransform(10)
sc2 = copy.deepcopy(sc)
```
Previously norms did not use the transform stack, but now that they are based on scales, they do. I guess a "fix" is to write a custom `__deepcopy__`.
The specific use case here is passing a norm to a plot method that autoscales vmin/vmax in place, while taking a copy to avoid modifying the vmin/vmax on the original norm object.
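The copy-then-autoscale pattern described above can be sketched with a toy stand-in (a minimal sketch; `Norm` and `autoscale` here are illustrative, not matplotlib's actual API):

```python
import copy

class Norm:
    """Toy stand-in for a norm holding vmin/vmax (hypothetical class,
    only meant to show the copy-then-autoscale pattern)."""
    def __init__(self, vmin=None, vmax=None):
        self.vmin = vmin
        self.vmax = vmax

    def autoscale(self, data):
        # Mutates vmin/vmax in place, like autoscaling inside a plot method.
        self.vmin = min(data)
        self.vmax = max(data)

original = Norm()
scaled = copy.deepcopy(original)  # copy first so the original stays untouched
scaled.autoscale([2, 7, 5])
print(original.vmin, scaled.vmin, scaled.vmax)  # None 2 7
```

If `deepcopy` fails on the norm, as in this issue, there is no way to autoscale a per-plot copy without mutating the caller's object.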
Implementing a custom `__deepcopy__`, e.g.:

```python
def __deepcopy__(self, memo):
    # Must build and return a new instance; calling self.__init__
    # directly would return None.
    return type(self)(self._axis_name, base=self.base, subs=self.subs,
                      nonpositive=self._nonpositive)
```
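A self-contained sketch of that constructor-based `__deepcopy__` pattern (assuming a hypothetical `ToyScale` class; the attribute names mirror the snippet above but are not matplotlib's actual internals):

```python
import copy

class ToyScale:
    """Hypothetical minimal scale-like class used only to demonstrate
    a constructor-based __deepcopy__ hook."""
    def __init__(self, axis_name, base=10, nonpositive='clip'):
        self._axis_name = axis_name
        self.base = base
        self._nonpositive = nonpositive

    def __deepcopy__(self, memo):
        # __init__ returns None, so the hook must construct and return
        # a fresh instance with the same constructor arguments.
        return type(self)(self._axis_name, base=self.base,
                          nonpositive=self._nonpositive)

sc = ToyScale('x', base=2)
sc2 = copy.deepcopy(sc)
print(sc2 is sc, sc2.base)  # False 2
```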
Ugh, but the same song and dance would have to be done for the …
Does one need to implement a custom `__deepcopy__` for every scale?
Yeah, I don't know why the copy restriction is there.
Looks like it's been around since forever (2007), with no explanation as to why the restriction on copying is there. Maybe I'll remove it and see what happens to the CI...
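The long-standing restriction amounts to something like this (a paraphrased sketch, not the verbatim matplotlib source; the class name and message wording are approximations):

```python
import copy

class TransformNode:
    """Sketch of the 2007-era guard: copying is simply forbidden."""
    def __copy__(self, *args):
        # *args lets the same method double as __deepcopy__(memo).
        raise NotImplementedError(
            "TransformNode instances can not be copied; "
            "consider using frozen() instead.")
    __deepcopy__ = __copy__

try:
    copy.deepcopy(TransformNode())
except NotImplementedError as exc:
    print(exc)
```

Any object reachable from a norm that inherits this guard makes `copy.deepcopy(norm)` blow up, which is exactly the regression reported here.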
I doubt it will universally work - the …
The issue with transforms is that they are a branching, partially-cached call graph. A way to implement deepcopy using things we do test is to go through `frozen()`.
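The `frozen()`-based approach can be sketched on a toy graph (a minimal sketch; `Node` is a hypothetical stand-in for a transform tree node, not matplotlib's `Transform`):

```python
import copy

class Node:
    """Toy node in a branching call graph (illustrative only)."""
    def __init__(self, value, children=()):
        self.value = value
        self.children = list(children)

    def frozen(self):
        # Produce a detached snapshot: copy the values and recurse into
        # children, dropping any parent/caching links.
        return Node(self.value, [c.frozen() for c in self.children])

    def __deepcopy__(self, memo):
        # Route deepcopy through the already-tested frozen() path.
        return self.frozen()

root = Node(1, [Node(2), Node(3)])
dup = copy.deepcopy(root)
print(dup is root, [c.value for c in dup.children])  # False [2, 3]
```

The design choice is that `deepcopy` reuses a code path the library already exercises, instead of asking Python's generic copier to walk a cached graph it doesn't understand.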
Thanks @tacaswell. It seems to me that scales can use frozen transforms, i.e. they never depend on any children, do they? If a transform were frozen, then copying it would be safe. But maybe I'm not following the subtleties.
They could be frozen, but also, I don't think scales use any transforms outside of the …
So …
So if I add

```python
def __deepcopy__(self, memo):
    return self.frozen()
```

to `Transform`, then

```python
import matplotlib.scale as mscale
import copy

sc = mscale.LogScale(axis='x', base=10)
sc2 = copy.deepcopy(sc)
print(sc2)
print('done')

import matplotlib.colors as mcolor

norm = mcolor.LogNorm()
norm.vmin = -10
norm2 = copy.deepcopy(norm)
print(norm2, norm2.vmin)
```

returns what I'd expect.
I somewhat hesitate to do all this because the transform stack seems delicate, but given that …
I'm going to be a little bit greedy, and mark this as Release critical. I think plotting multiple images with the same norm, but automatically autoscaling each one (which requires individual (deep)copies of the original norm object), is a reasonable use case, and this is a regression from 3.3.x. |
I mean there is a PR for it. Does it not fix the problem? @tacaswell seemed to have some issue with it but I frankly did not fully understand what it was. |
Bug report

Bug summary

As of 0d1d95c on the master branch it is no longer possible to deepcopy a `LogNorm` instance, which is a regression from 3.3.x.

Code for reproduction
Actual outcome
Expected outcome
No error
Matplotlib version: master (at 0d1d95c)