FIX: Allow deepcopy on norms and scales #18126


Closed

Conversation

@jklymak (Member) commented Jul 30, 2020

PR Summary

Closes #18119

I guess it's an API change to change the return type of scale.LinearScale.get_transform?

This needs tests, but the following now works instead of raising an error:

import matplotlib.scale as mscale
import copy

# Deep-copying a scale used to raise an error; it now works.
sc = mscale.LogScale(axis='x', base=10)
sc2 = copy.deepcopy(sc)

print(sc2)
print('done')

import matplotlib.colors as mcolor

# Deep-copying a norm that is derived from a scale (here LogNorm) also
# works now, and the copy's state is independent of the original.
norm = mcolor.LogNorm()
norm.vmin = -10
norm2 = copy.deepcopy(norm)

print(norm2, norm2.vmin)
norm2.vmin = 3
print(norm2.vmin)

# Ordinary plotting is unaffected.
import matplotlib.pyplot as plt
import numpy as np

fig, ax = plt.subplots()
ax.plot(np.arange(10))
plt.show()

PR Checklist

  • Has Pytest style unit tests
  • Code is Flake 8 compliant
  • New features are documented, with examples if plot related
  • Documentation is sphinx and numpydoc compliant
  • Added an entry to doc/users/next_whats_new/ if major new feature (follow instructions in README.rst there)
  • Documented in doc/api/next_api_changes/* if API changed in a backward-incompatible way

@jklymak jklymak added the Release critical and topic: transforms and scales labels Jul 30, 2020
@jklymak jklymak added this to the v3.3.1 milestone Jul 30, 2020
@tacaswell (Member)

This seems like too heavy-handed a fix. We are trying to fix deep-copying of norms, but are enabling deep-copying of all of the scales. The scales participate in the transform stack getting from data -> screen, so I'm not sure that making just some of the things in the tree deep-copyable is the right choice.
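
For orientation, the transform stack in question can be inspected directly; ax.transData prints as a composite tree with the scale transforms at the data end:

    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    # ax.transData maps data space -> screen space; it is a composite of
    # the scale transforms (data end) with the axes and figure transforms.
    print(ax.transData)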

@jklymak (Member, Author) commented Jul 30, 2020

Well, the problem is not deep-copying norms per se; it's deep-copying norms that are derived from scales, which is currently just LogNorm and SymLogNorm.

The point is that the scales are at the end of the transform tree, so locking their transforms out of deepcopy is not necessary. I don't think that resizing the figure or zooming the axes is going to change the scale's Transform. Is there some way this can break?

The hackier way to fix this is to write a __deepcopy__ for the norms that just makes a new norm with the internal state of the old norm. That's less elegant, in that it band-aids the problem, but I'm happy to take that approach instead if that is what is preferred.
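
For reference, a minimal sketch of what such a __deepcopy__ could look like; the class and attribute names here are illustrative stand-ins, not matplotlib's actual internals:

    import copy

    class ToyNorm:
        """Stand-in for a norm whose default deepcopy is blocked because
        it holds a reference to an un-deep-copyable scale transform."""
        def __init__(self, vmin=None, vmax=None):
            self.vmin = vmin
            self.vmax = vmax
            self._scale_transform = object()  # placeholder for the scale's transform

        def __deepcopy__(self, memo):
            # Rebuild a fresh norm from the old norm's simple state,
            # side-stepping the transform that refuses to be copied.
            new = type(self)()
            new.vmin = copy.deepcopy(self.vmin, memo)
            new.vmax = copy.deepcopy(self.vmax, memo)
            return new

    norm = ToyNorm(vmin=-10)
    norm2 = copy.deepcopy(norm)
    norm2.vmin = 3                # independent of the original
    print(norm.vmin, norm2.vmin)  # -10 3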

@jklymak jklymak marked this pull request as ready for review July 30, 2020 15:39
@dstansby (Member)

Haven't had a chance to look at this yet, but I think this should be targeted to 3.4 (which #16457 is targeted for), and not 3.3.

@dstansby dstansby modified the milestones: v3.3.1, v3.4.0 Jul 30, 2020
@brianpenghe

I don't know whether this is related, but I had this issue.

@jklymak (Member, Author) commented Dec 8, 2020

@tacaswell can you weigh in again on this if you are still concerned?

@anntzer (Contributor) commented Dec 8, 2020

Looking at things again (it's been a while), I am actually slightly confused about why copying and deepcopying of transforms is forbidden in general (perhaps I completely missed the point, in which case I apologize in advance). In #18119 (comment), @tacaswell wrote:

    The issue with transforms is that they are a branching, partially-cached call graph. That is, ax.transData both has a cached version of the transform from data space -> physical screen space, and knows that it is a compound of the figure transform, the dpi transform, and the axes transform. If any of its parents get changed, it is notified that it has to be re-computed. I suspect that the number of links in that chain is why it is forbidden (along with the theory that it is better to not allow something we don't need internally than it is to expose a wrong functionality).

but actually, AFAICT, it is the modified node that is in charge of notifying its dependents ("parents") that it has been modified and that they (the dependents) need to be invalidated. IOW, if c = a + b (i.e. c = CompositeGenericTransform(a, b)), then if a gets modified, we call a.invalidate(), which itself calls c.invalidate() (or rather c._invalidate_internal()). So if we make a copy of c (c1 = copy(c)), that's the same as c1 = a + b, which will also register c1 among a's dependents, and it will likewise get invalidated whenever a gets modified. If we instead make a deepcopy of c, then we also create new versions of a and b, so things are even simpler (invalidation of the two trees is unrelated). I guess the more interesting case is if we make a copy a1 of a: in that case, c should not be counted as a dependent of a1 (*), but that just means copying should clear the _parents list?

(*) That's the normal semantics of copying: if you have a = [1, 2]; b = {"a": a}; c = copy.copy(a), then modifying c does not affect b (whereas modifying a does modify b).
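
A small runnable check of the c = a + b invalidation behavior described above, using real matplotlib transforms (the printed values assume exactly these operations):

    import matplotlib.transforms as mtransforms

    a = mtransforms.Affine2D().scale(2.0)
    b = mtransforms.Affine2D().translate(1.0, 0.0)
    c = a + b   # a composite transform; c is registered as a's dependent

    print(c.transform([[1.0, 1.0]]))  # [[3. 2.]]
    a.scale(2.0)                      # mutating a invalidates c's cached result
    print(c.transform([[1.0, 1.0]]))  # [[5. 4.]]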

@anntzer anntzer removed their request for review December 8, 2020 18:22
@jklymak jklymak force-pushed the fix-make-norms-scales-copy branch 4 times, most recently from ed82fab to 772e1a2 on January 5, 2021 22:05
@@ -79,6 +79,18 @@ def limit_range_for_scale(self, vmin, vmax, minpos):
return vmin, vmax


class _TransformScaleCopyable():
Member:

Suggested change
class _TransformScaleCopyable():
class _CopyableTransfromMixin():
"""Mixin to support deep copy on a transfrom."""

I don't think we need "scale" in here. It's just the copy feature that we provide, and it doesn't have to say anything about who may use it.

While historically we did not call mixins *Mixin, I think it's good style to do so, and there is precedent with _BlendedMixin.

Member Author:

sure, plus or minus spelling ;-)
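
For context, such a mixin is essentially a small opt-in that restores copy support on classes whose base blocks it. A rough sketch of the idea, not the PR's actual code; the _parents bookkeeping follows the invalidation discussion above, and the names here are illustrative:

    import copy

    class _CopyableTransformMixin:
        """Illustrative mixin re-enabling copy/deepcopy on a transform."""

        def __deepcopy__(self, memo):
            cls = type(self)
            new = cls.__new__(cls)
            # Copy all state except the dependents registry, which must
            # not carry over to an independent copy.
            state = {k: v for k, v in vars(self).items() if k != '_parents'}
            new.__dict__.update(copy.deepcopy(state, memo))
            new._parents = {}
            return new

        def __copy__(self):
            cls = type(self)
            new = cls.__new__(cls)
            state = {k: v for k, v in vars(self).items() if k != '_parents'}
            new.__dict__.update(state)
            new._parents = {}
            return new

    class _Demo(_CopyableTransformMixin):
        def __init__(self):
            self._parents = {}
            self.value = 1

    d = _Demo()
    d2 = copy.deepcopy(d)  # independent copy with a cleared _parents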

@jklymak jklymak force-pushed the fix-make-norms-scales-copy branch from 772e1a2 to f7baacb on January 5, 2021 22:20
@QuLogic (Member) commented Jan 5, 2021

That unfortunately broke the docs, since the mixin became undocumented.

@jklymak (Member, Author) commented Jan 5, 2021

I guess there is no fundamental reason for it not to be public?

@jklymak jklymak force-pushed the fix-make-norms-scales-copy branch from f7baacb to 01049f1 on January 5, 2021 23:33
@brunobeltran (Contributor) left a comment

Flagging this because it doesn't match my understanding of what we agreed was best on the call today. Worst case, I'm wrong, but better safe than sorry.

That said, it's a pretty simple fix.

Comment on lines 93 to 94
class LinearTransform(CopyableTransformMixin, IdentityTransform):
pass
Contributor:

Sorry to nitpick about this, but I think it makes a lot more sense to instead have

class IdentityTransform(CopyableTransformMixin, Transform)

in transforms.py. Then you can just return IdentityTransform below and not have to create the LinearTransform class at all. That was my takeaway from the call today, but @timhoffm approved already, so I may be missing something?

Member Author:

What is the difference between this and doing it in scale.py, where it is needed? We really don't particularly want people using a special version of transforms.IdentityTransform, do we?

Contributor:

Ah, I emphasized the wrong thing. The point is to not make a special Transform class at all, but instead to add copy and deepcopy ability to the real IdentityTransform (using the mixin). Then you don't need the extra class at all, I think.

Member Author:

Sure, I understand. But my point is that all the other scales have a named transform associated with them; I think it's fine for the linear scale to have a (trivial) LinearTransform associated with it. The advantage is that the goofy mixin is confined to scale.py, and we don't pollute transforms.py.

The discussion about IdentityTransform on the call was pointing out that it definitely didn't need a stricture against copying, but to my mind that just goes to @anntzer's overall point that copying of all the transforms could be reconsidered. But that is a 3.5 sort of thing, not a release-critical 3.4 fix.

Member:

I agree with @brunobeltran (@brunobeltran, thanks for catching this; I was too concerned with naming the mixin 😄).

We have scales, and transforms that are associated with them. For nonlinear scales we need special transforms, and it's natural that a LogScale is connected to a LogTransform. However, for a linear scale we don't need a transform (or only the identity). Making a LinearTransform, which is effectively an IdentityTransform, only for the sake of matching the name is more confusing than helpful (that's also what got me confused in the first place in yesterday's discussion).

Note also the docstring of LinearScale.get_transform():

   Return the transform for linear scaling, which is just the
   `~matplotlib.transforms.IdentityTransform`.

Given that the mixin is private again, we're not polluting the namespace of transforms.py if we put it there.

Member Author:

Sure, but it is strange to have a mixin that we use once in the transforms module and a bunch of times in the scale module, particularly when the mixin is really only needed for the scales to work. I think it just makes things unnecessarily complicated.

@brunobeltran (Contributor) commented Jan 7, 2021

I think which solution is more complicated is in the eye of the beholder. I personally find the purpose of the mixin to be clear from its name. On the other hand, an equivalent solution (which it sounds like you like better) would be to just have:

(in transforms.py):

class _CopyableTransformMixin():
    ...

class IdentityTransform(Affine2DBase, _CopyableTransformMixin):
    ...

class _CopyableTransform(Transform, _CopyableTransformMixin):
    pass

then in scale.py, we simply do

from matplotlib.transforms import ..., _CopyableTransform

class FuncTransform(_CopyableTransform):
    ...

Then there's not a mixin scattered over two files, the semantics of what's going on are confined to just transforms.py, and what we get is that IdentityTransform now copies correctly, and no users are confused about what this "LinearTransform" object is that quacks like IdentityTransform but has a different name.

(Bonus) Whoever ends up making all Transforms copyable in the future has a nice framework to start working within.

Member Author:

I like the current solution better, where nothing changes in transforms.py and all the changes are in scale.py.

I don't think this is the correct framework for any fundamental change to Transform that allows it to be copyable in general. That will require substantially more state checking, etc., beyond a simple mixin.

Member:

I'd like to keep this with no additional public API for now (other than supporting deepcopy on norms and scales). In particular, I don't want to introduce a funky LinearTransform that is basically an IdentityTransform.

That implies we have to make IdentityTransform itself copyable, i.e. there has to be a change to transforms.py, though it can all be private. I have no strong preference concerning a mixin on all scale transforms vs. a _CopyableTransform base class (partly also because it can all be kept private, and so we can change it should the need arise).

Member Author:

I made IdentityTransform copyable by adding a __deepcopy__ (and __copy__) to it, and removed LinearTransform from transforms.py.
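
Conceptually, that change amounts to something like the following hedged sketch, using stand-in classes rather than matplotlib's actual code:

    import copy

    class _Node:
        """Stand-in for matplotlib's TransformNode, which blocks copying."""
        def __copy__(self, *args):
            raise NotImplementedError("TransformNode instances cannot be copied")
        __deepcopy__ = __copy__

    class IdentityTransform(_Node):
        """Stateless identity transform, so copying is trivially safe."""
        def __copy__(self):
            return type(self)()

        def __deepcopy__(self, memo):
            # No mutable state and no dependents to carry over, so a
            # fresh instance is a correct deep copy.
            return type(self)()

    t = IdentityTransform()
    t2 = copy.deepcopy(t)  # now works instead of raising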

@jklymak jklymak requested a review from brunobeltran January 6, 2021 18:59
@timhoffm timhoffm self-requested a review January 7, 2021 23:33
@jklymak jklymak force-pushed the fix-make-norms-scales-copy branch 2 times, most recently from 713e675 to 4699c0c on January 8, 2021 16:39
@jklymak (Member, Author) commented Jan 8, 2021

I don't think the test failures are due to this PR. I can rerun after review.

@timhoffm (Member) left a comment

Modulo the style comment.

@jklymak jklymak force-pushed the fix-make-norms-scales-copy branch 3 times, most recently from 0cac039 to a7439b6 on January 9, 2021 17:32
@jklymak jklymak force-pushed the fix-make-norms-scales-copy branch from a7439b6 to e152f02 on January 10, 2021 01:39
@jklymak (Member, Author) commented Jan 11, 2021

@brunobeltran, you are still blocking this. I believe your concerns have been addressed.

@anntzer (Contributor) commented Jan 12, 2021

Sorry for being late to the party, but perhaps #19281 would be more general?

@jklymak jklymak marked this pull request as draft January 12, 2021 21:38
@jklymak (Member, Author) commented Jan 12, 2021

Put on hold until #19281 gets decided.

@jklymak (Member, Author) commented Jan 13, 2021

@anntzer, does it make sense to get this in for 3.4 and then leave your more comprehensive change for a future release? I can only imagine that testing it is going to be a bit of a challenge.

@anntzer (Contributor) commented Jan 13, 2021

I'll try to join the call tomorrow and we can discuss then?

@timhoffm (Member)

Superseded by #19281.

@timhoffm timhoffm closed this Jan 17, 2021
Linked issue: Can no longer deepcopy LogNorm objects on master