FIX: get_datalim for collection #13642
@@ -146,6 +146,8 @@ def __init__(self,
         self._joinstyle = None

         self._offsets = np.zeros((1, 2))
+        # save if offsets passed in were none...
+        self._offsetsNone = offsets is None
         self._uniform_offsets = None
         if offsets is not None:
             offsets = np.asanyarray(offsets, float)
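A minimal sketch (not part of the PR) of what the new attribute records; the private name _offsetsNone is taken from the hunk above and simply remembers whether offsets were passed to the constructor:

    import numpy as np
    from matplotlib.collections import Collection

    no_offsets = Collection()                                    # offsets left as None
    with_offsets = Collection(offsets=np.array([[0.0, 0.0], [1.0, 2.0]]))

    print(no_offsets._offsetsNone)    # True  -> offsets alone won't drive autoscaling
    print(with_offsets._offsetsNone)  # False -> offsets may be used for the data limits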
@@ -179,9 +181,30 @@ def get_offset_transform(self):
         return t

     def get_datalim(self, transData):
+
+        # Get the automatic datalim of the collection.
+        #
+        # This operation depends on the transforms for the data in the
+        # collection and whether the collection has offsets.
+        #
+        # 1) offsets = None, transform child of transData: use the paths for
+        # the automatic limits (i.e. for LineCollection in streamline).
+        # 2) offsets != None, offset_transform is child of transData:
+        #    a) transform is child of transData: use the path + offset for
+        #       limits (i.e. for bar).
+        #    b) transform is not a child of transData: just use the offsets
+        #       for the limits (i.e. for scatter)
+        # 3) otherwise return a null Bbox.
+
         transform = self.get_transform()
         transOffset = self.get_offset_transform()
+        if (not self._offsetsNone and
+                not transOffset.contains_branch(transData)):
Comment: I guess(?) you could get a few extra points by checking […]

Reply: let's just forget this case for now

Reply: As discussed on gitter, I think that's best for simplicity's sake. For instance, I don't know what the datalim handling does with infs, and adding this case is a good chunk of code. As also discussed, very happy to revisit if it turns out to be needed.
+            # if there are offsets but in some co-ords other than data,
+            # then don't use them for autoscaling.
+            return transforms.Bbox.null()
         offsets = self._offsets
+
         paths = self.get_paths()

         if not transform.is_affine:
@@ -196,13 +219,30 @@ def get_datalim(self, transData):
             # get_path_collection_extents handles nan but not masked arrays

         if len(paths) and len(offsets):
-            result = mpath.get_path_collection_extents(
-                transform.frozen(), paths, self.get_transforms(),
-                offsets, transOffset.frozen())
-            result = result.inverse_transformed(transData)
-        else:
-            result = transforms.Bbox.null()
-        return result
+            if transform.contains_branch(transData):
+                # collections that are just in data units (like quiver)
+                # can properly have the axes limits set by their shape +
+                # offset. LineCollections that have no offsets can
+                # also use this algorithm (like streamplot).
+                result = mpath.get_path_collection_extents(
+                    transform.frozen(), paths, self.get_transforms(),
+                    offsets, transOffset.frozen())
+                return result.inverse_transformed(transData)
+            if not self._offsetsNone:
+                # this is for collections that have their paths (shapes)
+                # in physical, axes-relative, or figure-relative units
+                # (i.e. like scatter). We can't uniquely set limits based on
+                # those shapes, so we just set the limits based on their
+                # location.
+                # Finish the transform:
Comment: In the same spirit as #14845 this could be […] (you also save a transform by doing so).

Reply: I wasn't paying attention when #14845 went in. This requires remembering that transA - transB is the same as what I wrote here. I don't think it helps the readability/comprehensibility of all the transform magic to have arithmetic operators that do stuff versus just using explicit methods.

Reply: The main operator is trA + trB, which just means "perform trA, then perform trB"; once you know it, it's quite natural that trA - trB means "perform trA, then perform the inverse of trB". You could write […]

Reply: Good point. Split the difference: […]
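As an aside on the operator shorthand discussed in this thread, a minimal sketch (not from the PR) showing that for matplotlib transforms a - b composes a with the inverse of b, i.e. it matches the explicit a + b.inverted() spelling used in the code below; the two Affine2D instances are only stand-ins for transOffset and transData:

    import numpy as np
    from matplotlib.transforms import Affine2D

    a = Affine2D().scale(2.0).translate(1.0, -3.0)   # stand-in for transOffset
    b = Affine2D().scale(0.5)                        # stand-in for transData

    pts = np.array([[0.0, 0.0], [1.0, 2.0]])

    # "perform a, then perform the inverse of b", spelled two ways:
    explicit = (a + b.inverted()).transform(pts)
    shorthand = (a - b).transform(pts)

    assert np.allclose(explicit, shorthand)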
+                offsets = (transOffset +
+                           transData.inverted()).transform(offsets)
+                offsets = np.ma.masked_invalid(offsets)
+                if not offsets.mask.all():
+                    points = np.row_stack((offsets.min(axis=0),
+                                           offsets.max(axis=0)))
+                    return transforms.Bbox(points)
+        return transforms.Bbox.null()

     def get_window_extent(self, renderer):
         # TODO: check to ensure that this does not fail for
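To illustrate the case analysis in the comments above, a hypothetical example (not from the PR) assuming the post-change behaviour, where scatter-like collections contribute only their offsets to the data limits while a LineCollection drawn in data coordinates contributes its paths:

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.collections import LineCollection

    fig, (ax1, ax2) = plt.subplots(2)

    # Case 1: no offsets, paths in data coordinates (streamplot-style);
    # the paths themselves drive the data limits.
    x = np.linspace(0, 10, 50)
    ax1.add_collection(LineCollection([np.column_stack([x, np.sin(x)])]))
    print(ax1.dataLim)   # roughly x: 0..10, y: -1..1

    # Case 2b: offsets in data coordinates, marker shapes sized in points
    # (scatter-style); only the offsets drive the data limits, no matter
    # how large the markers are drawn.
    ax2.scatter([1, 2, 3], [4, 5, 6], s=2000)
    print(ax2.dataLim)   # x: 1..3, y: 4..6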
@@ -1299,7 +1339,6 @@ def __init__(self, segments, # Can be None.
            antialiaseds = (mpl.rcParams['lines.antialiased'],)

        colors = mcolors.to_rgba_array(colors)

        Collection.__init__(
            self,
            edgecolors=colors,