MAINT: Use vectorization in plot_trisurf, simplifying greatly #9991
Conversation
Force-pushed from 5d63231 to 5e1b08b
codecov/patch is picking up the fact that `shade=True` was not tested before this patch, and that the lines that were tested have become a minority now that they are combined into one.
lib/mpl_toolkits/mplot3d/axes3d.py
Outdated
```python
    if vmin is not None or vmax is not None:
        polyc.set_clim(vmin, vmax)
    if norm is not None:
        polyc.set_norm(norm)
else:
    if shade:
        v1 = verts[:,0,:] - verts[:,1,:]
        v2 = verts[:,1,:] - verts[:,2,:]
        normals = np.cross(v1, v2)
```
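For anyone skimming, here is a minimal standalone sketch of what this vectorized block computes, assuming `verts` is an `(ntri, 3, 3)` array of triangle vertices (hypothetical data, not from the PR):

```python
import numpy as np

verts = np.random.rand(4, 3, 3)  # 4 triangles, 3 vertices each, (x, y, z)

# Two edge vectors per triangle, computed for all triangles at once.
v1 = verts[:, 0, :] - verts[:, 1, :]  # shape (4, 3)
v2 = verts[:, 1, :] - verts[:, 2, :]  # shape (4, 3)

# One face normal per triangle, with no Python-level loop.
normals = np.cross(v1, v2)            # shape (4, 3)
```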
If I combined these into one less-readable line, then your code coverage tests would pass. Aren't metrics great?
The other way of looking at that is the coverage tests would pass if this bit of code was tested!
It's not as if we really cared about what codecov says...
@dstansby: I suppose the way of looking at it is not "here's a die that's rolled which will randomly reject your PR", but "here's a die that, when rolled, might let you off a test".
I've added the test now :)
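For reference, a minimal sketch of what such a smoke test could look like (hypothetical test name and data; the actual test added to the PR may differ):

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: registers the 3d projection on older matplotlib

def test_trisurf3d_shaded():
    # Exercise the shade=True code path of plot_trisurf.
    rng = np.random.RandomState(0)
    x, y = rng.rand(2, 30)
    z = np.sin(x) * np.cos(y)
    fig = plt.figure()
    ax = fig.add_subplot(111, projection='3d')
    ax.plot_trisurf(x, y, z, shade=True)
    fig.canvas.draw()
```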
There are a couple of PEP8 issues; because the checker is disabled for this (extremely inconsistent) file, Travis doesn't complain, but that doesn't mean we can't fix them while modifying the file.
Otherwise, I think this PR is good.
lib/mpl_toolkits/mplot3d/axes3d.py
Outdated
```python
colset = np.array(colset)
polyc.set_array(colset)
# average over the three points of each triangle
avg_z = verts[:,:,2].mean(axis=1)
```
Space after comma.
I always seem to go with omitting them in indexing expressions. But I guess PEP8 doesn't call out indexing with a tuple specifically, and if that's the style matplotlib is (trying to be) using, then I'll go with it.
lib/mpl_toolkits/mplot3d/axes3d.py
Outdated
```python
    if vmin is not None or vmax is not None:
        polyc.set_clim(vmin, vmax)
    if norm is not None:
        polyc.set_norm(norm)
else:
    if shade:
        v1 = verts[:,0,:] - verts[:,1,:]
        v2 = verts[:,1,:] - verts[:,2,:]
```
Space after comma.
Force-pushed from 5e1b08b to f0077d2
@QuLogic: Updated
Force-pushed from f0077d2 to cb05203
This seems a bit bigger than when I first reviewed it?
```python
        normals = np.cross(v1, v2)
    else:
        normals = []
# verts = np.stack((xt, yt, zt), axis=-1)
```
Are they 1D or 2D? `np.vstack` or `np.dstack` could also be used in those cases.
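For concreteness, a quick look at how those alternatives behave on 2D inputs (a hypothetical shape check, not from the PR):

```python
import numpy as np

xt = yt = zt = np.zeros((5, 3))  # hypothetical 2D inputs of shape (N, 3)

np.vstack((xt, yt, zt)).shape          # (15, 3)  -- concatenates along axis 0
np.dstack((xt, yt, zt)).shape          # (5, 3, 3) -- promotes to 3D, stacks along the last axis
np.stack((xt, yt, zt), axis=-1).shape  # (5, 3, 3) -- explicitly adds a new last axis
```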
Yep, this was a new change since you last reviewed. Both `vstack` and `dstack` have the undesirable semantics of guessing the array `.ndim`, so using `concatenate` is more precise.

This isn't the only place that `np.stack` is mentioned in a comment above a `np.concatenate((a[..., None], ...))`, so it might be worth backporting as the very simple

```python
def stack(arrays, axis):
    return np.concatenate([np.expand_dims(arr, axis) for arr in arrays], axis=axis)
```

(my first attempt used `arr[..., np.newaxis]`, which is only correct for `axis=-1`).

Not something I want to involve in this PR though
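As a quick sanity check of that backport against `np.stack` itself (hypothetical, assuming a NumPy recent enough to have `np.stack` for comparison):

```python
import numpy as np

def stack(arrays, axis):
    # the backport sketched above
    return np.concatenate([np.expand_dims(arr, axis) for arr in arrays], axis=axis)

a = np.arange(6).reshape(2, 3)
b = np.arange(6, 12).reshape(2, 3)

for axis in (0, 1, -1):
    assert np.array_equal(stack((a, b), axis), np.stack((a, b), axis=axis))
```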
You can just stick the definition in https://github.com/matplotlib/matplotlib/blob/master/lib/matplotlib/cbook/_backports.py if you change your mind.
Oh, and to answer your question: they're 2D, of shape `(N, edges_in_triangle)`.
We do have a `cbook._backports`; not sure how strong the need is for it without having done any looking.
I reckon there are a substantial number of cases; I can put together a PR once this is merged.
Looking again, a `stack_last` function is all that ever seems to be needed, implemented as the above, so that wouldn't belong in `_backports`. Suggestions of where to put such a helper?
cbook would be the place. But given that we'd probably just be using `stack(..., -1)` "if it were available", I'd just backport `stack`. Let us know if you want to do this in this PR or a separate one.
Definitely a separate one
Force-pushed from cb05203 to 100f167
Use it to compute surface normals and mean z coord, rather than manual looping
Force-pushed from 100f167 to e36bb0b
Quick skim looks fine; a second pair of eyes (@QuLogic? :-)) would be good.
edit: Ah, you already approved it.
Cleanup only, no behavior changes. I'd expect considerably better performance on large arrays, since fewer copies are made and the operations can be vectorized.
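A rough way to check that expectation, for anyone curious (a hypothetical micro-benchmark comparing the old per-triangle loop with the vectorized version; numbers will vary by machine):

```python
import timeit
import numpy as np

verts = np.random.rand(100000, 3, 3)  # 100k triangles

def looped():
    # per-triangle loop, roughly what the old code did
    return np.array([np.cross(t[0] - t[1], t[1] - t[2]) for t in verts])

def vectorized():
    # one whole-array cross product, as in this PR
    return np.cross(verts[:, 0] - verts[:, 1], verts[:, 1] - verts[:, 2])

assert np.allclose(looped(), vectorized())
print('loop      :', timeit.timeit(looped, number=1))
print('vectorized:', timeit.timeit(vectorized, number=1))
```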