MAINT: Use vectorization in plot_trisurf, simplifying greatly #9991


Merged: 1 commit merged into matplotlib:master on Feb 3, 2018

Conversation

eric-wieser (Contributor):

Cleanup only, no behavior changes.

I'd expect considerably better performance on large arrays, since there are fewer copies being made, and the operations can be vectorized
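To make the performance claim concrete, here is a small sketch (not code from the patch itself; the array shape and data are made up for illustration) comparing the looped style with the vectorized style of computing per-triangle normals:

```python
import numpy as np

# Hypothetical triangle vertex array: shape (n_triangles, 3, 3),
# i.e. each triangle has 3 vertices with (x, y, z) coordinates.
rng = np.random.default_rng(0)
verts = rng.standard_normal((1000, 3, 3))

# Looped style (pre-patch): one np.cross call per triangle.
normals_loop = np.array([
    np.cross(tri[0] - tri[1], tri[1] - tri[2]) for tri in verts
])

# Vectorized style (this patch): one np.cross call over the whole array.
v1 = verts[:, 0, :] - verts[:, 1, :]
v2 = verts[:, 1, :] - verts[:, 2, :]
normals_vec = np.cross(v1, v2)

# Both produce the same (n_triangles, 3) array of normals.
assert np.allclose(normals_loop, normals_vec)
```

The vectorized version avoids the per-triangle Python-level overhead and the intermediate list, which is where the speedup on large arrays comes from.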

@eric-wieser force-pushed the normals-cleanup branch 2 times, most recently from 5d63231 to 5e1b08b, on December 13, 2017 07:41
@eric-wieser changed the title from "MAINT: Use broadcasting to compute surface normals, rather than manual looping" to "MAINT: Use vectorization in plot_trisurf, simplifying greatly" on Dec 13, 2017
eric-wieser (Contributor, Author) commented:
codecov/patch is picking up the fact that shade=True was not tested before this patch, and that all the lines that were tested have been made a minority due to being combined into one.

        if vmin is not None or vmax is not None:
            polyc.set_clim(vmin, vmax)
        if norm is not None:
            polyc.set_norm(norm)
    else:
        if shade:
            v1 = verts[:,0,:] - verts[:,1,:]
            v2 = verts[:,1,:] - verts[:,2,:]
            normals = np.cross(v1, v2)
eric-wieser (Contributor, Author) commented on Dec 13, 2017:

If I combined these into one less-readable line, then your code coverage tests would pass. Aren't metrics great?

Member commented:

The other way of looking at that is the coverage tests would pass if this bit of code was tested!

Contributor commented:

It's not as if we really cared about what codecov says...

eric-wieser (Contributor, Author) replied:

@dstansby: I suppose the way of looking at it is not "here's a dice that's rolled which will randomly reject your PR", but "here's a dice that when rolled might let you off a test".

I've added the test now :)

@tacaswell tacaswell added this to the v2.2 milestone Dec 13, 2017
QuLogic (Member) left a review:

There are a couple of PEP8 issues. Because PEP8 checking is disabled for this (extremely inconsistent) file, Travis doesn't complain, but that doesn't mean we can't fix them while modifying the file.

Otherwise, I think this PR is good.

    colset = np.array(colset)
    polyc.set_array(colset)
    # average over the three points of each triangle
    avg_z = verts[:,:,2].mean(axis=1)
Member commented:

Space after comma.

eric-wieser (Contributor, Author) replied:

I always seem to go with omitting them in indexing expressions. But I guess PEP8 doesn't call out indexing with a tuple specifically, and if that's the style matplotlib is (trying to be) using, then I'll go with it.

        if vmin is not None or vmax is not None:
            polyc.set_clim(vmin, vmax)
        if norm is not None:
            polyc.set_norm(norm)
    else:
        if shade:
            v1 = verts[:,0,:] - verts[:,1,:]
            v2 = verts[:,1,:] - verts[:,2,:]
Member commented:

Space after comma.

eric-wieser (Contributor, Author):

@QuLogic: Updated

QuLogic (Member) commented on Dec 14, 2017:

This seems a bit bigger than when I first reviewed it?

            normals = np.cross(v1, v2)
        else:
            normals = []
    # verts = np.stack((xt, yt, zt), axis=-1)
Member commented:

Are they 1D or 2D? np.vstack or np.dstack could also be used in those cases.

eric-wieser (Contributor, Author) replied on Dec 14, 2017:

Yep, this was a new change since you last reviewed. Both vstack and dstack have undesirable semantics of guessing the array .ndim, so using concatenate is more precise.
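A small sketch of the ndim-guessing being described (toy arrays, purely for illustration): np.vstack and np.dstack first promote their inputs to at least 2-D or 3-D, so the result shape depends on the inputs' ndim, whereas np.concatenate over an explicitly inserted axis is unambiguous.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# vstack treats 1-D inputs as rows; dstack promotes to 3-D first.
print(np.vstack((a, b)).shape)   # (2, 3)
print(np.dstack((a, b)).shape)   # (1, 3, 2) -- a leading axis appears

# concatenate with an explicitly inserted last axis is precise:
stacked = np.concatenate((a[..., None], b[..., None]), axis=-1)
print(stacked.shape)             # (3, 2)
```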

This isn't the only place where np.stack is mentioned in a comment above an np.concatenate((a[..., None], ...)), so it might be worth backporting it as the very simple

    def stack(arrays, axis):
        return np.concatenate([arr[..., np.newaxis] for arr in arrays], axis)

- (oops, not correct)

Not something I want to involve in this PR though
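For reference, a corrected sketch of such a backport (not the snippet above, and not necessarily what was ultimately adopted): inserting the new axis at the requested position, rather than always at the end, makes it agree with np.stack for any axis.

```python
import numpy as np

def stack(arrays, axis=0):
    # Insert a new axis at `axis` in each input, then
    # concatenate along that same axis -- mirroring np.stack.
    return np.concatenate([np.expand_dims(arr, axis) for arr in arrays],
                          axis=axis)

a = np.zeros((2, 3))
b = np.ones((2, 3))
assert stack((a, b), axis=0).shape == (2, 2, 3)
assert stack((a, b), axis=-1).shape == (2, 3, 2)
assert np.array_equal(stack((a, b), axis=1), np.stack((a, b), axis=1))
```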

Contributor commented:

You can just stick the definition in https://github.com/matplotlib/matplotlib/blob/master/lib/matplotlib/cbook/_backports.py if you change your mind.

eric-wieser (Contributor, Author) replied:

Oh, and to answer your question: 2D, of shape (N, edges_in_triangle).

Member commented:

We do have a cbook._backports; not sure how strong the need is for it without having done any looking.

eric-wieser (Contributor, Author) replied:

I reckon there are a substantial number of cases - I can put together a PR once this is merged.

eric-wieser (Contributor, Author) added:

Looking again, a stack_last function is all that ever seems to be needed, implemented as the above - so that wouldn't belong in _backports. Suggestions of where to put such a helper?
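A sketch of the stack_last helper being proposed (a hypothetical name from this discussion, not an existing matplotlib function), implemented via the concatenate pattern already used in the file:

```python
import numpy as np

def stack_last(arrays):
    # Stack along a new last axis only: append an axis to each
    # input, then concatenate along it.
    return np.concatenate([arr[..., np.newaxis] for arr in arrays], axis=-1)

xt, yt, zt = (np.arange(6.0).reshape(2, 3) for _ in range(3))
verts = stack_last((xt, yt, zt))
assert verts.shape == (2, 3, 3)
assert np.array_equal(verts, np.stack((xt, yt, zt), axis=-1))
```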

Contributor commented:

cbook would be the place.
But given that we'd probably just be using stack(..., -1) "if it was available", I'd just backport stack.
Let us know if you want to do this in this PR or a separate one.

eric-wieser (Contributor, Author) replied:

Definitely a separate one

Commit message: Use it to compute surface normals and mean z coord, rather than manual looping
anntzer (Contributor) left a review:

Quick skim looks fine; a second pair of eyes (@QuLogic? :-)) would be good.
edit: Ah, you already approved it.

@anntzer anntzer merged commit 2ed292d into matplotlib:master Feb 3, 2018
@QuLogic QuLogic modified the milestones: needs sorting, v2.2.0 Feb 12, 2018
@eric-wieser eric-wieser deleted the normals-cleanup branch March 23, 2019 18:07