MAINT/BUG: Don't use 5-sided quadrilaterals in Axes3D.plot_surface #10001


Merged
merged 3 commits into matplotlib:master from eric-wieser:more-normals on Jul 9, 2018

Conversation

eric-wieser
Contributor

@eric-wieser eric-wieser commented Dec 14, 2017

Previously:

  • "cell" perimeters were clumsily calculated with duplicates, which were then removed at runtime
  • code to calculate normals was spread into multiple places
  • average z was calculated even if not used
  • repeated conversion between stride and count was done

Should have no visible behavior changes

This should be up to 25% faster, since the quadrilateral polygons now have 4 points, rather than 5 (!)

ztop = a[rs, cs:cs_next ]
zleft = a[rs:rs_next, cs_next ]
zbase = a[rs_next, cs_next:cs:-1]
zright = a[rs_next:rs:-1, cs ]
Contributor Author

This is the first time I've ever found myself using a:b:-1 and it doing exactly what I wanted!
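To make the slicing concrete, here is a small standalone sketch (toy 4x4 array, made-up rs/cs values, not the PR's actual variables) showing that the four half-open slices cover the cell perimeter exactly once:

import numpy as np

a = np.arange(16).reshape(4, 4)
rs, rs_next = 0, 3
cs, cs_next = 0, 3

ztop = a[rs, cs:cs_next]           # [0, 1, 2]
zleft = a[rs:rs_next, cs_next]     # [3, 7, 11]
zbase = a[rs_next, cs_next:cs:-1]  # [15, 14, 13]
zright = a[rs_next:rs:-1, cs]      # [12, 8, 4]

perimeter = np.concatenate([ztop, zleft, zbase, zright])
assert len(perimeter) == 12        # a 4x4 block has 12 boundary elements
assert len(set(perimeter)) == 12   # and no corner is visited twice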

for poly_i, ps in enumerate(polys):
    # pick three points around the polygon to find the normal at
    # hard to vectorize, since len(ps) is different at the edges
    i1, i2, i3 = 0, int(len(ps)/3), int(2*len(ps)/3)
Contributor Author

I feel like there's a much better approach here, but this is at least cheap, I suppose. Taking the middle point or even mean of each edge, and taking the cross product of the vectors joining opposite edges, would be a marginally better heuristic.

Either way, this is just cleanup, so I'm leaving it as is.
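For reference, a self-contained sketch of the heuristic as it stands in the loop above (the function name is just for illustration): sample three vertices roughly a third of the way around the polygon and cross the difference vectors.

import numpy as np

def poly_normal(ps):
    # ps: (n, 3) array of polygon vertices; pick three spread-out points
    # and cross the two difference vectors, as the loop above does
    ps = np.asarray(ps)
    i1, i2, i3 = 0, len(ps) // 3, 2 * len(ps) // 3
    return np.cross(ps[i1] - ps[i2], ps[i2] - ps[i3])

# a unit square in the x-y plane gives a normal along z (up to sign/scale)
print(poly_normal([(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]))  # [0 0 1]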

Contributor Author

Note that if I want to make the image output unchanged, then this needs to become (len(ps) + 1). Is per-pixel compatibility something that is cared about?


# evenly spaced, and including both endpoints
row_inds = list(xrange(0, rows-1, rstride)) + [rows-1]
col_inds = list(xrange(0, cols-1, cstride)) + [cols-1]


Member

Remove extra blank line.



#colset contains the data for coloring: either average z or the facecolor
#colset contains the sampled facecolors
Member

Space after #.

zbase = a[min(rows-1, rs+rstride), cs:min(cols, cs+cstride+1):][::-1]
zright = a[rs:min(rows-1, rs+rstride):, cs][::-1]
# the edges of the projected quadrilateral
# note we use pythons half-open ranges to avoid repeating
Member

Python's

# the edges of the projected quadrilateral
# note we use pythons half-open ranges to avoid repeating
# the corners
ztop = a[rs, cs:cs_next ]
Member

Spacing is against PEP8.

Contributor Author

Adds greatly to the clarity here though, so I'd maybe make an exception

avgzsum = sum(p[2] for p in ps2)
polys.append(ps2)
# ps = np.stack(ps, axis=-1)
ps = np.array(ps).T
Member

I wonder if there's some way of pre-allocating this with the lengths of row_inds and col_inds.

Contributor Author

No, because ps varies in length throughout the iteration

Member

@QuLogic QuLogic Dec 14, 2017

Err, I might have had the wrong loop; the lengths of bits sliced out of X, Y, Z are not known?

Contributor Author

@eric-wieser eric-wieser Dec 14, 2017

No. They correspond to the perimeters of the "square" cells, which are not always the same size - the cells look like:

+---+---+-+
|   |   | |
|   |   | |
+---+---+-+
|   |   | |
|   |   | |
+---+---+-+
|   |   | |
+---+---+-+

where the trailing edges contain non-uniformly sized cells. Perhaps we could vectorize across the uniform ones, and do the rest separately, but that's more work.
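A tiny sketch (made-up sizes) of how the endpoint-inclusive index lists produce that narrower trailing cell:

rows, rstride = 10, 4
row_inds = list(range(0, rows - 1, rstride)) + [rows - 1]
print(row_inds)                                         # [0, 4, 8, 9]
print([b - a for a, b in zip(row_inds, row_inds[1:])])  # [4, 4, 1] -- the last cell spans only 1 row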

Member

I think we're talking about different things; I mean ps itself which should be uniform since you're calling np.array on it, no?

Contributor Author

Ah, I see. Yes, we could preallocate to (2*(rs_next - rs) + 2*(cs_next - cs), 3)

Contributor Author

Ok, here's the answer - preallocating doesn't help, because concatenate has no out argument until numpy 1.14
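For context, this is roughly what the preallocation would have needed; np.concatenate only gained an out argument in numpy 1.14, so this wasn't available at the time (the shapes below are purely illustrative):

import numpy as np

# write three (2, 3) edge arrays into one preallocated (6, 3) buffer
top, left, base = np.zeros((2, 3)), np.ones((2, 3)), np.full((2, 3), 2.0)
buf = np.empty((6, 3))
np.concatenate([top, left, base], axis=0, out=buf)  # requires numpy >= 1.14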

# pick three points around the polygon to find the normal at
# hard to vectorize, since len(ps) is different at the edges
i1, i2, i3 = 0, int(len(ps)/3), int(2*len(ps)/3)
v1[poly_i] = ps[i1] - ps[i2]
Member

I prefer to be explicit with known 2D things, i.e., v1[poly_i, :] = ps[i1, :] - ps[i2, :]; I don't think there's any penalty for it.

Contributor Author

There is a small penalty associated with parsing tuple indices over scalar indices, but I suppose it's probably not significant.
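A quick way to check that claim, if anyone cares (micro-benchmark; numbers will vary by machine):

import timeit
import numpy as np

v = np.zeros((1000, 3))
print(timeit.timeit('v[10]', globals={'v': v}))     # scalar index
print(timeit.timeit('v[10, :]', globals={'v': v}))  # tuple index, slightly slower to parse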

@eric-wieser
Contributor Author

Looks like I got something wrong here...

@eric-wieser
Contributor Author

Style fixes made, will investigate failures once I get my test setup working again

@eric-wieser
Contributor Author

@QuLogic: Worked out why it was failing - previously, we were plotting every quadrilateral with 5 points, which changed the normal vector! I've updated the images, and added another test that makes shading more obvious, and tests striding.
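To spell out why the fifth (duplicated) vertex mattered: with len(ps) == 5 the three sampled indices land on different vertices than with 4, so non-planar quads get a different normal. Toy illustration:

for n in (4, 5):
    i1, i2, i3 = 0, n // 3, 2 * n // 3
    print(n, (i1, i2, i3))
# 4 -> (0, 1, 2)
# 5 -> (0, 1, 3)   # with ps[4] == ps[0] this is a different triple, hence a different normal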

# are removed here.
ps = list(zip(*ps))
lastp = np.array([])
ps2 = [ps[0]] + [ps[i] for i in xrange(1, len(ps)) if ps[i] != ps[i-1]]
Contributor Author

@eric-wieser eric-wieser Dec 15, 2017

This didn't detect the fact that ps[-1] == ps[0], which was always true

@eric-wieser eric-wieser changed the title MAINT: Simplify logic in Axes3D.plot_surface MAINT/BUG: Don't use 5-sided quadrilaterals in Axes3D.plot_surface Dec 15, 2017
@eric-wieser
Contributor Author

eric-wieser commented Dec 15, 2017

Failures are PEP8 failures - one line that's 80 characters long instead of 79, and the extra spacing that I argue is worth keeping.

@eric-wieser
Contributor Author

eric-wieser commented Mar 16, 2018

Ping.

The failures are the following PEP8 violations on these lines:

/home/travis/build/matplotlib/matplotlib/lib/mpl_toolkits/mplot3d/axes3d.py:1686:25: E221 multiple spaces before operator
                    ztop   = a[rs,             cs:cs_next   ]
                        ^
/home/travis/build/matplotlib/matplotlib/lib/mpl_toolkits/mplot3d/axes3d.py:1686:60: E202 whitespace before ']'
                    ztop   = a[rs,             cs:cs_next   ]
                                                           ^
/home/travis/build/matplotlib/matplotlib/lib/mpl_toolkits/mplot3d/axes3d.py:1687:26: E221 multiple spaces before operator
                    zleft  = a[rs:rs_next,     cs_next      ]
                         ^
/home/travis/build/matplotlib/matplotlib/lib/mpl_toolkits/mplot3d/axes3d.py:1687:60: E202 whitespace before ']'
                    zleft  = a[rs:rs_next,     cs_next      ]
                                                           ^
/home/travis/build/matplotlib/matplotlib/lib/mpl_toolkits/mplot3d/axes3d.py:1688:26: E221 multiple spaces before operator
                    zbase  = a[rs_next,        cs_next:cs:-1]
                         ^
/home/travis/build/matplotlib/matplotlib/lib/mpl_toolkits/mplot3d/axes3d.py:1689:60: E202 whitespace before ']'
                    zright = a[rs_next:rs:-1,  cs           ]
  1. Does this seem like a reasonable exception to the spacing rules?
  2. Do I need to mark it as such?

@anntzer
Contributor

anntzer commented Mar 16, 2018

I'm fine with putting a noqa.

@eric-wieser eric-wieser force-pushed the more-normals branch 2 times, most recently from ca04037 to 866135c on March 16, 2018 at 23:45
@eric-wieser
Contributor Author

eric-wieser commented Mar 16, 2018

Updated with:

  • A helper function to enable a list comprehension
  • Flake8 ignore
  • Much shorter lines, so alignment is no longer needed for readability

@eric-wieser eric-wieser force-pushed the more-normals branch 2 times, most recently from 1a734d8 to bbc417e on March 16, 2018 at 23:56
@@ -1684,65 +1684,69 @@ def plot_surface(self, X, Y, Z, *args, **kwargs):
if shade and cmap is not None and fcolors is not None:
fcolors = self._shade_colors_lightsource(Z, cmap, lightsource)

def boundary_edge(arr):
"""
Get the boundary elements of the rectangle ``arr``,
Contributor Author

Happy to rename / extract this function if someone else can suggest a name / location.

Member

heh, I might call it "nibble()", because it nibbles around the edge of the 2D array. At the very least, I would expand the explanation a bit to get that point across. It took me a few minutes to realize that was what was done.

Contributor Author

Not a fan of nibble. How about one of

[array_](perimeter|edge|boundary|border)[_(elements|values)]

I think I like array_perimeter most.

Can you suggest a top-level place to put this?

Contributor Author

Moved to cbook, and added an example
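For readers following along, the helper boils down to something like this sketch (the real cbook._array_perimeter may differ in detail):

import numpy as np

def array_perimeter(arr):
    # walk the boundary of a 2D array once, each corner included exactly once
    return np.concatenate([
        arr[0, :-1],     # top edge, left to right, minus the last corner
        arr[:-1, -1],    # right edge, top to bottom, minus the last corner
        arr[-1, :0:-1],  # bottom edge, right to left, minus the last corner
        arr[:0:-1, 0],   # left edge, bottom to top, minus the last corner
    ])

print(array_perimeter(np.arange(9).reshape(3, 3)))  # [0 1 2 5 8 7 6 3]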

Member

@WeatherGod WeatherGod left a comment

This code is cleaner than before, which is definitely an improvement. Might be nice to figure out if there is a way to keep the arrays from getting ragged, but I don't think that is possible by the nature of the problem.

There still remains a little bit more to do here, particularly fixing the py3k issue.


# evenly spaced, and including both endpoints
row_inds = list(xrange(0, rows-1, rstride)) + [rows-1]
col_inds = list(xrange(0, cols-1, cstride)) + [cols-1]
Member

This isn't py3k compatible; xrange doesn't exist, right?

Member

Maybe np.array_split() would be cleaner? Don't know when it was added to numpy, but I do see it in v1.12.

Contributor Author

array_split produces non-overlapping chunks, but an interesting idea.

I assume that there was xrange = range or something, but I guess not. I'll switch to range
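On the np.array_split suggestion: the chunks it returns don't share their boundary rows/columns, which is what the cells here need. For example:

import numpy as np

print(np.array_split(np.arange(7), 3))
# [array([0, 1, 2]), array([3, 4]), array([5, 6])] -- no shared endpoints between chunks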

Contributor Author

It looks like this came in after rebasing on #10525. Does matplotlib no longer support Python 2?

Member

Not on master, it doesn't.

Contributor Author

Am I right to target this against master?

Member

Yes, we can always backport if necessary (I don't think it is).


@eric-wieser
Contributor Author

@WeatherGod: Updated, tests now all pass

@eric-wieser
Contributor Author

@WeatherGod: Anything else needed here?

@@ -2027,3 +2066,4 @@ def _setattr_cm(obj, **kwargs):
delattr(obj, attr)
else:
setattr(obj, attr, orig)

Member

Looks like a PEP8 check is not liking the blank line here? I thought we were supposed to have a blank line at the end of files?

Member

A new line, not a full blank line.

Member

Ah, I couldn't tell that with the GitHub diffs. I like how git diff shows you those trailing blank lines.

Contributor Author

@eric-wieser eric-wieser May 2, 2018

Bad merge I guess - will fix

i1, i2, i3 = 0, len(ps)//3, 2*len(ps)//3
v1[poly_i, :] = ps[i1, :] - ps[i2, :]
v2[poly_i, :] = ps[i2, :] - ps[i3, :]
normals = np.cross(v1, v2)
Member

We need an else: normals = [] so that "normals" is always defined for the code blocks ahead.

Contributor Author

I don't think this happens in any real cases, but will fix instead of trying to work out which cases.

Better would be a local function that's only called when the result is needed.
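A sketch of that local-function idea (variable names are placeholders standing in for plot_surface's internals, not the merged code):

import numpy as np

polys = [np.array([(0, 0, 0), (1, 0, 0), (1, 1, 1), (0, 1, 1)], float)]  # toy data
shade = True

def compute_normals(polys):
    v1 = np.empty((len(polys), 3))
    v2 = np.empty((len(polys), 3))
    for k, ps in enumerate(polys):
        i1, i2, i3 = 0, len(ps) // 3, 2 * len(ps) // 3
        v1[k] = ps[i1] - ps[i2]
        v2[k] = ps[i2] - ps[i3]
    return np.cross(v1, v2)

# only computed when shading needs it, and never left undefined
normals = compute_normals(polys) if shade else None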

Contributor Author

Setting to [] is an (existing) bug too, which I guess caused normals to be ignored when face colors were specified.

@WeatherGod
Member

Besides those two points, I think this is ready to go. It is certainly easier to understand than before.

Previously:
* "cell" perimeters were clumsily calculated with duplicates, which were then (badly) removed at runtime. As a result, every quadrilateral was drawn with 5 vertices!
* code to calculate normals was spread into multiple places
* average z was calculated even if not used
* normals were sometimes not calculated even when needed
* repeated conversion between stride and count was done

This will affect shading of plots very slightly, hence the image tests changing in this commit.

Adds a `cbook._array_perimeter` function for use here.
@eric-wieser
Contributor Author

@WeatherGod: All fixed - thanks for catching that normals bug

@eric-wieser
Contributor Author

eric-wieser commented May 12, 2018

@WeatherGod: Starting to undergo bitrot here. I've merged and solved conflicts again, but I'd rather not keep having to


def get_normals(polygons):
Member

Just noticed the existing _generate_normals() further down. We should make sure we don't have such duplication of near-similar things without at least justifying the duplication. Or figure out how to get rid of the duplicate logic.

Contributor Author

I'd argue that that's out of scope for this PR - I've not introduced that duplication, I just made it clearer that it exists (which makes it easier for someone else to remove later). Of course, that someone else might be me, but I'd rather do it in a separate unit of work.

Member

fair enough

Contributor Author

In particular, deciding which version to keep (assemble then cross vs cross then assemble) would require some profiling

Contributor Author

This is addressed in #12136

@WeatherGod
Member

It is amazing how much cleaner this code has gotten over the revisions!

@eric-wieser
Contributor Author

Once this goes in, I'll pick back up #9990

@eric-wieser
Contributor Author

@QuLogic, @anntzer, is there anything else you'd like to see here (that wouldn't fit better in a future PR)?

@anntzer
Contributor

anntzer commented Jun 19, 2018

(I haven't actually followed the logic of the code, so I'll leave the review to others for now.)

@tacaswell tacaswell added this to the v3.0 milestone Jun 19, 2018
Member

@jklymak jklymak left a comment

Approving largely based on @WeatherGod's approval and the fact that it passes the tests and doesn't introduce any obvious bugs.

@jklymak jklymak merged commit 2887c49 into matplotlib:master Jul 9, 2018
@eric-wieser
Contributor Author

Thanks! I'll get back to #9990 in the next few weeks

@eric-wieser eric-wieser deleted the more-normals branch March 23, 2019 18:06