Description
Problem
Currently `BoundaryNorm` requires `ncolors` as an argument, and hence the user needs to know the colormap size to use it.
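For concreteness, here is a minimal sketch of the current API (the colormap and boundary values are arbitrary examples):

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import BoundaryNorm

cmap = plt.get_cmap("viridis")   # the user has to know this colormap's size
boundaries = [0, 1, 2, 4, 8]

# ncolors must be passed explicitly; if it disagrees with the colormap that
# is actually used, the resulting colors are silently wrong.
norm = BoundaryNorm(boundaries, ncolors=cmap.N)
idx = norm(np.array([0.5, 1.5, 3.0, 6.0]))  # integer indices into the colormap
print(idx)
```

Because there are fewer bins than colors here, the indices are spread across the full `[0, ncolors - 1]` range per the docstring note quoted below.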
Then we have this comment in the code:
# Mapping to the 0-1 interval could have been done via piece-wise linear
# interpolation, but using integers seems simpler, and reduces the number
# of conversions back and forth between int and float.
and this "note" in the docstring:
"""
If there are fewer bins (including extensions) than colors, then the
color index is chosen by linearly interpolating the ``[0, nbins - 1]``
range onto the ``[0, ncolors - 1]`` range, effectively skipping some
colors in the middle of the colormap.
"""
If we are going to interpolate anyway, we should just map to the 0-1 interval from the `len(boundaries)` boundaries, like all the other norms. Then the user doesn't need to specify `ncolors` at all. In fact, if the user misspecifies the length of the colormap, the mapping will mysteriously be incorrect. (Yes, there is a fast path when `len(boundaries) - 1 == ncolors`, but I really don't think `np.digitize` is one of our performance chokepoints.)
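A hypothetical sketch of such a 0-1 mapping, built on `np.digitize` (the function name and exact scaling are assumptions for illustration, not part of the actual proposal):

```python
import numpy as np

def boundary_norm_01(values, boundaries):
    """Hypothetical: map values into [0, 1] using only len(boundaries),
    with no ncolors argument (assumed sketch, not matplotlib code)."""
    boundaries = np.asarray(boundaries, dtype=float)
    nbins = len(boundaries) - 1
    # np.digitize returns 1..nbins for in-range values; shift to 0-based bins
    idx = np.clip(np.digitize(values, boundaries) - 1, 0, nbins - 1)
    # scale the bin index onto [0, 1], as the other norms do
    return idx / (nbins - 1)

out = boundary_norm_01([0.5, 1.5, 3.0, 6.0], [0, 1, 2, 4, 8])
```

The colormap size never appears: scaling to a particular `ncolors` would happen later, inside the colormap lookup, as it does for every other norm.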
Proposed solution
The problem with changing this is tracking down the special cases, and possible breakage in downstream libraries that do something with the norm. The other problem is somehow deprecating the `ncolors` argument.