[Sprint] Matlab fplot #1143
Conversation
@dmcdougall: We want to achieve a set of points that 'look nice', and to sample the function more where its behavior is more complicated. First of all, 'looking nice' means that all the data will be rescaled to fit into the plot, which automatically means that an absolute tolerance is not very useful. Instead, both the x- and y-coordinates should be scaled to fit a 1x1 square. The scaling should be recalculated as new data points are added. If we expect singularities, we may limit the total scale to, say, a hundred times the initial scale, and skip plotting any points that do not fit into that maximal scale, since these are due to a singularity. Another thing, which I consider important, is that rather than gradients we should look at angles: to a human, a sequence of points looks smooth when the angle between consecutive segments is small. I think the preferred data structure for this is a list rather than an array, since it is much easier to insert entries into a list. EDIT: It turns out that list item insertion is O(N), and so an ordered set (http://code.activestate.com/recipes/576696/) is a better data structure.
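The refinement criterion sketched above — rescale points to a unit square, then bisect wherever the polyline turns sharply — could look roughly like this. This is a hedged sketch, not the PR's implementation: the function name, thresholds, and use of a plain list (instead of the ordered set mentioned in the EDIT) are all illustrative.

```python
import numpy as np

def refine_by_angle(f, xs, max_angle=0.1, max_points=500):
    """Insert midpoints where the plotted polyline turns sharply.

    Points are rescaled to the unit square before measuring angles, so
    the criterion matches what a viewer sees, not raw gradients.
    Illustrative sketch only; names and defaults are assumptions.
    """
    xs = sorted(xs)
    ys = [f(x) for x in xs]
    while len(xs) < max_points:
        xa, ya = np.array(xs, float), np.array(ys, float)
        # Rescale both axes to [0, 1] so angles are scale-invariant.
        sx = (xa - xa.min()) / (np.ptp(xa) or 1.0)
        sy = (ya - ya.min()) / (np.ptp(ya) or 1.0)
        # Turning angle at each interior point, wrapped to [0, pi].
        v = np.diff(np.column_stack([sx, sy]), axis=0)
        theta = np.arctan2(v[:, 1], v[:, 0])
        d = np.diff(theta)
        turn = np.abs((d + np.pi) % (2 * np.pi) - np.pi)
        sharp = np.nonzero(turn > max_angle)[0]
        if len(sharp) == 0:
            break
        # Bisect the two segments adjacent to each sharp corner.
        new_xs = set()
        for i in sharp:
            new_xs.add(0.5 * (xs[i] + xs[i + 1]))
            new_xs.add(0.5 * (xs[i + 1] + xs[i + 2]))
        for x in sorted(new_xs):
            j = int(np.searchsorted(xs, x))
            xs.insert(j, x)
            ys.insert(j, f(x))
    return xs, ys
```

A linear function terminates immediately (all turning angles are zero), while a kinked function like `abs(x - 0.5)` keeps accumulating points around the corner until the budget is exhausted — exactly the "sample more where behavior is complicated" goal.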
@akhmerov Thanks for your suggestions; I'm really glad someone a little more experienced in this area has looked at the algorithm. Firstly, rescaling to fit into the unit square is a good option, though I have accounted for this when taking the derivative: the minimal step size you should use grows as the abscissae grow. This is a numerical consideration, and I think both of us are solving the same problem with two different methods. I take your point, though. Secondly, yes, absolute error is not ideal. As you point out, relative error would be better, but I seem to recall (when I started this a while ago) that I had some problems with relative error; I'll look into it again. Thirdly, your point on angles rather than derivatives is, I think, a great idea. Derivatives look different at different scales. However, angles do not capture the regularity of a function, so we may 'miss' certain more ill-behaved aspects of the function we wish to plot. This raises the question of whether angles alone suffice. Lastly, regarding the data structure, checking only the previously modified abscissae would boost performance, I agree. Out of interest, in which version of Python were sets introduced? Matplotlib supports Python 2.6 and upwards, so we'd have to be careful here... Thanks again for your comments. It's people like you who test change-sets and give advice that make software better. I appreciate you taking the time to do that.
I don't think that taking the derivative is sufficient. It does transform everything to the dimension (y_max - y_min)/(x_max - x_min), but that scale isn't bounded by any value. Consider the exponential function as an example, plotted on the interval 0 to 20. Your algorithm will encounter derivatives larger than 1e6 on each segment involving coordinates larger than about 13, and will consider each of these intervals singular. At the same time, if you were rescaling, there would be nothing ill-behaved.
If you want to sample the function more where it changes rapidly, you can also set a threshold on the rescaled length of each interval.
Sets were introduced in their own module in Python 2.3, and migrated to built-ins in 2.4.
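The exponential example above is easy to check numerically: raw segment slopes of exp(x) on [0, 20] blow up past 1e6 near the right edge, while after rescaling both axes to [0, 1] the steepest slope stays modest. A small sketch (grid spacing and variable names are illustrative):

```python
import numpy as np

# exp(x) sampled on [0, 20] with a uniform grid.
x = np.linspace(0, 20, 41)
y = np.exp(x)

# Raw finite-difference slopes: huge near x = 20, so a derivative-based
# criterion would flag those segments as "singular".
raw_slopes = np.diff(y) / np.diff(x)

# After rescaling both axes to the unit square, slopes are bounded:
# there is nothing ill-behaved about exp(x) once it is rescaled.
sx = (x - x.min()) / np.ptp(x)
sy = (y - y.min()) / np.ptp(y)
scaled_slopes = np.diff(sy) / np.diff(sx)
```

The contrast between `raw_slopes.max()` and `scaled_slopes.max()` is the whole argument for working in rescaled coordinates.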
import numpy as np
import matplotlib


class FPlot:
Should subclass object.
Nudge.
On Sun, Aug 26, 2012 at 08:35:34AM -0700, Anton Akhmerov wrote:
Ah ok, good point. Continuous functions don't necessarily have a bounded derivative. Another question: let's say I scale the x-coordinates to be between 0 and 1...
Thanks for checking this! Damon McDougall
@dmcdougall: As I wrote earlier, one way to deal with unbounded functions, in my opinion, is to limit the y-scale to, e.g., 100 times the initially sampled y-scale. One can do cleverer things, like choosing limits so that the plotted information is maximized. That, however, would be more generally useful for determining the plotting limits of any plot, and it would require understanding what a good definition of the amount of information read out from a plot is. Anton
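The simpler of the two ideas above — drop points whose y value falls outside some multiple of the initially sampled y-range, attributing them to a singularity — can be sketched in a few lines. The function name, the centering choice, and the default factor of 100 are all illustrative, not part of the PR:

```python
import numpy as np

def clip_to_scale(x, y, y0_min, y0_max, factor=100.0):
    """Keep only points whose y value lies within `factor` times the
    initially sampled y-range; points outside it are treated as
    belonging to a singularity and not plotted.  Illustrative sketch.
    """
    center = 0.5 * (y0_min + y0_max)
    half = 0.5 * (y0_max - y0_min) * factor
    keep = np.abs(np.asarray(y, float) - center) <= half
    return np.asarray(x, float)[keep], np.asarray(y, float)[keep]
```

For example, with an initial y-range of [0, 2], a later sample of 1e6 (say, near a pole of tan) would be discarded rather than allowed to flatten the rest of the plot.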
Needs a little polish, some tests, and a fanfare (documentation), but other than that I'm behind this -- nice feature!
@pelson I think @akhmerov's suggestions here are too good to pass up. We have two options for pushing this forward: (a) merge this now (with polish, tests, and docs) and implement @akhmerov's suggestions incrementally after the fact; or (b) I work @akhmerov's suggestions directly into this PR and have the whole lot merged. I think I prefer (b), since we'll be merging higher-quality code. I'd rather have it work right, and to an extremely high quality, from the get-go. I think 1.3.x is still a reasonable target in either case.
I'm fine with it either way -- the merge-and-improve approach is equally valid in my eyes, but whichever you're most comfortable with is great (it's your feature, after all!).
@dmcdougall: Go or no go for v1.3? There is always v1.4... 😄
@dmcdougall: I'm going to move the milestone on this to 1.4.x, since that's what I'm gathering from the discussion. Looking forward to this feature!
@dmcdougall As you're wanting to get some more development on this PR done, I suggest we close the PR until it's ready for more attention/review. My only concern with doing that is that it would be easy to forget about it (which I do not want to do). Perhaps we should keep a wiki of longer-term activities and a feature wishlist, which can be converted to PRs when somebody gets around to doing them...
In the 1.3 cycle we've had a number of issues and PRs that were tagged as 1.3.x for things that were "punted" from 1.2, and this did a relatively good job of helping those things get done. If we keep milestones for things that are almost ready for a release, or punted from a previous release, and leave unmilestoned issues for "ongoing work on master", I think it helps the important things rise to the surface. I don't know if adding another index will actually make things any easier w.r.t. finding things. What I don't think we should do is close it -- I think marking it as 1.4.x is the best way to remind ourselves to get it done. But we can take that discussion elsewhere.
Yeah, I'd like to leave this open; PRs that are still open are easier for me to follow. I'm really sorry this has festered. I'm hoping to find time, perhaps during the sprint this week, to hit this out of the park. @akhmerov's suggestions shouldn't be too hard to implement -- I just need to sit down and do them. Thanks for the ping. Will get down to it this week.
@pelson Can you play with this interactively? I think your idea of getting this merged and making incremental improvements on top is probably better. A separate issue can be opened for the improvements.
Just to be consistent, the demo should probably call …
@@ -0,0 +1,22 @@
import numpy as np
import matplotlib
matplotlib.use('agg')
Seems a little unnecessary.
Also -- maybe the example could use a cooler-looking function? tan(x) is boring 😦
This is now awesome. I decided to keep the number of points used to partition the domain fixed, and it's much smoother. I think I can still do the adaptive refinement, but I'll do it with a fixed number of x-coordinates. That way the window doesn't become over-resolved and sluggish. Edit: Sprinty McSprintington
1D Callable function plotting.
"""

import sys
nitpick: `sys` isn't used
Good work on this PR! I didn't have time to test it, but it looks pretty cool. I've noticed that you created a new module …
Plotting functionality for Python callables, à la Matlab's fplot function. A fixed step size is used, but it is recalculated to fit the viewport on zooming and panning. This has the effect of 'discovering' more of the function as the user zooms in on the plot. Only one callable is supported so far; it would be desirable to support an array/list of callables and plot them all simultaneously.
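The behavior described above — sample the callable at a fixed number of points, then re-evaluate it on the new viewport whenever the x-limits change — could be sketched as follows. This is a minimal illustration of the idea, not the API this PR adds; the class and parameter names are assumptions, and the Agg backend is forced only so the sketch runs headless:

```python
import numpy as np
import matplotlib
matplotlib.use('agg')  # non-interactive backend, just for this sketch
import matplotlib.pyplot as plt

class FPlot(object):
    """Plot a callable, resampling at a fixed number of points
    whenever the axes' x-limits change (zoom/pan)."""

    def __init__(self, ax, f, xmin, xmax, npoints=200):
        self.ax, self.f, self.n = ax, f, npoints
        x = np.linspace(xmin, xmax, npoints)
        self.line, = ax.plot(x, f(x))
        # 'xlim_changed' fires on zooming and panning, so the function
        # is "discovered" at full resolution over the new viewport.
        ax.callbacks.connect('xlim_changed', self._resample)

    def _resample(self, ax):
        xmin, xmax = ax.get_xlim()
        x = np.linspace(xmin, xmax, self.n)
        self.line.set_data(x, self.f(x))

fig, ax = plt.subplots()
fp = FPlot(ax, np.sin, 0, 10)
ax.set_xlim(2, 4)  # triggers a resample of sin over [2, 4]
```

Because the point count is fixed per viewport, zooming never accumulates stale samples, which matches the "window doesn't become over-resolved and sluggish" remark earlier in the thread.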
@NelleV Should the private-module change be done here, or in a new pull request that refactors all of them together? I'm thinking I'd like to keep those separate, but there's certainly an argument to be made otherwise.
I think as long as we do it before releasing 1.4, it should be fine to …
I'm going to add another demo to this so we aren't stuck with tan(x).
It seems like, in an ideal world, we'd want to base the number of points used on the number of pixels in the axes, and then …
Actually, for most functions that would be overkill: the highest resolution should be based on that, while for simple functions far fewer points should be needed.
@akhmerov: Can you elaborate? My motivation here is to avoid a plot that looks fine on-screen showing visible segments when printed to a higher-resolution device.
Actually, my longer-term goal is to utilise #2189 and use …
And by 'event', I mean the …
The number of points is not really a valuable characteristic of a plot. If the function is linear, you may be fine with just two points. If you have a certain maximal resolution in mind, defined by the device, the number of points required to render a curve smoothly on that device will depend very strongly on the curve.
The renderer includes a simplification algorithm that removes points which are collinear with their neighbors, and it would, in fact, reduce a straight line to two points. The issue at hand is to supply enough points that the algorithm can be left to do its job. (Apologies -- I was extremely implicit about that.)
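The collinearity-based simplification mentioned above (the real one lives in matplotlib's path-rendering code) boils down to a cross-product test: if the two segments meeting at an interior point have zero cross product, the point adds no visual information and can be dropped. A toy version, with the function name and tolerance as illustrative assumptions:

```python
def simplify(points, tol=1e-9):
    """Drop interior points that are (nearly) collinear with their
    neighbors -- a toy stand-in for the renderer's path
    simplification.  A straight line collapses to its two endpoints.
    """
    pts = [points[0]]
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = pts[-1], points[i], points[i + 1]
        # Cross product of the vectors (p0->p1) and (p0->p2);
        # zero means the three points are collinear.
        cross = (x1 - x0) * (y2 - y0) - (y1 - y0) * (x2 - x0)
        if abs(cross) > tol:
            pts.append(points[i])
    pts.append(points[-1])
    return pts
```

This is why "enough points" is the right goal for fplot: oversampling a smooth stretch costs little once simplification has pruned the redundant vertices.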
One should still keep in mind that in many cases evaluating the function itself may be expensive. Cheap functions usually don't require adaptiveness: there you can just specify a lot of points from the start.
This PR is a year old, so I'm closing it. Please feel free to update and reinvigorate it and it can be re-opened. Thanks.
Addressing #330.
Right, so I have tried this once before (#737), but the consensus seemed to be that I should encapsulate it in a class to allow some kind of 'zooming' effect, like in the Mandelbrot example.
This code is not complete, needs improving, and there are probably better ways of doing the things I have done. I'm making this a PR so people can chime in with feedback and the like.