performance?

    > I have a general performance question - is matplotlib

    > expected to deal well with large plots when you are zoomed
    > into a small region? That is, if I have 100K points in a
    > graph but have set my limits so that only 100 of them are
    > visible at one time, should that be comparable to creating
    > a plot with only the visible items and re-creating it each
    > time the limits are moved? (Assuming the X axis is
    > monotonic and I do the obvious type of filtering in
    > Python.) It seems to be a bit sluggish...

There is a data filtering mode already built in, designed for exactly
the case you suggest, where x is monotonic. To use it, set the
property 'data_clipping=True', as in

  plot(x, y, data_clipping=True)
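
For a fuller, self-contained sketch of that call (the pylab-style
imports and the sine-wave data are my assumptions, not your test
case; adjust to your install):

    # ~100K points with monotonic x; data_clipping asks matplotlib to
    # clip the data to the view limits before drawing
    from pylab import plot, xlim, show
    from numpy import arange, sin, pi

    x = arange(0.0, 1000.0, 0.01)     # 100K monotonic x values
    y = sin(2*pi*x)

    plot(x, y, data_clipping=True)
    xlim(500.0, 500.5)                # zoom into a narrow window
    show()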

However, I tried it on your test case and it doesn't help much. The
reason is that the Numeric clip operation is itself too time
consuming. That said, I think I know a good fix. In the current
implementation, the data is clipped to the view limits, and every
time the view limits are moved the clip has to be performed again.
It would be better to clip to a window wider than the view limits,
for example

    vmin, vmax = ax.get_xlim()    # current view limits (min, max)
    interval = vmax - vmin
    clipmin = vmin - interval     # pad by one view width on each side
    clipmax = vmax + interval

and only update the clip limits when a new vmin/vmax exceeds them.
Then you would only have to perform the clip occasionally, which
would be a big win if you have a large x range and have zoomed into a
narrow part of the interval.
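
Here is a minimal sketch of that scheme (the LazyClipper class and
its names are hypothetical, not anything in matplotlib; it assumes x
is monotonic and uses searchsorted for the range lookup):

    import numpy as np

    class LazyClipper:
        # Keep a clip window one view-width wider than the limits on
        # each side; re-clip only when the view escapes that window.
        # Hypothetical helper, not matplotlib API; assumes monotonic x.
        def __init__(self, x, y):
            self.x, self.y = x, y
            self.clipmin = self.clipmax = None

        def clipped(self, vmin, vmax):
            if (self.clipmin is None or
                    vmin < self.clipmin or vmax > self.clipmax):
                interval = vmax - vmin
                self.clipmin = vmin - interval
                self.clipmax = vmax + interval
                # O(log n) lookups because x is monotonic
                lo = np.searchsorted(self.x, self.clipmin)
                hi = np.searchsorted(self.x, self.clipmax)
                self.xc, self.yc = self.x[lo:hi], self.y[lo:hi]
            return self.xc, self.yc

On every pan or zoom you would call clipped(*ax.get_xlim()) and hand
only the returned slice to the renderer; panning within the padded
window then costs a couple of comparisons and no copying.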

I'll take a stab at it.

JDH