After some successful attempts at migrating routines from several plotting
and/or analysis packages (pgplot, Midas, Iraf, etc.) to Matplotlib,
I have now hit a big wall: the speed at which it can display a reasonably large image.
- I have a ''local'' library to read FITS files [2D images] in Python (SWIG-wrapped).
It converts these files into float (numerix) arrays. I was then testing the ability
of matplotlib to display these 2D arrays.
Until now everything was fine, since I had only tested matplotlib on very small images (50x50 pixels).
Yesterday I tried with a 1600x1600 pixel image. NOTE: this is a very reasonable
(and typical) size in my work, and I expect much, much bigger ones in the near future
(up to 20 000 x 20 000).
==> Matplotlib takes ~20 seconds to display it!
(the window opens after 12 seconds, and it then takes
another 8 seconds for the image to appear)
(compared to less than 0.2 s for Midas, Iraf and others, i.e. a factor of at least 100,
and less than a second using the ppgplot routines).
Running on a 1.6 GHz / 512 MB RAM Centrino laptop, Linux, backend TkAgg, numarray, float array of 1600x1600 pixels.
Using either imshow or figimage in ''ipython -pylab'' (I tried different interpolation schemes;
the one I generally want is ''nearest'').
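For reference, the timing test can be reproduced without my FITS reader by timing imshow on a synthetic 1600x1600 float array. This is only a sketch: it uses numpy and the non-interactive Agg backend as stand-ins for my numarray/TkAgg setup, and a random array in place of a real image; the draw() call is what forces the actual rendering.

```python
import time
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless stand-in for TkAgg, so this runs anywhere
import matplotlib.pyplot as plt

# Synthetic stand-in for a 1600x1600 float FITS image.
img = np.random.random((1600, 1600)).astype(np.float64)

fig, ax = plt.subplots()
t0 = time.time()
ax.imshow(img, interpolation="nearest", cmap="gray")
fig.canvas.draw()  # force the actual rasterisation
elapsed = time.time() - t0
print("display time: %.2f s" % elapsed)
```

The wall-clock numbers will of course differ between Agg and an interactive TkAgg window, but the array size and interpolation mode match my test.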
To be frank, this is a killer for me, since I need to work on such images
(signal processing, analysis) and display them repeatedly, each time changing the
levels, centring, etc. There is no way I will wait a full minute for 3 successive displays...
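To make the workflow concrete, here is a sketch of the kind of interaction I mean, written with calls I believe matplotlib already provides (set_clim to change the display levels and set_data to swap in a recentred cutout, both on the image object returned by imshow; the array and cutout coordinates are made up for illustration):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the sketch runs anywhere
import matplotlib.pyplot as plt

img = np.random.random((1600, 1600))  # stand-in for a real FITS image

fig, ax = plt.subplots()
im = ax.imshow(img, interpolation="nearest", cmap="gray")

# Change the display levels without reloading the data...
im.set_clim(vmin=0.2, vmax=0.8)
# ...and recentre on a 400x400 cutout around some feature.
im.set_data(img[600:1000, 600:1000])
fig.canvas.draw()  # every such step pays the display cost again
```

Each of those redraws is where the ~20 s display time hurts: three successive displays at that speed is the minute of waiting mentioned above.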
So the questions now:
- am I doing something wrong (backend, way of loading the array, ...), or is this
intrinsic to matplotlib?
- is there a way out? (basically the same question...)
If there is no way out, I may have to (sigh) abandon matplotlib for displaying images.
But if there is one, please let me know!