I have run into a problem here that I can't seem to solve by myself, so any help is much appreciated.
I would like to do a simple line plot of a huge dataset as a quick overview for comparing the success of different measurement scenarios, but it seems that not every data point is displayed. I played around with the lod parameter, both when creating the axes and in the plot command, but timing the plot command and the display itself shows no difference. Here are a few lines of code that reproduce the problem:
import time
import matplotlib.pyplot as plt
import numpy as np

xData = np.linspace(0.0, 10.0, int(1e6))
yData = np.zeros_like(xData)
xDataDetail = np.linspace(0.0, 2*np.pi, 1000)
yData[500000:501000] = np.sin(xDataDetail)
fig, axes = plt.subplots()
tic = time.time()
axes.plot(xData, yData, "b-")
toc = time.time()
print("Plotting took %g s." % (toc - tic))
plt.show()
The code shows how I usually use matplotlib: it creates a simple dataset of one million zeros containing a short non-trivial peak, which is then plotted as a solid blue line.
You can see what happens when you vary the width of the display window. On my system, the minimum amplitude of the peak usually changes as the window is resized.
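For reference, the resize experiment can be scripted instead of dragging the window border by hand. A minimal sketch (using the Agg backend here only so it runs headless; the sine-peak dataset matches the code above):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend just for this sketch
import matplotlib.pyplot as plt
import numpy as np

xData = np.linspace(0.0, 10.0, int(1e6))
yData = np.zeros_like(xData)
yData[500000:501000] = np.sin(np.linspace(0.0, 2*np.pi, 1000))

fig, ax = plt.subplots()
line, = ax.plot(xData, yData, "b-")

# Redraw at several widths to mimic dragging the window border;
# the stored data is untouched, only the on-screen rendering changes.
for width in (4.0, 8.0, 12.0):
    fig.set_size_inches(width, 4.0)
    fig.canvas.draw()

print(len(line.get_xdata()))  # → 1000000 (the full dataset is still there)
```

This at least confirms that the Line2D object keeps all one million points, so whatever is lost must be lost during rendering.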
Is there any way to force matplotlib to plot each and every point?
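For what it's worth, matplotlib's docs mention a path.simplify rcParam that apparently controls whether vertices may be dropped at render time. A minimal sketch of disabling it (assuming the setting behaves as documented):

```python
import matplotlib
matplotlib.use("Agg")  # any backend works; Agg here so the sketch runs headless
# path.simplify controls whether matplotlib may drop vertices when
# rendering lines; False should hand every point to the backend,
# at the cost of slower redraws for large datasets.
matplotlib.rcParams['path.simplify'] = False

import matplotlib.pyplot as plt
import numpy as np

xData = np.linspace(0.0, 10.0, int(1e6))
yData = np.zeros_like(xData)
yData[500000:501000] = np.sin(np.linspace(0.0, 2*np.pi, 1000))

plt.plot(xData, yData, "b-")
```

I don't know whether this is the intended knob for this situation, so corrections are welcome.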
I use matplotlib version 1.0.0 on a 32-bit Windows XP system, installed via the Windows installer from SourceForge.
A quick check on an openSUSE 11.3 Linux box showed the same issue, and using the "standard" Tk backend instead of Qt4Agg behaves just the same.