Hi there!
I visualize experimental data in quasi-realtime while it is being acquired.
The x-axis is fixed right from the beginning; the y-axis, however,
must be autoscaled to the incoming data.
So far, I do the following on Windows:
#--------------------------------------------------
from pylab import *   # ion, arange, plot, gcf, draw
import time

ion()
NaN = 0.0
x_values = arange(0, 2 * zyklen, 1.0 / number_of_values)
y_values = total_number_of_values * [NaN]
line, = plot(x_values, y_values)
for halfcycle in range(zyklen * 2):
    keithley.write("F0B2M2G0T2Q%dI%dX" % (milliseconds, number_of_values))
    keithley.trigger()
    keithley.wait_for_srq()
    # Read the measured values from the instrument
    voltages = keithley.read_floats()
    # Insert them into the set of data values
    y_values[halfcycle * number_of_values : (halfcycle + 1) * number_of_values] = \
        voltages
    line.set_ydata(y_values)  # update the plot
    gcf().autoscale_view()    # This has no effect
    draw()
    time.sleep(1)
#--------------------------------------------------
I have three questions:
1. On Linux, I can say NaN = float("NaN"), which suppresses plot
points in regions where no data exists yet. On Windows, I seem
to be forced to set the not-yet-measured interval to 0, which is
uglier. Is there a better way?
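For what it's worth, a minimal sketch of what I mean by a platform-independent NaN, assuming numpy is available (the array size here is made up for illustration; numpy.nan is a constant that should behave the same on Windows and Linux, unlike float("NaN")):

```python
import numpy

total_number_of_values = 8               # made-up size for illustration
NaN = numpy.nan                          # platform-independent NaN constant
y_values = total_number_of_values * [NaN]

# Every placeholder is NaN, so matplotlib would skip these points
print(numpy.isnan(y_values).all())
```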
2. The above loop is the last part of my program. The program
terminates with
Fatal Python error: PyEval_RestoreThread: NULL tstate
Apparently this is caused by my improper handling of matplotlib.
What is going wrong?
3. How can I achieve dynamic autoscaling of the y-axis? As
noted in the source, gcf().autoscale_view() has no effect.
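In case it clarifies what I expected to happen: a minimal standalone sketch of a single update step, with axes.relim() added on a guess that the data limits must be recomputed before autoscaling (the Agg backend and the x**2 data are just there so the sketch runs anywhere; my real program is interactive):

```python
import numpy
import matplotlib
matplotlib.use("Agg")                 # non-interactive backend, for illustration only
from matplotlib import pyplot

x = numpy.arange(10)
figure = pyplot.figure()
axes = figure.add_subplot(111)
line, = axes.plot(x, numpy.zeros(10))

line.set_ydata(x ** 2)                # simulated new measurements
axes.relim()                          # recompute data limits from the line's data
axes.autoscale_view()                 # rescale the axes to the new limits

# The upper y-limit now covers the new maximum of 81
print(axes.get_ylim()[1] >= 81)
```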
Thank you!
Bye,
Torsten.
--
Torsten Bronger, aquisgrana, europa vetus