I was using matplotlib to plot 2d arrays (or, equivalently, images), having
the user choose some points, getting the x and y coordinates of those
points, and then doing calculations. Keeping track of indices proved to be
a source of frustration.
This is because, given a 2d array, two different interpretations are
possible. In the first, the first index is the (horizontal) x-axis and
the second index is the (vertical) y-axis. In the second, the first
index is the row (vertical axis) and the second is the column
(horizontal axis).
Matplotlib has always leaned towards the second interpretation, perhaps
because it was modelled on MATLAB, where everything is a 2d matrix (or
higher-dimensional). But in numpy, where 1d arrays are first-class
objects, this can get confusing.
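To make the two readings concrete, here is a minimal illustration (the
labels "x first" and "row, column" are my own names for the conventions):

```python
import numpy as np

# A small array with 2 rows and 3 columns.
a = np.array([[1, 2, 3],
              [4, 5, 6]])

# Interpretation 1 ("x first"):    a[i, j] -> i is horizontal, j vertical.
# Interpretation 2 ("row, column"): a[i, j] -> i is the row, j the column.
# numpy's native indexing is the second: a[0, 2] is row 0, column 2.
print(a[0, 2])   # 3
print(a.shape)   # (2, 3), i.e. (n_rows, n_cols) -- (y, x) in plotting terms
```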
For example, if I were acquiring data as a function of two variables which
go on the x-axis and y-axis:
for x in x_vals:
    for y in y_vals:
        result[x, y] = f(x, y)
either I would have to call contourf(x_vals, y_vals, result.T) or write
result[y, x] = f(x, y) inside the loop. Most programmers are probably used
to this, but it's easy to slip up in a large program.
The situation becomes even more confusing with imshow and imread, because
the origin is placed at the upper-left corner by default. I could pass
origin='lower' to avoid this, but that flips the image vertically as well.
So the upshot is that if I load an image with imread(..) and want to
display it in the same orientation as imshow with origin='upper' would,
but with the origin placed at the lower-left corner, I have to go through
various rot90 and flipud/fliplr operations. (As an aside, IMHO it is
imread that should have the origin option, not imshow.)
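The workaround can be shown without a real image file; the array below
stands in for what imread would return (row 0 is the top scanline):

```python
import numpy as np

# Stand-in for imread output: row 0 is the top scanline of the picture.
img = np.arange(12).reshape(3, 4)

# imshow(img) draws row 0 at the top (origin='upper', the default).
# imshow(img, origin='lower') draws row 0 at the bottom, which flips the
# picture vertically. To keep the visual orientation of origin='upper'
# while putting (0, 0) at the lower-left, flip the data instead and then
# display with origin='lower':
img_flipped = np.flipud(img)

print(img[0])           # top scanline of the original
print(img_flipped[-1])  # the same scanline, now stored as the last row
```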
It would be nice to have a specialized 2d array class (possibly
subclassing ndarray) carrying additional information about how the array
is to be interpreted, together with routines to convert between the two
conventions.
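Something along these lines, perhaps. This is only a hypothetical sketch
(the class name, the 'indexing' attribute, and the 'xy'/'rc' labels are
all my own inventions), following numpy's documented subclassing pattern:

```python
import numpy as np

class OrientedArray(np.ndarray):
    """Hypothetical 2d array tagged with its interpretation.

    indexing='xy': a[i, j] means (x index, y index)
    indexing='rc': a[i, j] means (row, column) -- numpy's native view
    """

    def __new__(cls, input_array, indexing='rc'):
        obj = np.asarray(input_array).view(cls)
        obj.indexing = indexing
        return obj

    def __array_finalize__(self, obj):
        # Propagate the tag through views and slices.
        if obj is None:
            return
        self.indexing = getattr(obj, 'indexing', 'rc')

    def as_rc(self):
        """Return the array in row/column order."""
        if self.indexing == 'rc':
            return self
        return OrientedArray(self.T, indexing='rc')

    def as_xy(self):
        """Return the array in x/y order."""
        if self.indexing == 'xy':
            return self
        return OrientedArray(self.T, indexing='xy')

# 4 x-samples by 5 y-samples, stored x-first:
a = OrientedArray(np.zeros((4, 5)), indexing='xy')
print(a.as_rc().shape)  # (5, 4): rows now correspond to y
```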
I would like to hear other opinions on whether this is desirable/feasible.