event handling

I think now is a good time to start implementing some GUI-neutral
event handling, and I wanted to bounce some ideas off the list. The
idea is for matplotlib to define its own event handler in
matplotlib.events and have each of the GUIs trigger these events.
Users could register with the event handler using an observer pattern.
In user space, you could do something like the following and expect it
to work across Tk, Wx and GTK::

    import matplotlib.events as events

    def key_press(event):
        print event.key

    events.connect('key_press', key_press)

    def button_press(event):
        print event.button

    events.connect('button_press', button_press)

Each of the GUIs would call matplotlib.events.notify. The backends
would have to map the GUI constants, e.g. LEFT_BUTTON, to the
appropriate matplotlib.events constant and handle things like the y
inversion, since the matplotlib coordinate system has (0, 0) at the
lower left.
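
Just to make the mapping concrete, here is a rough sketch of what the
backend side might look like. None of this exists yet; the notify
signature, the MplEvent class and the button table below are purely
illustrative assumptions::

    import matplotlib.events as events   # the proposed module

    # Map the native toolkit button numbers to GUI-neutral names
    # (1/2/3 is what GTK and Tk report; purely illustrative).
    BUTTON_MAP = {1: 'LEFT', 2: 'MIDDLE', 3: 'RIGHT'}

    class MplEvent:
        "Minimal stand-in for whatever event class matplotlib.events defines"
        def __init__(self, name, x, y, button=None, key=None):
            self.name = name
            self.x, self.y = x, y
            self.button = button
            self.key = key

    def on_native_button_press(gui_event, canvas_height):
        # The toolkits put (0, 0) at the upper left; matplotlib wants the
        # lower left, so flip y using the canvas height.
        x = gui_event.x
        y = canvas_height - gui_event.y
        button = BUTTON_MAP.get(gui_event.button)
        events.notify('button_press',
                      MplEvent('button_press', x, y, button=button))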

This would also lay the groundwork for generalizing things like
examples/object_picker.py, writing generic measure tools, and so on.
You could write a GUI-neutral axes resize tool where the user clicks
on an axes and drags to resize it. So it would be useful both for
users who want to define some GUI interaction and for developers who
want to add features across backends without having to know all the
widget sets.

My questions are:

  * Does this look like a good framework?

  * Does someone want to implement the interface matplotlib.events?

  * Do the GUI maintainers for Tk (Todd), Wx (Jeremy) and GTK (Me)
    have time to do the mapping? If not can we get another volunteer?

  * What events do we want to define? The ones I use a lot are::

        key_press_event
        key_release_event
        motion_notify_event
        button_press_event
        button_release_event

    If these basic events are implemented at the backend level, we
    could catch them and trigger more complicated events like
    'drag_rectangle' in matplotlib.events. I.e., catch press, mouse
    move, and release, and then trigger drag_rectangle with the start
    and end locations in event.start and event.end (see the sketch
    after this list).

  * There is the issue of coords. x, y can be given in display, axes
    (0-1) or data coords. Probably best to give all three::

      event.xdata, event.ydata,
      event.xaxis, event.yaxis,
      event.xdisplay, event.ydisplay

    See examples/coords_demo_gtk.py for an example of connecting to a
    GTK event and mapping the display coords to data space. Jeremy
    and Todd, this would be a good example to implement as stage 1 for
    the event handling.
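
To tie the last two items together, here is the kind of thing I have
in mind for matplotlib.events itself: a simple observer registry, an
event object carrying all three coordinate systems, and a
'drag_rectangle' composite built from the basic events. All the names
and signatures below are just a sketch, not working code::

    _callbacks = {}   # event name -> list of observer functions

    def connect(name, func):
        "Register an observer for the named event"
        _callbacks.setdefault(name, []).append(func)

    def notify(name, event):
        "Called by the backends; fan the event out to the observers"
        for func in _callbacks.get(name, []):
            func(event)

    class Event:
        "Carries x, y in display, axes (0-1) and data coordinates"
        def __init__(self, name, xdisplay, ydisplay,
                     xaxis=None, yaxis=None, xdata=None, ydata=None,
                     button=None, key=None):
            self.name = name
            self.xdisplay, self.ydisplay = xdisplay, ydisplay
            self.xaxis, self.yaxis = xaxis, yaxis
            self.xdata, self.ydata = xdata, ydata
            self.button, self.key = button, key

    class _DragRectangle:
        "Synthesize 'drag_rectangle' from button press and release"
        def __init__(self):
            self.start = None
            connect('button_press', self._press)
            connect('button_release', self._release)

        def _press(self, event):
            self.start = event

        def _release(self, event):
            if self.start is None:
                return
            drag = Event('drag_rectangle', event.xdisplay, event.ydisplay)
            drag.start, drag.end = self.start, event
            self.start = None
            notify('drag_rectangle', drag)

    _drag_rectangle = _DragRectangle()   # installed once at import time

User code would then just do connect('drag_rectangle', func) and read
event.start and event.end in func, without ever touching the widget
set; motion events could be folded in the same way to give rubber-band
feedback while dragging.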

Of course, there is no end to how far you could go with this, defining
a basic cross GUI widget API for dialog boxes, entry boxes and so on.
A meta-wx, if you will. But that is an issue for another day.

JDH

I can't really comment on the framework (sounds OK but I have no experience
in this area), but you might want to add mouse wheel events and maybe both
single and double-click mouse button events.

Gary

On 16/04/2004 at 07:09 John Hunter wrote:

  * What events do we want to define? The ones I use a lot are::

        key_press_event
        key_release_event
        motion_notify_event
        button_press_event
        button_release_event

On Fri, 2004-04-16 at 08:09, John Hunter wrote:

  * Does this look like a good framework?

The meaning of the coords_demo_gtk was clear and easy to emulate. The
meaning of the events interface above (in the context of multiple figure
managers) is not so clear.
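
To illustrate the ambiguity, here is a purely hypothetical sketch (not
something anyone has proposed): a per-canvas registry would make the
binding explicit, whereas a single module-level matplotlib.events
registry leaves it unclear which figure a callback fires for::

    class EventRegistry:
        "Per-canvas observer registry (hypothetical)"
        def __init__(self):
            self._callbacks = {}

        def connect(self, name, func):
            self._callbacks.setdefault(name, []).append(func)

        def notify(self, name, event):
            for func in self._callbacks.get(name, []):
                func(event)

    # If each FigureCanvas* owned one of these, then
    #     canvas.events.connect('key_press', on_key)
    # would be unambiguous with several figure managers alive, whereas a
    # module-level registry needs some notion of the "current" figure or
    # of broadcasting to all of them.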

  * Does someone want to implement the interface matplotlib.events?

  * Do the GUI maintainers for Tk (Todd), Wx (Jeremy) and GTK (Me)
    have time to do the mapping? If not can we get another volunteer?

I'm not imagining this as a major time sink so I'm pretty sure I'll be
able to support this.

    See examples/coords_demo_gtk.py for an example of connecting to a
    GTK event and mapping the display coords to data space. Jeremy
    and Todd, this would be a good example to implement as stage 1 for
    the event handling.

I implemented the coords demo for TkAgg this morning (just replace GTK
with TkAgg in the demo). I did this by implementing connect() for
FigureCanvasTkAgg... very simple so far.
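
A simplified sketch of the general shape (illustrative names, not the
real code; only the basic bindings and the y flip are shown)::

    TK_SEQUENCES = {
        'button_press':   '<ButtonPress>',
        'button_release': '<ButtonRelease>',
        'motion_notify':  '<Motion>',
        'key_press':      '<KeyPress>',
    }

    class TkEventCanvas:
        "Stand-in for FigureCanvasTkAgg; tk_widget is its Tk widget"
        def __init__(self, tk_widget):
            self.tk_widget = tk_widget
            self.callbacks = {}

        def connect(self, name, func):
            "Bind a GUI-neutral event name to a user callback"
            if name not in self.callbacks:
                self.callbacks[name] = []
                self.tk_widget.bind(TK_SEQUENCES[name],
                                    lambda e, n=name: self._dispatch(n, e),
                                    add='+')
            self.callbacks[name].append(func)

        def _dispatch(self, name, tk_event):
            # Flip y so (0, 0) is the lower-left corner
            y = self.tk_widget.winfo_height() - tk_event.y
            for func in self.callbacks[name]:
                func(tk_event.x, y)   # callback signature assumed: (x, y) display coords

The data-space mapping from the coords demo would then sit on top of
this, using the axes transform to turn the display (x, y) into
xdata, ydata.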

Of course, there is no end to how far you could go with this, defining
a basic cross GUI widget API for dialog boxes, entry boxes and so on.
A meta-wx, if you will. But that is an issue for another day.

I can see the utility of GUI neutrality and I'm in favor of it, but I'm
also inclined to keep it as simple as possible. I think it would be
easy to go overboard and start worrying about creating a meta-GUI
instead of a plotter. If we drive the backend development with demos
and concrete applications (rather than some abstract idea of universal
GUI substitutability) I think the idea will turn out well.

Regards,
Todd

--
Todd Miller <jmiller@...31...>