contour leaks

I was very happy to see the new contour function. I had written my own
and meant to submit it, but thankfully the new one is much faster
anyway.

However, my scripts are getting killed due to excessive memory usage.
I'm doing about 14 contour plots (and some other pretty mundane plots)
in the same figure (clf() between each one) and writing them out to
postscript files. The grid for these is over a million points.

About halfway through, though, the process has eaten up all the swap
and its memory size is over 700MB.

Following the FAQ, I adapted the memory-profiling script (included
below), and it reports that we're leaking about 3MB per loop. Is this
a known issue? Any workarounds? Forcing a gc collection every
iteration doesn't help, so from that and the blazing speed I assume
the contour code is written in C and is leaking there (perhaps not
Py_DECREFing the right things or something).
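
(By "gc cleanup" I mean forcing a collection at the bottom of each
iteration, roughly the sketch below; gc.collect() reclaims
pure-Python reference cycles, but it can't free anything pinned by a
missing decref inside an extension module, which would fit what I'm
seeing.)

import gc

for i in range(indEnd):
    # ... the same plotting calls as in the script below ...
    n = gc.collect()    # force a full collection pass each iteration
    print 'gc: collected %d unreachable objects' % n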

--- reproduction script ---

import os
import matplotlib
matplotlib.use('Agg')
from pylab import *

def report_memory(i):
    # ask ps for this process's resident set size (RSS, in kB on
    # Linux) and size; a2[0] is the header line, a2[1] the data line
    pid = os.getpid()
    a2 = os.popen('ps -p %d -o rss,sz' % pid).readlines()
    print i, ' ', a2[1],
    return int(a2[1].split()[0])    # RSS column, in kB

def func3(x, y):
    # a smooth test surface with a few bumps, good for contouring
    return (1 - x/2 + x**5 + y**3)*exp(-x**2 - y**2)

# record memory usage at iteration indStart and compare it with the
# usage at the last iteration; the first indStart loops are warm-up
indStart, indEnd = 30, 150
for i in range(indEnd):
    clf()
    dx, dy = 0.01, 0.01
    x = arange(-3.0, 3.0, dx)
    y = arange(-3.0, 3.0, dy)
    X, Y = meshgrid(x, y)
    Z = func3(X, Y)

    contour(X, Y, Z, 15)

    savefig('tmp%d' % i)

    val = report_memory(i)
    # give memory usage a few iterations to settle before taking
    # the starting snapshot
    if i == indStart: start = val

end = val
print 'Average memory consumed per loop: %1.4f kB' % \
    ((end - start)/float(indEnd - indStart))

---- end script ----
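
If it helps narrow things down, a control run with the contour() call
swapped for a throwaway plot() (everything else as above) should show
whether the growth is specific to contour:

for i in range(indEnd):
    clf()
    # same loop shape, but no contour(); if memory holds steady here,
    # the leak is isolated to the contour code
    plot(arange(-3.0, 3.0, 0.01))
    savefig('tmp_control%d' % i)
    report_memory(i)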

Thanks,
jack.