John, you were right on the money about the memory leak I wrote about in the
previous message. Here's a rough draft of an attempt to patch it up. This is
my first (meaningful) attempt to submit a patch anywhere, so please forgive
any errors in the process. It was made with >patch -p ...
I think that is the accepted way to make a patch, but I'm not sure.
At first I wanted to plug this up by reusing the "cached" variable (by
removing the last key returned by .keys() if the number of entries got too
big), but I didn't like how that turned out, so I decided to add a list
ordered by last use. When there are too many items, the least recently used
entry is removed. I think this way turned out for the better.
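Here's a minimal sketch of that idea, standalone rather than inside the Text class (the names and the 128-entry limit are illustrative, taken from the patch below, not a definitive implementation):

```python
# Sketch of a list-based LRU cache: keys are kept most-recently-used
# first in usage_stack, and the tail entry (least recently used) is
# evicted once the cache grows past MAX_ENTRIES.

MAX_ENTRIES = 128  # assumed limit, matching the patch

cached = {}
usage_stack = []  # keys ordered most-recently-used first

def cache_put(key, value):
    cached[key] = value
    if key in usage_stack:
        usage_stack.remove(key)  # re-inserted below at the front
    usage_stack.insert(0, key)
    if len(usage_stack) > MAX_ENTRIES:
        oldest = usage_stack.pop()  # least recently used key
        del cached[oldest]
```

The important detail is that the key popped off the tail is the one deleted from the dict; deleting the just-inserted key instead would empty the new entry rather than the old one.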
What might improve things is a way to check how much memory the data
structures we are caching actually occupy. The first thought that came to
mind was to dump the structures to a Python string via cPickle and count the
bytes used, but I suspect that would be too slow. At the very least, the
limit could be supplied by the .matplotlibrc file, but I'm not very sure
where all those values are read in.
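For what it's worth, the pickle-and-count idea looks something like this (a rough sketch; the pickled length is only a crude proxy for in-memory size, and as noted it is probably too slow to run per cache entry):

```python
# Estimate a structure's footprint by pickling it and counting bytes.
# cPickle existed only in Python 2; fall back to pickle elsewhere.
try:
    import cPickle as pickle
except ImportError:
    import pickle

def approx_size(obj):
    """Byte count of the pickled form -- a crude memory-use proxy."""
    return len(pickle.dumps(obj, -1))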
*** text.py Tue Jan 18 21:03:16 2005
--- /root/text.py Tue Jan 18 20:58:56 2005
*************** class Text(Artist):
*** 44,49 ****
--- 44,50 ----
Artist.__init__(self)
self.cached = {}
+ self.usage_stack = []
self._x, self._y = x, y
if color is None: color = rcParams['text.color']
*************** class Text(Artist):
*** 193,198 ****
--- 194,203 ----
ret = bbox, zip(lines, whs, xs, ys)
self.cached[key] = ret
+ self.usage_stack.insert(0, key)
+ if len(self.usage_stack) > 128:
+ old_key = self.usage_stack.pop()
+ del self.cached[old_key]
return ret
Just a minor comment: insert(0) is an expensive (O(N)) operation on lists. This kind of problem might be better addressed with a deque (new in the py2.4 collections module; it could be backported to py2.3).
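To illustrate the point, both ends of a deque support constant-time operations, so the MRU-first ordering above costs O(1) per access instead of O(N) (a sketch, assuming py2.4's collections module or the backport mentioned below):

```python
# appendleft() and pop() on a deque are both O(1), unlike
# list.insert(0, ...) which shifts every element.
from collections import deque

usage = deque()
usage.appendleft('key1')  # newest goes to the front in O(1)
usage.appendleft('key2')
oldest = usage.pop()      # O(1) eviction from the LRU end
```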
Sorry for the extra msg, here's a pure python backport of deque by R. Hettinger, the author of the C collections module in py2.4: