Hello John and other devels,
John, you were right on the money about the memory leak that I wrote
about in my previous message. Here's a rough draft of an attempt to patch
it up. This is my first (meaningful) attempt to submit a patch anywhere,
so please forgive any errors in the process. It was made with >patch -p ...
I think that is the accepted way to make a patch, but I'm not sure.
At first I wanted to plug the leak by reusing the "cached" variable (by
removing the last key from the .keys() method when the number of entries
got too big), but I didn't like how that turned out, so instead I decided
to add a list sorted by last use. When there were too many items, the
least recently used entry was removed. I think this way turned out for
the better.
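In case it helps to see the idea in isolation, here is a minimal sketch of
that scheme: a plain dict as the cache plus a list ordered by recency of
use. The names (TextCache, max_entries) are illustrative only, not the
ones in the attached patch.

```python
class TextCache:
    """Sketch of a cache that evicts the least recently used entry."""

    def __init__(self, max_entries=50):
        self.max_entries = max_entries
        self._cache = {}   # key -> cached value
        self._order = []   # keys, least recently used first

    def get(self, key, compute):
        if key in self._cache:
            # Move the key to the most-recently-used end of the list.
            self._order.remove(key)
            self._order.append(key)
            return self._cache[key]
        value = compute()
        self._cache[key] = value
        self._order.append(key)
        # Evict the least recently used entry when the cache grows too big.
        if len(self._cache) > self.max_entries:
            oldest = self._order.pop(0)
            del self._cache[oldest]
        return value
```

The actual patch does this inside the existing text-cache code rather
than in a separate class, of course.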
What might improve things is a way to check how much memory the data
structures we are caching actually occupy. The first thought that came to
my mind was to dump the structures to a Python string via cPickle and
count the bytes used, but I suspect that would be too slow. At the very
least, the limit could be supplied by the .matplotlibrc file, but I'm not
very sure where all those values are read in.
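For what it's worth, the pickle-based size estimate would only be a couple
of lines; shown here with the stdlib pickle module (cPickle is its C
implementation). Note it measures the serialized size, not the true
in-memory footprint, and as noted it may well be too slow to run on every
cache insertion.

```python
import pickle

def pickled_size(obj):
    """Estimate an object's size as the number of bytes it pickles to."""
    return len(pickle.dumps(obj, pickle.HIGHEST_PROTOCOL))
```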
Anyway, too much talk for such a small fix.
Take care and happy coding,
matplotlib_textcache_patch.p (645 Bytes)