While polishing our forthcoming console game, our team in Shanghai has been relentlessly trying to minimize Python memory use.
Today, an engineer complained to me that “cell” objects were being leaked(*).
This rang a bell with me. In 2009, I had posted about this to python-dev.
The response at the time wasn’t very sympathetic: I should be doing things differently, or simply rely on the cyclic garbage collector and not try to be clever. Yet, as I pointed out, parts of the standard library are aware of the problem and do help you with these things, such as the xml.dom.minidom.unlink() method.
The data being leaked now appeared to pertain to the json module:
[2861.88] Python: 0: <bound method JSONEncoder.default of <json.encoder.JSONEncoder object at 0x12e14010>>
[2861.88] Python: 1: <bound method JSONEncoder.default of <json.encoder.JSONEncoder object at 0x12e14010>>
[2861.88] Python: 2: <bound method JSONEncoder.default of <json.encoder.JSONEncoder object at 0x12e14010>>
…
This prompted me to have a look in the json module, and behold, json.encoder contains this pattern:
[python]
def _make_iterencode(…):
    …
    def _iterencode(o, _current_indent_level):
        if isinstance(o, basestring):
            yield _encoder(o)
        elif o is None:
            yield 'null'
        elif o is True:
            yield 'true'
        elif o is False:
            yield 'false'
        elif isinstance(o, (int, long)):
            yield str(o)
        elif isinstance(o, float):
            yield _floatstr(o)
        elif isinstance(o, (list, tuple)):
            for chunk in _iterencode_list(o, _current_indent_level):
                yield chunk
        elif isinstance(o, dict):
            for chunk in _iterencode_dict(o, _current_indent_level):
                yield chunk
        else:
            if markers is not None:
                markerid = id(o)
                if markerid in markers:
                    raise ValueError("Circular reference detected")
                markers[markerid] = o
            o = _default(o)
            for chunk in _iterencode(o, _current_indent_level):
                yield chunk
            if markers is not None:
                del markers[markerid]
    return _iterencode
[/python]
The problem is this: the returned closure has a func_closure attribute, a tuple of “cell” objects, one of which points back to the function itself. func_closure is read-only, so there is no way to clear the cells after use. And so, iterencoding stuff using the json module creates reference cycles that persist until the next cyclic collection, possibly causing Python to hang on to all the data that was supposed to be encoded and then thrown away.
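To make the cycle visible, here is a toy version of the pattern (my illustration; in Python 3 the attribute is spelled __closure__ rather than func_closure). The inner function refers to itself, so its closure contains a cell pointing straight back at the function:

```python
def make_iter():
    # listiter refers to itself recursively, so the compiler gives it
    # a closure cell whose contents is the function object itself
    def listiter(l):
        for i in l:
            if isinstance(i, list):
                for j in listiter(i):
                    yield j
            else:
                yield i
    return listiter

f = make_iter()
cell = f.__closure__[0]
print(cell.cell_contents is f)  # True: the cell points back at f
```

This function-to-cell-to-function loop is exactly the kind of cycle that reference counting alone can never reclaim.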
Looking for a workaround, I wrote this code, emulating part of what is going on:
[python]
def itertest(o):
    def listiter(l):
        for i in l:
            if isinstance(i, list):
                chunks = listiter(i)
                for i in chunks:
                    yield i
            else:
                yield i
    return listiter(o)
[/python]
Testing it confirmed the problem:
>>> import gc, celltest
>>> gc.collect()
>>> gc.set_debug(gc.DEBUG_LEAK)
>>> l = [1, [2, 3]]
>>> i = celltest.itertest(l)
>>> list(i)
[1, 2, 3]
>>> gc.collect()
gc: collectable <cell 01E96B50>
gc: collectable <function 01E97330>
gc: collectable <tuple 01E96910>
gc: collectable <cell 01E96B30>
gc: collectable <tuple 01E96950>
gc: collectable <function 01E973F0>
3
To fix this, it is necessary to clear the “cell” objects once there is no more need for them. It is not possible to do this from the outside, so how about from the inside? Changing the code to:
[python]
def itertest2(o):
    def listiter(l):
        for i in l:
            if isinstance(i, list):
                chunks = listiter(i)
                for i in chunks:
                    yield i
            else:
                yield i
    chunks = listiter(o)
    for i in chunks:
        yield i
    chunks = listiter = None
[/python]
does the trick. The outer function itself becomes a generator: it yields the data, then cleans up after itself:
>>> o = celltest.itertest2(l)
>>> list(o)
[1, 2, 3]
>>> gc.collect()
0
It is an unfortunate situation. The workaround requires work to be done inside the function. It would be cool if it were possible to clear a function’s closure from the outside by calling, e.g., func.close(). As it is, people have to be aware of these hidden cycles and code carefully around them.
(*) Leaking in this case means not being released immediately by reference counting, but lingering until the next cyclic collection. We don’t want to rely on the gc module’s quirks in a video game.
Update:
In my toy code, I got the semantics slightly wrong. Actually, it is more like this:
[python]
def make_iter():
    def listiter(l):
        for i in l:
            if isinstance(i, list):
                chunks = listiter(i)
                for i in chunks:
                    yield i
            else:
                yield i
    return listiter

def get_iterator(data):
    it = make_iter()
    return it(data)
[/python]
This complicates things: during iteration, no code is running in the scope of make_iter, so there is no point at which we can clear those locals once iteration is done. Everything runs in nested functions, and since I am using Python 2.7 (which doesn’t have the “nonlocal” keyword) there seems to be no way to clear the outer locals from the inner functions when iteration finishes.
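For what it’s worth, Python 3’s “nonlocal” keyword does make the clear-from-the-inside trick possible even with this structure. A sketch of that idea (my adaptation, with a hypothetical _top flag marking the outermost call):

```python
def make_iter():
    def listiter(l, _top=True):
        # nonlocal lets the inner function rebind make_iter's local,
        # emptying the cell once the top-level iteration is finished
        nonlocal listiter
        try:
            for i in l:
                if isinstance(i, list):
                    for j in listiter(i, _top=False):
                        yield j
                else:
                    yield i
        finally:
            if _top:
                listiter = None  # break the cycle

    return listiter

it = make_iter()
print(list(it([1, [2, 3]])))            # [1, 2, 3]
print(it.__closure__[0].cell_contents)  # None: the cycle is gone
```

Once the outermost generator is exhausted, the closure cell holds None and plain reference counting can release everything.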
I guess that means that I’ll have to modify this code to use class objects instead.
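Such a rewrite could look something like this sketch (my illustration, not the actual replacement code): an explicit stack of iterators takes the place of the recursive closure, so no cells exist to leak:

```python
class ListIter(object):
    """Flatten nested lists without closures or cells."""

    def __init__(self, data):
        self._stack = [iter(data)]

    def __iter__(self):
        return self

    def __next__(self):
        while self._stack:
            try:
                item = next(self._stack[-1])
            except StopIteration:
                self._stack.pop()  # this level is exhausted
                continue
            if isinstance(item, list):
                self._stack.append(iter(item))  # descend into sublist
            else:
                return item
        raise StopIteration

    next = __next__  # Python 2 spelling

print(list(ListIter([1, [2, 3]])))  # [1, 2, 3]
```

When the iterator is dropped, reference counting releases it immediately; there is no cycle for gc to find.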
Also, while on the topic, I think Raymond Hettinger’s class-like objects are subject to this problem if they have any sort of mutual or recursive relationship among their “members”.
Assuming I’ve understood your problem correctly, here’s a bit of black magic that should let you replace any closure containing a circular reference with one you can forcibly clear (by setting f.__defaults__ = None):
Modulo the web form destroying the significant whitespace even between the code tags 😛
Hm, can’t get it to format correctly. No matter.
Yes, this is interesting, but you are sidestepping the problem by not actually using closures and cells.
The problem I’m having is that I’m trying to use the standard library and fix it with minimal fuss; I’d be careful to write such code differently myself, you see. In the end, I’ll probably replace this functionality with a proper iterator class, rather than the closure approach currently employed in the standard library.
While it doesn’t help you right now, longer term, __closure__ is going to be writable to some degree, so it will actually be possible to break such cycles manually.
Relevant issue is http://bugs.python.org/issue14369
OK, so clearing the __closure__ of the closure function will release it, but the real linchpin is the “local” scope of the outer frame. If it were possible to go there and clear all the “cell” variables in that frame, that should work.
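For later readers: since Python 3.7, cell_contents is assignable, so the cycle can in fact be broken manually from the outside. A sketch with a toy closure of the same shape as the ones above:

```python
def make_iter():
    def listiter(l):
        for i in l:
            if isinstance(i, list):
                for j in listiter(i):
                    yield j
            else:
                yield i
    return listiter

f = make_iter()
print(list(f([1, [2, 3]])))  # [1, 2, 3]

# Python 3.7+: cells support assignment, so we can empty the
# self-referential cell by hand once we are done with f
f.__closure__[0].cell_contents = None
print(f.__closure__[0].cell_contents)  # None
```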
Have you thought about using weakref.proxy?
For a non-generator function you can just do

    import weakref

    def make_factorial():
        def factorial(n):
            if n <= 1:
                return 1
            return n * factorial(n - 1)
        tmp = factorial
        factorial = weakref.proxy(factorial)
        return tmp
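A quick standalone check of this idea (my driver code; the import is added):

```python
import weakref

def make_factorial():
    def factorial(n):
        if n <= 1:
            return 1
        return n * factorial(n - 1)
    tmp = factorial
    # the cell now holds only a weak proxy, so the function no
    # longer keeps itself alive through its own closure
    factorial = weakref.proxy(factorial)
    return tmp

f = make_factorial()
print(f(5))  # 120: recursion goes through the proxy, which stays
             # valid for as long as f holds the real function
```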
It is harder with generators, because the generator function is liable to be garbage collected before the iterator has finished running. However, you can try
    def itertest(o):
        def listiter(l):
            for i in l:
                if isinstance(i, list):
                    chunks = listiter(i)
                    for i in chunks:
                        yield i
                else:
                    yield i
        res = listiter(o)
        keep_alive(res, listiter)
        listiter = weakref.proxy(listiter)
        return res
where keep_alive is defined as
    _set_of_weakrefs = set()

    def keep_alive(first, second):
        # ensure first is garbage collected before second
        def callback(wr, _set_of_weakrefs=_set_of_weakrefs, second=second):
            _set_of_weakrefs.remove(wr)
        wr = weakref.ref(second, callback)
        _set_of_weakrefs.add(wr)
(formatting in comments is broken, sorry about that)
Interesting approach. This is actually one of the suggestions I had in the email thread on python-dev, that “cell” objects could hold weak references. This approach at least allows one to experiment with that.
That should have been

    def keep_alive(first, second):
        # ensure first is garbage collected before second
        def callback(wr, _set_of_weakrefs=_set_of_weakrefs, second=second):
            _set_of_weakrefs.remove(wr)
        wr = weakref.ref(first, callback)
        _set_of_weakrefs.add(wr)
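Putting the corrected pieces together, a runnable version of the whole scheme might look like this (my assembly of the snippets above):

```python
import weakref

_set_of_weakrefs = set()

def keep_alive(first, second):
    # hold a strong reference to second (via the callback's default
    # argument) until first has been garbage collected
    def callback(wr, _set_of_weakrefs=_set_of_weakrefs, second=second):
        _set_of_weakrefs.remove(wr)
    wr = weakref.ref(first, callback)
    _set_of_weakrefs.add(wr)

def itertest(o):
    def listiter(l):
        for i in l:
            if isinstance(i, list):
                for j in listiter(i):
                    yield j
            else:
                yield i
    res = listiter(o)
    keep_alive(res, listiter)
    # the closure cell now holds only a proxy; keep_alive keeps the
    # real function around for as long as the generator res exists
    listiter = weakref.proxy(listiter)
    return res

print(list(itertest([1, [2, 3]])))  # [1, 2, 3]
```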