# destructor – How do I correctly clean up a Python object?

## The Question

483 people think this question is useful
```python
import os

class Package:
    def __init__(self):
        self.files = []

    # ...

    def __del__(self):
        for file in self.files:
            os.unlink(file)
```


__del__(self) above fails with an AttributeError exception. I understand Python doesn’t guarantee the existence of “global variables” (member data in this context?) when __del__() is invoked. If that is the case and this is the reason for the exception, how do I make sure the object destructs properly?

• Reading what you linked, global variables going away doesn’t seem to apply here unless you’re talking about when your program is exiting, during which I guess according to what you linked it might be POSSIBLE that the os module itself is already gone. Otherwise, I don’t think it applies to member variables in a __del__() method.
• The exception is thrown long before my program exits. The AttributeError exception I get is Python saying it doesn’t recognize self.files as being an attribute of Package. I may be getting this wrong, but if by “globals” they don’t mean variables global to methods (but possibly local to class) then I don’t know what causes this exception. Google hints Python reserves the right to clean up member data before __del__(self) is called.
• The code as posted seems to work for me (with Python 2.5). Can you post the actual code that is failing – or a simplified (the simpler the better) version that still causes the error?
• @wilhelmtell can you give a more concrete example? In all my tests, the __del__ destructor works perfectly.
• If anyone wants to know: This article elaborates why __del__ should not be used as the counterpart of __init__. (I.e., it is not a “destructor” in the sense that __init__ is a constructor.)

648 people think this answer is useful

I’d recommend using Python’s with statement for managing resources that need to be cleaned up. The problem with using an explicit close() statement is that you have to worry about people forgetting to call it at all or forgetting to place it in a finally block to prevent a resource leak when an exception occurs.

To use the with statement, create a class with the following methods:

```python
def __enter__(self)
def __exit__(self, exc_type, exc_value, traceback)
```



In your example above, you’d use

```python
class Package:
    def __init__(self):
        self.files = []

    def __enter__(self):
        return self

    # ...

    def __exit__(self, exc_type, exc_value, traceback):
        for file in self.files:
            os.unlink(file)
```


Then, when someone wanted to use your class, they’d do the following:

```python
with Package() as package_obj:
    # use package_obj
```



The variable package_obj will be an instance of type Package (it’s the value returned by the __enter__ method). Its __exit__ method will automatically be called, regardless of whether or not an exception occurs.
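To make that guarantee concrete, here is a small self-contained sketch (the log list and its messages are purely illustrative) showing that __exit__ runs even when the with-block raises:

```python
log = []

class Package:
    def __init__(self):
        self.files = []

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # called whether the block exits normally or via an exception
        log.append(f"cleaning up {self.files}")

try:
    with Package() as package_obj:
        package_obj.files.append("a.txt")
        raise RuntimeError("boom")
except RuntimeError:
    log.append("exception still propagated")

print(log)
```

The exception is not swallowed: __exit__ runs first, then the RuntimeError continues to propagate to the caller.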

You could even take this approach a step further. In the example above, someone could still instantiate Package using its constructor without using the with statement. You don’t want that to happen. You can fix this by creating a PackageResource class that defines the __enter__ and __exit__ methods. Then, the Package class would be defined strictly inside the __enter__ method and returned. That way, the caller could never instantiate the Package class without using a with statement:

```python
class PackageResource:
    def __enter__(self):
        class Package:
            ...
        self.package_obj = Package()
        return self.package_obj

    def __exit__(self, exc_type, exc_value, traceback):
        self.package_obj.cleanup()
```



You’d use this as follows:

```python
with PackageResource() as package_obj:
    # use package_obj
```



58 people think this answer is useful

The standard way is to use atexit.register:

```python
# package.py
import atexit
import os

class Package:
    def __init__(self):
        self.files = []
        atexit.register(self.cleanup)

    def cleanup(self):
        print("Running cleanup...")
        for file in self.files:
            ...  # e.g. os.remove(file)
```


But you should keep in mind that this will keep every created instance of Package alive until Python terminates, because atexit holds a reference to the bound method self.cleanup.

Demo using the code above saved as package.py:

```
$ python
>>> from package import *
>>> p = Package()
>>> q = Package()
>>> q.files = ['a', 'b', 'c']
>>> quit()
Running cleanup...
Running cleanup...
```



33 people think this answer is useful

As an appendix to Clint’s answer, you can simplify PackageResource using contextlib.contextmanager:

```python
import contextlib

@contextlib.contextmanager
def packageResource():
    class Package:
        ...
    package = Package()
    yield package
    package.cleanup()
```
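One caveat: written this way, cleanup() is skipped if the with-block raises, because the yield is not protected. A self-contained variant (the names package_resource and cleaned are hypothetical instrumentation for the demo) that wraps the yield in try/finally so cleanup also runs on exceptions:

```python
import contextlib

cleaned = []

@contextlib.contextmanager
def package_resource():
    class Package:
        def __init__(self):
            self.files = []

        def cleanup(self):
            cleaned.append(list(self.files))

    package = Package()
    try:
        yield package
    finally:
        # runs whether the with-block exits normally or via an exception
        package.cleanup()

with package_resource() as package:
    package.files.append("a")
```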



Alternatively, though probably not as Pythonic, you can override Package.__new__:

```python
import contextlib

class Package(object):
    def __new__(cls, *args, **kwargs):
        @contextlib.contextmanager
        def packageResource():
            # adapt arguments if superclass takes some!
            package = super(Package, cls).__new__(cls)
            package.__init__(*args, **kwargs)
            yield package
            package.cleanup()
        return packageResource()

    def __init__(self, *args, **kwargs):
        ...
```



and simply use with Package(...) as package.

To make things shorter, name your cleanup function close and use contextlib.closing, in which case you can either use the unmodified Package class via with contextlib.closing(Package(...)) or override its __new__ to the simpler

```python
import contextlib

class Package(object):
    def __new__(cls, *args, **kwargs):
        package = super(Package, cls).__new__(cls)
        package.__init__(*args, **kwargs)
        return contextlib.closing(package)
```



And this constructor is inherited, so you can simply inherit, e.g.

```python
class SubPackage(Package):
    def close(self):
        pass
```
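To illustrate the contextlib.closing route with an otherwise unmodified class, here is a minimal sketch (the closed list is just instrumentation for the demo):

```python
import contextlib

closed = []

class Package:
    def __init__(self):
        self.files = []

    def close(self):
        # contextlib.closing calls close() when the with-block exits
        closed.append(True)

with contextlib.closing(Package()) as package:
    package.files.append("a")

print(closed)
```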



20 people think this answer is useful

A better alternative is to use weakref.finalize. See the examples at Finalizer Objects and Comparing finalizers with __del__() methods.
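A minimal sketch of how weakref.finalize might be applied to the Package example (the _cleanup and remove names are hypothetical; the key point, following the weakref docs, is that the callback must not reference the instance itself):

```python
import weakref

cleaned = []

class Package:
    def __init__(self):
        self.files = []
        # the callback must not hold a reference to self, or the finalizer
        # would keep the instance alive forever; pass the files list instead
        self._finalizer = weakref.finalize(self, Package._cleanup, self.files)

    @staticmethod
    def _cleanup(files):
        cleaned.append(list(files))

    def remove(self):
        # calling the finalizer early is safe; it runs at most once,
        # whether triggered here, at garbage collection, or at exit
        self._finalizer()

p = Package()
p.files.append("a.txt")
p.remove()
```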

18 people think this answer is useful

I don’t think that it’s possible for instance members to be removed before __del__ is called. My guess would be that the reason for your particular AttributeError is somewhere else (maybe you mistakenly remove self.files elsewhere).

However, as the others pointed out, you should avoid using __del__. The main reason for this is that instances with __del__ will not be garbage collected (they will only be freed when their refcount reaches 0). Therefore, if your instances are involved in circular references, they will live in memory for as long as the application runs. (I may be mistaken about all this though; I’d have to read the gc docs again, but I’m rather sure it works like this.)
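A caveat on that caveat: this was true for CPython before 3.4, but since PEP 442 reference cycles whose objects define __del__ are collected as well. A quick self-contained check (Python >= 3.4 assumed; the deleted list is just instrumentation):

```python
import gc

deleted = []

class Node:
    def __del__(self):
        deleted.append("deleted")

# build a reference cycle between two instances that define __del__
a, b = Node(), Node()
a.partner, b.partner = b, a
del a, b

gc.collect()
print(deleted)  # on Python >= 3.4 both objects are collected
```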

13 people think this answer is useful

I think the problem could be in __init__, if there is more code than shown.

__del__ will be called even when __init__ has not been executed properly or threw an exception.

Source
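This is a plausible cause of the original AttributeError: if __init__ raises before self.files is assigned, CPython still runs __del__ on the half-constructed object. A self-contained sketch (the errors list only instruments the demo):

```python
errors = []

class Package:
    def __init__(self):
        raise RuntimeError("failure before self.files is assigned")
        self.files = []  # never reached

    def __del__(self):
        # the half-constructed object is deallocated, and __del__ still runs
        try:
            for f in self.files:
                pass
        except AttributeError as e:
            errors.append(str(e))

try:
    Package()
except RuntimeError:
    pass

print(errors)
```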

12 people think this answer is useful

Here is a minimal working skeleton:

```python
class SkeletonFixture:

    def __init__(self):
        pass

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        pass

    def method(self):
        pass


with SkeletonFixture() as fixture:
    fixture.method()
```



Important: return self

If you’re like me, and overlook the return self part (of Clint Miller’s correct answer), you will be staring at this nonsense:

```
Traceback (most recent call last):
  File "tests/simplestpossible.py", line 17, in <module>
    fixture.method()
AttributeError: 'NoneType' object has no attribute 'method'
```



Hope it helps the next person.

8 people think this answer is useful

Just wrap your destructor with a try/except statement and it will not throw an exception if your globals are already disposed of.
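A hedged sketch of that idea (the file path and the assumed os.unlink cleanup are illustrative; the suppressed list just records what was swallowed):

```python
import os

suppressed = []

class Package:
    def __init__(self):
        self.files = []

    def __del__(self):
        try:
            for file in self.files:
                os.unlink(file)  # assumed cleanup, mirroring the question
        except (AttributeError, OSError) as e:
            # at interpreter shutdown, attributes or even the os module
            # may already be gone; swallow the error instead of crashing
            suppressed.append(type(e).__name__)

p = Package()
p.files.append("/nonexistent-path-for-demo")
del p  # the failed unlink is silently ignored
print(suppressed)
```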

Edit

Try this:

```python
# Python 2 only: im_func and the bare print statement do not exist in Python 3
from weakref import proxy

class MyList(list): pass

class Package:
    def __init__(self):
        self.__del__.im_func.files = MyList([1, 2, 3, 4])
        self.files = proxy(self.__del__.im_func.files)

    def __del__(self):
        print self.__del__.im_func.files
```



It stuffs the file list into the __del__ function itself, which is guaranteed to exist at the time of the call. The weakref proxy is there to prevent Python, or you yourself, from somehow deleting the self.files attribute (if it is deleted, the original file list is unaffected). If the attribute is not in fact being deleted while other references to it remain, you can remove the proxy encapsulation.

5 people think this answer is useful

It seems that the idiomatic way to do this is to provide a close() method (or similar), and call it explicitly.
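A minimal sketch of that idiom (the closed flag is illustrative), with try/finally so the explicit call still happens when an exception occurs:

```python
class Package:
    def __init__(self):
        self.files = []
        self.closed = False

    def close(self):
        # idempotent explicit cleanup; the caller decides when it runs
        if not self.closed:
            self.files.clear()
            self.closed = True

p = Package()
try:
    p.files.append("a.txt")
finally:
    p.close()  # try/finally guards against leaks when exceptions occur

print(p.closed)
```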

0 people think this answer is useful

atexit.register is the standard way as has already been mentioned in ostrakach’s answer.

However, it must be noted that the order in which objects get deleted cannot be relied upon, as shown in the example below.

```python
import atexit

class A(object):

    def __init__(self, val):
        self.val = val
        atexit.register(self.hello)

    def hello(self):
        print(self.val)

def hello2():
    a = A(10)

hello2()
a = A(20)
```



Here, the order seems legitimate as the reverse of the order in which the objects were created, since the program prints:

```
20
10
```



However, in a larger program, once Python’s garbage collection kicks in, whichever object’s lifetime ends first is destroyed first, so this apparent last-in, first-out ordering cannot be relied upon.