Python: Context Managers

June 10, 2017
6 min. read

This post is part of the Python Tips series.

Context managers were added to Python with PEP 343. They allow proper handling of resources without worrying about missing something in your try/finally code. If you are not opening files using a context manager, you are most likely doing it wrong.

Looking at how we open a file and what could go wrong will show how context managers make programming both more robust and simpler. The operation can be broken into three sections of code:

  • (Setup) Opening the file and making it available for use in Python
  • (Do Work) Reading from or writing to the file
  • (Tear Down) Closing the file and releasing the file descriptor

If we were to write this code manually, we would only want to proceed past step one if everything in it went correctly. Then we would want to make sure step three runs, even if something goes wrong in step two. If you have a larger block handling the file, it is easy to overlook closing it.

To open a file properly ourselves, we would need something like this:

f = open('test-file', 'r')
try:
    for line in f:
        # do something cool with line

        pass
finally:
    f.close()

This isn’t too bad, but if we forget to do it, we can be in trouble. If we get in the habit of using a context manager, it is barely more typing than doing it without error handling.

Using a Context Manager

When opening a file with a context manager, our code would look like this:

with open('test-file', 'r') as f:
    for line in f:
        # do something cool with line

        pass

If an error occurs when opening the file, an exception is raised and the code inside the with block is not executed. If an error occurs inside the with block, proper teardown still happens as the exception propagates. The size of the block doesn't matter, because we don't need to handle anything at the end. The with statement sets up the equivalent of the finally clause in one line.
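
As a quick sanity check (a sketch, assuming the test-file from the earlier examples exists), we can confirm the teardown runs even when the block raises, because the file's closed attribute reports True afterwards:

try:
    with open('test-file', 'r') as f:
        raise RuntimeError('something went wrong mid-read')
except RuntimeError:
    pass

# The with block's __exit__ already closed the file, even though we raised.
print(f.closed)  # True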

It is also possible to nest context managers. Suppose you wanted something like this:

with outer_cm() as A:
    with inner_cm(A) as B:
        # Do stuff with A and/or B

        pass

Notice how we can use the resource managed by the outer with when opening the inner with.

In Python 2.7+ or Python 3, you can simplify it to this:

with outer_cm() as A, inner_cm(A) as B:
    # Do stuff with A and/or B

    pass
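
As a concrete sketch of the one-line form (the file names here are made up for illustration), copying one file into another pairs two open calls on a single line:

# Copy one file into another; both files are closed for us in reverse order.
with open('test-file', 'r') as src, open('test-file-copy', 'w') as dst:
    for line in src:
        dst.write(line)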

Context Manager as Class

If the implementation of open did not supply us with a context manager, we could create our own like this:

class RedundantFile(object):
    def __init__(self, filename, mode):
        self.filename = filename
        self.mode = mode
        self._file = None

    def __enter__(self):
        """ Setup that occurs before your code in with """
        self._file = open(self.filename, self.mode)
        return self._file

    def __exit__(self, *args):
        """ Break down code that occurs when we leave the with for any reason """
        self._file.close()

Using the above class, we could do the following:

with RedundantFile('test-file', 'r') as f:
    for line in f:
        # do something cool with line

        pass

The entire reason I called it redundant is that it just duplicates functionality already built into the open function. However, if you have your own object, you can add the __enter__ and __exit__ methods and get context manager functionality.
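
As a sketch of that idea, here is a made-up connection class (the names connect and disconnect are just for illustration) that becomes usable in a with statement by adding those two methods:

class Connection(object):
    """ Hypothetical resource of our own that gains context manager support """

    def connect(self):
        print('connecting')

    def disconnect(self):
        print('disconnecting')

    def __enter__(self):
        self.connect()
        return self

    def __exit__(self, *args):
        self.disconnect()

with Connection() as conn:
    # disconnect() runs when we leave this block, even on an exception
    pass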

Context Manager as Generator

This is great if you have a whole class. What if you just want to wrap some simple functionality around a block of code?

Enter contextlib.

Here is a block of code that doesn't manage an important resource, but provides a quick, helpful wrapper for timing blocks of code.

from contextlib import contextmanager
import time

@contextmanager
def time_me(process_name):
    start = time.time()
    try:
        yield
    finally:
        print('{} took {} seconds.'.format(process_name, time.time() - start))

with time_me('sleep'):
    time.sleep(2)

Using the contextmanager decorator allows us to define a function where the code up to the yield plays the role of a class-based __enter__, and the code after the yield plays the role of a class-based __exit__.

What happens when we call this with exception throwing code?

with time_me('errors'):
    print('About to divide by zero')
    bad = 1/0
About to divide by zero
errors took 0.0 seconds.
Traceback (most recent call last):
  File "C:/Users/micro/repositories/scratchpad/open-file-nightmare.py", line 19, in <module>
    bad = 1/0
ZeroDivisionError: division by zero

The finally code executes and we see that Python can throw an exception exceptionally fast at 0.0 seconds. (Sorry.)

We talked about class-based context managers earlier, so let's write the equivalent of time_me as a class instead of using contextlib. There is no reason to do this, other than to show you how the two approaches compare.

class TimeMe(object):
    def __init__(self, process_name):
        self.process_name = process_name

    def __enter__(self):
        self.start = time.time()

    def __exit__(self, exception_type, exception_value, traceback):
        print('{} took {} seconds.'.format(self.process_name, time.time() - self.start))

So we instantiate the object with the process_name and store it on the instance. When __enter__ occurs, we store the start time. When __exit__ occurs, we print out the results. I also needed to change the name to the standard upper camel case used for class names.
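
Using the class looks just like the decorator version from before:

with TimeMe('sleep'):
    time.sleep(2)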

Do the arguments to __exit__ look longer than before? Look back at the RedundantFile class we created. Notice the *args in there? Did you catch that the first time through and wonder about it? (If so, kudos. If not, you can act like you did and I won't know. Seriously, I'm a static web page. I'm not that smart.)

I broke them out into actual arguments here so you can tell what they are. If you do not have a catch-all positional argument like *args or separate parameters like these, you will get an error, because __exit__ is always called with the exception type, value, and traceback in addition to self. If no exception has occurred, these values are None. This allows you to do additional or different things on teardown if code inside the with block raised an exception.
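
As a small sketch of that idea (this class is made up for illustration, not part of TimeMe), __exit__ can inspect the exception type and even suppress the exception by returning True:

class SuppressZeroDivision(object):
    """ Hypothetical example of reacting to the arguments passed to __exit__ """

    def __enter__(self):
        return self

    def __exit__(self, exception_type, exception_value, traceback):
        if exception_type is None:
            print('block finished cleanly')
            return False
        print('block raised {}'.format(exception_type.__name__))
        # Returning True tells Python the exception was handled here.
        return issubclass(exception_type, ZeroDivisionError)

with SuppressZeroDivision():
    1 / 0
print('still running')  # the ZeroDivisionError was swallowed by __exit__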

Let's do one more function-based context manager with contextlib and redo our redundant file manager. This is important to show, as our time_me implementation above did not show how to yield a created object.

@contextmanager
def redundant_open(filename, mode):
    f = open(filename, mode)
    try:
        yield f
    finally:
        f.close()
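
Using it looks exactly like the class-based version:

with redundant_open('test-file', 'r') as f:
    for line in f:
        # do something cool with line

        pass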

This isn’t brain surgery; we just yield the file object instead of yielding nothing. Pretty simple, but I wanted to make sure to show it.

Now I’m wondering how much Python is used in brain surgery. Now you are too. Static web pages may be dumb, but they can plant crazy thoughts into your mind.

Possible Uses of Context Managers

On an embedded Python project, if I did not already have a common handler for hardware interfaces, a context manager would be a great fit. For example, on a Raspberry Pi, you will get warnings and cause issues if you try to open the GPIO interface more than once, so pairing setup and teardown matters. Opening and closing the interface around every access would not be performant, though, unless your hardware access occurs only rarely.

If you are using threading in Python, safely and religiously acquiring and releasing the lock is the only way to keep your sanity. This is a good place to use a context manager.
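
In fact, the Lock objects in the standard library's threading module already support the context manager protocol, acquiring on entry and releasing on exit. A minimal sketch:

import threading

lock = threading.Lock()
counter = 0

def increment():
    global counter
    # Equivalent to lock.acquire() followed by try/finally: lock.release()
    with lock:
        counter += 1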

A database connection, serial connection, or network connection is just like a file. You generally don’t want to leave open ones lying around.

Look for context manager interfaces in the libraries you use. Most well-designed libraries that have setup and teardown steps provide them. If one doesn’t exist, you can create a wrapper to give you this functionality, as we did for files.
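
If the object at least has a close method, the standard library's contextlib.closing can wrap it for you without writing anything new. A sketch, where get_connection and send are hypothetical stand-ins for whatever library you are using:

from contextlib import closing

# get_connection() and send() are hypothetical; substitute your library's calls.
# closing() calls conn.close() on the way out, even if an exception occurs.
with closing(get_connection()) as conn:
    conn.send(b'hello')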


Part 6 of 9 in the Python Tips series.

Series Start | Python: Unders and Dunders | Python: Iterators and Generators
