May 20, 2017
9 min. read
This post is part of the Python Tips series.
If you came directly to this article, you might want to read my previous one in this series via the link at the bottom, where we discussed the dangers of using mutable values as defaults for function parameters. In this article I'll discuss two things: using mutable defaults for good, and decorators. At the end, we'll combine the two.
Using mutable defaults for good
In developing a Python workflow for a custom system, I'm interfacing with custom hardware to measure current and temperature. The hardware calls are much slower than normal software, and the values they return change slowly compared to software speeds (especially temperature).
I am writing a workflow that starts cycling slowly and eventually speeds up to cycles faster than 1 second. The workflow has 144 members, and each one could call these hardware functions, which is a real waste of processing time. What if I had a way to cache values and only really call the hardware after a certain amount of time had passed? This would be useful not just for hardware interfacing, but for any expensive resource (like a network query) that doesn't change rapidly.
I could build some complex system that manages calls to these hardware functions only when needed. Surely there is an easier way. (Programmers continually ask that. Usually it is a good thing.)
What if the function itself could hold the cached value and only perform the expensive operation after a certain time has passed? That would be pretty cool. Let's do that.
import time
def get_time(immediate=False, _cached={'last_call': 0, 'value': None}):
    """
    Only calls expensive code if not called in a while, unless told to be immediate.

    :param immediate: If True, ignore timing and make the call immediately

    Note: _cached is not documented as a parameter, as we don't want the user
    to pass anything in for _cached and mess things up.
    """
    CALL_TIME = 1  # Make call if 1 second has passed
    cur_time = time.time()
    if immediate or (cur_time - _cached['last_call']) > CALL_TIME:
        _cached['last_call'] = cur_time
        # simulate expensive call
        time.sleep(0.1)
        _cached['value'] = time.time()
    return _cached['value']
In the function definition, we have an immediate parameter that is used to bypass the time cache and call the expensive code now. The _cached parameter defines a dictionary that stores data with the function between calls. 'last_call' stores the time we last ran the expensive code; I start it at 0, so the first call will always trigger a trip through the expensive code. 'value' holds the result of the expensive call.
CALL_TIME is a constant that sets how long we wait until the next expensive call. Here we make the expensive call if 1 second has passed. I kept this short so the tests run faster.
The if checks whether we are calling in immediate mode or enough time has passed. Inside, we update our cached call time and get the value. I'm using time.sleep for 1/10th of a second to simulate a slow call, then storing the current time, which makes it easy to see when the value has changed.
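A quick way to convince yourself that the cache dictionary really does live with the function object is to inspect its __defaults__ tuple. This is just a sketch using a trimmed-down version of get_time (no sleep), not part of the original code:

```python
import time

def get_time(immediate=False, _cached={'last_call': 0, 'value': None}):
    CALL_TIME = 1
    cur_time = time.time()
    if immediate or (cur_time - _cached['last_call']) > CALL_TIME:
        _cached['last_call'] = cur_time
        _cached['value'] = time.time()
    return _cached['value']

# The default dict is evaluated once, at definition time, and is stored
# on the function object itself.
cache = get_time.__defaults__[1]

value = get_time()
print(cache['value'] == value)  # True: the call wrote into that same dict
print(get_time() == value)      # True: a second call within 1s returns the cache
```

This is the whole trick: the dict in __defaults__ is the same object on every call, so writes to it survive between calls.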
Let's define a test that checks that a second call returns the cached value, then waits past the cache time to see the value change, and finally tests an immediate call.
def test_cached_with_immediate():
    time_a = get_time()  # Should be current
    time_b = get_time()  # Should be cached time_a
    assert time_a == time_b
    # Sleep past cache and see value change
    time.sleep(1)
    time_c = get_time()
    assert time_c > time_a
    # Ignore cache and get current value
    time_d = get_time(immediate=True)
    assert time_d > time_c
This test passes, and it is one of the slowest tests I've written in a while: 1.33 seconds runtime. Aren't we glad I didn't make it a 30 second cache? Testing would take a full commercial break.
This works how we want, but we need to add this caching to multiple hardware calls. So we just copy all the parts of this into each function and move along, right?
Whoa, there. Any time you think about copying and pasting code around, you are doing things wrong. Python has a mechanism to easily modify functions, called decoration. (It's not just for holidays anymore!)
Decorators
I'm going to start simple and work up, because a decorator that adds the above functionality is not simple. Functions in Python are first-class objects: you can use them as arguments and pass them around. You can also return them, so it is possible to make a function that creates and returns a function. Let's slowly work our way up in complexity.
This is a simple function and call. Nothing should be new here.
def first(value):
    return 'first({})'.format(value)
print(first(1))
first(1)
This passes in a function and executes it inside of outer.
def outer(func, value):
    return func(value)
print(outer(first, 2))
first(2)
Here we show that you can define functions inside of hold_function. These can only be executed inside that function, but notice how they have access to the value variable defined outside their own function definitions.
def hold_function(value):
    def inner_1():
        return 'inner_1({})'.format(value)
    def inner_2():
        return 'inner_2({})'.format(value)
    print(inner_1())
    print(inner_2())
hold_function(3)
inner_1(3)
inner_2(3)
Here return_function creates a function and returns it. We assign the result to my_func and it becomes a callable function.
def return_function():
    def internal_function():
        return 'internal_function'
    return internal_function
my_func = return_function()
print(my_func())
internal_function
This is the simplest version of a decorator. Calling my_decorator allows you to pass in a function and wrap code around it. We call my_function first and just get the single output. Then we reassign my_function to the internal wrapper function returned from my_decorator. Executing this shows the wrapping, with the original call in the middle.
def my_decorator(func):
    def wrapper(*args):
        print("Something is done before function call.")
        func()
        print("Something is done after function call")
    return wrapper

def my_function():
    print("Wheee!")
my_function()
my_function = my_decorator(my_function)
my_function()
Wheee!
Something is done before function call.
Wheee!
Something is done after function call
This uses the same my_decorator as the previous example, but with Python's syntactic sugar for applying decorators. Using @my_decorator above a function definition is equivalent to add_sugar = my_decorator(add_sugar).
@my_decorator
def add_sugar():
    print("Syntactic Sugar")
add_sugar()
Something is done before function call.
Syntactic Sugar
Something is done after function call
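One side effect worth knowing before we go further: after decoration, the function takes on the inner wrapper's identity. Here is a small sketch (with hypothetical decorator names) showing the problem, and how functools.wraps, which we'll use later, fixes it:

```python
from functools import wraps

def naive_decorator(func):
    def wrapper(*args):
        return func(*args)
    return wrapper

def nice_decorator(func):
    @wraps(func)  # copies __name__, __doc__, etc. from func onto wrapper
    def wrapper(*args):
        return func(*args)
    return wrapper

@naive_decorator
def add_sugar():
    """Prints a sweet message."""
    print("Syntactic Sugar")

@nice_decorator
def add_spice():
    """Prints a spicy message."""
    print("Syntactic Spice")

print(add_sugar.__name__)  # wrapper -- the original identity is lost
print(add_sugar.__doc__)   # None
print(add_spice.__name__)  # add_spice
print(add_spice.__doc__)   # Prints a spicy message.
```

This matters for debugging and for tools like pytest and Sphinx that read function names and docstrings.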
Common decorators
The first decorators you run into in Python are generally in classes and object-oriented programming. Some examples are @classmethod, indicating the method operates at class level; @staticmethod, indicating a method that doesn't require object state to run; and @property, making a method act like an attribute of the class.
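As a quick illustration of all three in one place (using a made-up Thermometer class, not from my project):

```python
class Thermometer:
    def __init__(self, raw):
        self._raw = raw

    @classmethod
    def default(cls):
        # Class level: receives the class itself, not an instance
        return cls(0.0)

    @staticmethod
    def to_fahrenheit(celsius):
        # Needs no class or instance state at all
        return celsius * 9 / 5 + 32

    @property
    def reading(self):
        # Accessed like an attribute, no parentheses
        return self._raw

t = Thermometer(21.5)
print(t.reading)                     # 21.5
print(Thermometer.to_fahrenheit(0))  # 32.0
print(Thermometer.default().reading) # 0.0
```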
A complex caching decorator
Now that we know what decorators are and how to make them, I'm going to throw a fairly complex one at you and try to explain it. I'll include the full version of my decorators.py file in two parts and try to talk you through it.
I'm importing time and wraps for the next block of code. The simple_decorator decorator is used to copy the properties of the original function onto the one returned. This makes the result behave a little better in various ways when in use. It isn't strictly needed, but it makes things nicer.
import time
from functools import wraps
def simple_decorator(decorator):
    """
    This decorator can be used to turn simple functions
    into well-behaved decorators, so long as the decorators
    are fairly simple. If a decorator expects a function and
    returns a function (no descriptors), and if it doesn't
    modify function attributes or docstring, then it is
    eligible to use this. Simply apply @simple_decorator to
    your decorator and it will automatically preserve the
    docstring and function attributes of functions to which
    it is applied.
    """
    def new_decorator(f):
        g = decorator(f)
        g.__name__ = f.__name__
        g.__doc__ = f.__doc__
        g.__dict__.update(f.__dict__)
        return g
    # Now a few lines needed to make simple_decorator itself
    # be a well-behaved decorator.
    new_decorator.__name__ = decorator.__name__
    new_decorator.__doc__ = decorator.__doc__
    new_decorator.__dict__.update(decorator.__dict__)
    return new_decorator
Here is our cached_with_immediate decorator definition. Since the cache timing will vary between uses, it is a passed-in argument. You can see an example of use at the bottom of the docstring. The _cached_with_immediate(main_func) function is what the decorator returns; you can see this on the last line.
The _decorator function has *args to catch positional and **kwargs to catch named arguments to the original function. We add our _cached argument, which we don't expect anyone to pass in, and the immediate named parameter to allow a cache-less call. The code inside that function looks very similar to what we had up top, because we do the same thing wrapped around the call to main_func(*args, **kwargs).
The functools.wraps call is used to give the returned function the proper name, instead of the inner function's name.
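Note that wraps(main_func)(_decorator), as used below, is just decorator syntax spelled out as plain function calls. The two forms in this sketch (with hypothetical names) are equivalent:

```python
from functools import wraps

def original():
    """The real docstring."""
    return 42

def _decorator():
    return original()

# Form 1: apply wraps as a plain function call, as the decorator code does
wrapped_a = wraps(original)(_decorator)

# Form 2: the same thing with @ syntax
@wraps(original)
def wrapped_b():
    return original()

print(wrapped_a.__name__, wrapped_b.__name__)  # original original
print(wrapped_a(), wrapped_b())                # 42 42
```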
def cached_with_immediate(call_time):
    """
    Decorator that only calls expensive operations if past the last call time.
    Can specify immediate=True to make a call ignoring the cached condition.

    This is useful for using values of long-running hardware processes when an
    immediate value is not normally needed, such as a temperature conversion
    that varies much more slowly than code may call it. A few-millisecond
    hardware process returns much faster if called often, as the hardware
    query and conversion are skipped.

    Uses the property of the default _cached dictionary staying with the
    function definition.

    Example:
        @decorators.cached_with_immediate(call_time=30)
        def long_time_to_run_normally():
            return something_that_took_a_long_time_to_get

    Call the function with immediate=True to ignore caching.
    """
    @simple_decorator
    def _cached_with_immediate(main_func):
        def _decorator(*args, _cached={'last_call': 0, 'value': None}, immediate=False, **kwargs):
            cur_time = time.time()
            if immediate or (cur_time - _cached['last_call']) > call_time:
                _cached['last_call'] = cur_time
                _cached['value'] = main_func(*args, **kwargs)
            return _cached['value']
        return wraps(main_func)(_decorator)
    return _cached_with_immediate
The reason this has one more level of functions than the previous decorators is that the invocation of the decorator is itself a function call. (The use is @function() instead of @function, so decorating first creates a decorator using the arguments given.)
In operation, cached_with_immediate is called when defining the decorated function with @cached_with_immediate(call_time=30). This returns the _cached_with_immediate function, which is executed when the function is decorated. The act of decorating returns the _decorator function, which replaces the original main_func.
Makes sense? Hopefully.
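If the three levels are still fuzzy, here is the same shape with everything non-essential stripped out: a hypothetical repeat(times) decorator. Level one takes the decorator's arguments, level two takes the function being decorated, and level three replaces it:

```python
from functools import wraps

def repeat(times):
    # Level 1: called as repeat(times=3); returns the actual decorator
    def _repeat(main_func):
        # Level 2: receives the function being decorated
        @wraps(main_func)
        def _decorator(*args, **kwargs):
            # Level 3: replaces main_func; runs it `times` times
            result = None
            for _ in range(times):
                result = main_func(*args, **kwargs)
            return result
        return _decorator
    return _repeat

calls = []

@repeat(times=3)
def record(x):
    calls.append(x)
    return x

record('a')
print(calls)  # ['a', 'a', 'a']
```

Swap the repetition loop for the time check and you have the shape of cached_with_immediate.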
Testing the decorator
Below is the code I'm using to test my decorator. Notice how this get_time function behaves the same as the function we defined at the top of this post, just using our decorator. I actually created the earlier version from this one, by replacing the decorator functionality with inline code, which is why the tests are exactly the same.
import pytest

@cached_with_immediate(call_time=1)  # Will cache for 1 second
def get_time():
    time.sleep(0.01)  # Assure time changes between calls
    return float(time.time())

@pytest.mark.slow
def test_cached_with_immediate():
    time_a = get_time()  # Should be current
    time_b = get_time()  # Should be cached time_a
    assert time_a == time_b
    # Sleep past cache and see value change
    time.sleep(1)
    time_c = get_time()
    assert time_c > time_a
    # Ignore cache and get current value
    time_d = get_time(immediate=True)
    assert time_d > time_c
If you want to look at the code, this is part of my rpi-hardware project on GitHub. Testing is in tests/test_util.py and the decorator code is in src/rpi_hardware/util/decorators.py.
That is it for this one, hopefully it was helpful.
Part 4 of 9 in the Python Tips series.
Series Start | Python: Functions and Mutable Defaults | Python: Unders and Dunders