Decorators caching - no cache

If we have a function that always returns the same result for a given set of parameters (no randomness, no time dependency, no external state), then we might be able to trade some memory to gain some speed. We can use a cache to remember the result the first time we call the function and return the stored result, without redoing the computation, on every subsequent call with the same arguments.

First let’s see a case without cache. Each call will execute the function and do the (expensive) computation.


def compute(x, y):
    print(f"Called with {x} and {y}")
    # some long computation here
    return x + y

if __name__ == '__main__':
    print(compute(2, 3))
    print(compute(3, 4))
    print(compute(2, 3))

Called with 2 and 3
5
Called with 3 and 4
7
Called with 2 and 3
5

We can verify this behavior with pytest, using the capsys fixture to capture the printed output. The test imports the function from the file above, saved as no_cache.py:
from no_cache import compute

def test_compute(capsys):
    assert compute(2, 3) == 5
    out, err = capsys.readouterr()
    assert err == ''
    assert out == 'Called with 2 and 3\n'

    assert compute(3, 4) == 7
    out, err = capsys.readouterr()
    assert err == ''
    assert out == 'Called with 3 and 4\n'

    assert compute(2, 3) == 5
    out, err = capsys.readouterr()
    assert err == ''
    assert out == 'Called with 2 and 3\n'
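
As a preview of where this is going, here is a minimal sketch of the cached version using the standard library's functools.lru_cache decorator (a later section may build a cache decorator by hand; this is just the built-in equivalent):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache keyed by the arguments
def compute(x, y):
    print(f"Called with {x} and {y}")
    # some long computation here
    return x + y

if __name__ == '__main__':
    print(compute(2, 3))
    print(compute(3, 4))
    print(compute(2, 3))  # served from the cache; the body does not run again
```

With the decorator in place, calling compute(2, 3) a second time prints nothing: the result is looked up instead of recomputed. Note that lru_cache requires the arguments to be hashable.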