:mod:`cachetools` --- Extensible memoizing collections and decorators
=======================================================================

.. module:: cachetools

This module provides various memoizing collections and decorators,
including a variant of the Python 3 Standard Library
:func:`functools.lru_cache` function decorator.

.. code-block:: pycon

   >>> from cachetools import LRUCache
   >>> cache = LRUCache(maxsize=2)
   >>> cache.update([('first', 1), ('second', 2)])
   >>> cache
   LRUCache([('second', 2), ('first', 1)], maxsize=2, currsize=2)
   >>> cache['third'] = 3
   >>> cache
   LRUCache([('second', 2), ('third', 3)], maxsize=2, currsize=2)
   >>> cache['second']
   2
   >>> cache['fourth'] = 4
   >>> cache
   LRUCache([('second', 2), ('fourth', 4)], maxsize=2, currsize=2)

For the purpose of this module, a *cache* is a mutable_ mapping_ of a
fixed maximum *size*.  When the cache is full, i.e. when adding another
item would cause the cache's size to exceed its maximum size, the cache
must choose which item(s) to discard based on a suitable `cache
algorithm`_.

In general, a cache's size is the sum of the sizes of its items.  If
the size of each item is 1, a cache's size is equal to the number of
its items, i.e. `len(cache)`.  An item's size may also be a property
or function of its value, e.g. the result of :func:`sys.getsizeof`, or
:func:`len` for string and sequence values.

This module provides various cache implementations based on different
cache algorithms, as well as decorators for easily memoizing function
and method calls.


Cache Implementations
------------------------------------------------------------------------

This module provides several classes implementing caches using
different cache algorithms.  All these classes derive from class
:class:`Cache`, which in turn derives from
:class:`collections.abc.MutableMapping`, providing additional properties
:attr:`maxsize` and :attr:`currsize` to retrieve the maximum and
current size of the cache.

:class:`Cache` also features a static method :meth:`getsizeof`, which
returns the size of a given item and may be overridden by subclasses.
The default implementation of :meth:`getsizeof` returns :const:`1`,
irrespective of its `value` argument.  For convenience, all cache
classes also accept an optional constructor parameter `getsizeof`,
which may specify a function of one argument used to extract the size
of an item's value instead of the class's :meth:`getsizeof` method.

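For example, a cache that holds at most ten characters' worth of
string values can be created by passing :func:`len` as the `getsizeof`
argument (a minimal sketch; keys and values are illustrative only):

.. code-block:: pycon

   >>> from cachetools import LRUCache
   >>> cache = LRUCache(maxsize=10, getsizeof=len)
   >>> cache['spam'] = 'tenletters'
   >>> cache.currsize
   10
   >>> cache['eggs'] = 'four'  # would exceed maxsize, so 'spam' is evicted
   >>> cache.currsize
   4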

.. autoclass:: Cache
   :members:

.. autoclass:: LRUCache
   :members:

.. autoclass:: LFUCache
   :members:

.. autoclass:: RRCache
   :members:

.. autoclass:: TTLCache
   :members:

   Note that a cache item may expire at *any* time, so iterating over
   the items of a :class:`TTLCache` may raise :class:`KeyError` or
   :class:`RuntimeError` unexpectedly::

      from cachetools import TTLCache
      import time

      cache = TTLCache(maxsize=100, ttl=1)
      cache.update({1: 1, 2: 2, 3: 3})
      time.sleep(1)  # wait for all items to expire

      try:
          for key in cache:
              try:
                  print(cache[key])
              except KeyError:
                  # the item expired between iteration and lookup
                  print('Key %r has expired' % key)
      except RuntimeError as e:
          # the underlying mapping changed size during iteration
          print(e)


Function Decorators
------------------------------------------------------------------------

This module provides several memoizing function decorators compatible
with --- though not necessarily as efficient as --- the Python 3
Standard Library :func:`functools.lru_cache` decorator.

In addition to a `maxsize` parameter, all decorators feature optional
arguments, which should be specified as keyword arguments for
compatibility with future extensions:

- `typed`, if set to :const:`True`, will cause function arguments of
  different types to be cached separately (see the example following
  this list).

- `getsizeof` specifies a function of one argument that will be
  applied to each cache value to determine its size.  The default
  value is :const:`None`, which will assign each item an equal size of
  :const:`1`.

- `lock` specifies a function of zero arguments that returns a
  `context manager`_ to lock the cache when necessary.  If not
  specified, :class:`threading.RLock` will be used to synchronize
  access from multiple threads.
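
For example, with `typed` set to :const:`True`, calls with arguments
of different types are cached separately (a minimal sketch; `spam` is
just a placeholder function)::

   from cachetools import lru_cache

   @lru_cache(maxsize=16, typed=True)
   def spam(n):
       return n * 2

   spam(3)    # computed and cached for the int argument
   spam(3.0)  # cached separately, since 3.0 is a float, not an int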

The wrapped function is instrumented with :func:`cache_info` and
:func:`cache_clear` functions to provide information about cache
performance and clear the cache.  See the :func:`functools.lru_cache`
documentation for details.
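
As a brief illustration (a sketch only; the recursive `fib` function
stands in for any expensive computation), the instrumented functions
can be used as follows::

   from cachetools import lru_cache

   @lru_cache(maxsize=32)
   def fib(n):
       # compute the n-th Fibonacci number, memoizing intermediate results
       return n if n < 2 else fib(n - 1) + fib(n - 2)

   fib(31)                  # fast, since intermediate results are cached
   print(fib.cache_info())  # hit/miss statistics, as with functools.lru_cache
   fib.cache_clear()        # discard all cached results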

Unlike :func:`functools.lru_cache`, these decorators do not support
setting `maxsize` to zero or :const:`None`.

.. decorator:: lfu_cache(maxsize=128, typed=False, getsizeof=None, lock=threading.RLock)

   Decorator that wraps a function with a memoizing callable that
   saves up to `maxsize` results based on a Least Frequently Used
   (LFU) algorithm.

.. decorator:: lru_cache(maxsize=128, typed=False, getsizeof=None, lock=threading.RLock)

   Decorator that wraps a function with a memoizing callable that
   saves up to `maxsize` results based on a Least Recently Used (LRU)
   algorithm.

.. decorator:: rr_cache(maxsize=128, typed=False, getsizeof=None, lock=threading.RLock)

   Decorator that wraps a function with a memoizing callable that
   saves up to `maxsize` results based on a Random Replacement (RR)
   algorithm.


Method Decorators
------------------------------------------------------------------------

.. decorator:: cachedmethod(cache, typed=False)

   `cache` specifies a function of one argument that, when passed
   `self`, will return a *cache* for the respective instance or
   class.  Multiple methods of an object or class may share the same
   cache.

   One advantage of the `@cachedmethod` decorator over the similar
   function decorators is that cache properties such as `maxsize` can
   be set at runtime::

     import operator
     import urllib.request

     from cachetools import LRUCache, cachedmethod

     class CachedPEPs(object):

         def __init__(self, cachesize):
             self.cache = LRUCache(maxsize=cachesize)

         @cachedmethod(operator.attrgetter('cache'))
         def get_pep(self, num):
             """Retrieve text of a Python Enhancement Proposal"""
             url = 'http://www.python.org/dev/peps/pep-%04d/' % num
             with urllib.request.urlopen(url) as s:
                 return s.read()

     peps = CachedPEPs(cachesize=10)
     print("PEP #1: %s" % peps.get_pep(1))

   Note that no locking will be performed on the object returned by
   `cache(self)`, so dealing with concurrent access is entirely the
   responsibility of the user.
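
   Since the cache is an ordinary instance attribute in the example
   above, it may also be replaced at runtime, e.g. to change the
   maximum size (a sketch reusing the `CachedPEPs` class from above;
   results cached so far are discarded along with the old cache)::

     peps = CachedPEPs(cachesize=10)
     peps.get_pep(8)                     # cached in the original cache
     peps.cache = LRUCache(maxsize=100)  # subsequent calls use the new cache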


.. _mutable: http://docs.python.org/dev/glossary.html#term-mutable
.. _mapping: http://docs.python.org/dev/glossary.html#term-mapping
.. _cache algorithm: http://en.wikipedia.org/wiki/Cache_algorithms
.. _context manager: http://docs.python.org/dev/glossary.html#term-context-manager