returnn.util.lru_cache
lru_cache(), copied from Python functools, slightly adapted,
and extended with functions to check whether some key is cached.
- returnn.util.lru_cache.lru_cache(maxsize: int = 128, typed: bool = False)
Least-recently-used cache decorator.
If maxsize is set to None, the LRU features are disabled and the cache can grow without bound.
If typed is True, arguments of different types will be cached separately. For example, f(3.0) and f(3) will be treated as distinct calls with distinct results.
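The typed behavior can be sketched with the stdlib functools.lru_cache, from which this implementation is derived; the caching of f(3) and f(3.0) under distinct keys is the same in both:

```python
from functools import lru_cache

# Minimal sketch of typed=True, shown with the stdlib functools.lru_cache.

@lru_cache(maxsize=128, typed=True)
def square(x):
    return x * x

square(3)    # miss: cached under the int key 3
square(3.0)  # miss: cached separately under the float key 3.0
assert square.cache_info().misses == 2 and square.cache_info().hits == 0
```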
Arguments to the cached function must be hashable.
- Use f.cache_len() to see the current size of the cache.
- Use f.cache_set(*args, result, **kwargs) to set a value in the cache directly.
- Use f.cache_peek(*args, update_statistics=False, fallback=None, **kwargs) to peek into the cache, without ever calling the user function.
- View the cache statistics named tuple (hits, misses, maxsize, currsize) with f.cache_info().
- Clear the cache and statistics with f.cache_clear().
- Remove the oldest entry from the cache with f.cache_pop_oldest().
- Remove a specific entry from the cache with f.cache_pop(*args, fallback=not_specified, **kwargs).
- Set the maximum cache size to a new value with f.cache_set_maxsize(new_maxsize).
- Access the underlying function with f.__wrapped__.
See: https://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU)
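As a usage sketch, the basic decorator workflow is shown below with the part of the API that is shared with the stdlib functools.lru_cache (cache_info, cache_clear, __wrapped__); the returnn-specific helpers (cache_len, cache_peek, cache_pop_oldest, ...) follow the signatures listed above:

```python
from functools import lru_cache

# Basic decorator workflow, using only the subset of the API that is
# shared with the stdlib functools.lru_cache.

@lru_cache(maxsize=2)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(10)                  # fills the cache; repeated subcalls hit it
info = fib.cache_info()  # named tuple (hits, misses, maxsize, currsize)
assert info.maxsize == 2 and info.currsize == 2

fib.cache_clear()        # drop all entries and reset the statistics
assert fib.cache_info().currsize == 0

assert fib.__wrapped__ is not fib  # the undecorated user function
```

With maxsize=2 the least-recently-used entry is evicted once a third distinct argument arrives, which is why currsize never exceeds 2.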