#include <genesis/utils/containers/mru_cache.hpp>
Most Recently Used Cache.
The class offers a cache that maps from unique keys of type Key to values of type T, automatically loading elements as needed using the load_function. The cache only keeps a certain number of elements. That is, the elements that were least recently used are removed to avoid exceeding the capacity().
The main functions are fetch() and fetch_copy(), which retrieve an element from the cache, potentially loading it first, and potentially removing old elements. During a fetch, it is possible that one more element is kept in memory than the capacity allows, before removing the oldest one. This is done in order to recover from a load that failed (with an exception) without altering the state of the cache.
In order to use this class, the functor load_function has to be set before calling fetch(), fetch_copy() or touch(). The functor needs to take a const reference to the key type Key and return a mapped type T by value.
Example:
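A minimal sketch of the basic setup (not taken from the library's own documentation): it assumes that MruCache lives in namespace genesis::utils, uses file names as keys and file contents as values, and reads files with a plain std::ifstream as one possible load function.

    #include <genesis/utils/containers/mru_cache.hpp>

    #include <fstream>
    #include <sstream>
    #include <string>

    int main()
    {
        using namespace genesis::utils;

        // Cache that maps file names to file contents, keeping at most 5 files in memory.
        MruCache<std::string, std::string> cache{ 5 };

        // Load function: read a file from disk whenever its key is not in the cache yet.
        cache.load_function = []( std::string const& file_name ){
            std::ifstream stream( file_name );
            std::ostringstream content;
            content << stream.rdbuf();
            return content.str();
        };

        // The first access loads the file; subsequent accesses are served from the cache.
        auto const& content = cache.fetch( "test.txt" );
        (void) content;
    }

The first fetch() of a key calls the load function; further fetches of the same key are served from the cache until the element is evicted.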
Caching large files in a multi-threaded environment is a main use case of this class. In order to avoid fetching actual copies of the data each time, std::shared_ptr can be used. See fetch_copy() for details.
Lastly, a second functor release_function can be used to specify a function that is executed before an element is removed from the cache. If this functor is not set, the element is simply removed without any prior call.
Thread safety: As fetch() returns by reference, it is not thread safe. A fetched element can be removed at any time when other threads invoke fetch() again, leading to a dangling reference. Thus, for multi-threaded use, fetch_copy() can be used, which is guarded and returns a copy of the element instead. See there for details.
Definition at line 100 of file mru_cache.hpp.
Public Member Functions

    MruCache () = default
        Default constructor. Uses a capacity() of 0, that is, limitless.
    MruCache (MruCache &&) = default
    MruCache (MruCache const &) = delete
    MruCache (size_t capacity)
        Construct a cache with a given capacity.
    ~MruCache ()
    iterator begin ()
        Get an iterator to the begin of the cache, that is, the most recently used element.
    const_iterator begin () const
        Get a const iterator to the begin of the cache, that is, the most recently used element.
    size_t cache_hits () const
        Number of cache hits.
    size_t cache_misses () const
        Number of cache misses.
    size_type capacity () const
        Return the currently set capacity of the cache, i.e., its maximal size.
    void capacity (size_t value)
        Set the capacity of the cache, i.e., its maximal size.
    const_iterator cbegin ()
        Get a const iterator to the begin of the cache, that is, the most recently used element.
    const_iterator cend ()
        Get a const iterator to past-the-end of the cache, that is, one element after the least recently used one.
    void clear ()
        Clear the cache.
    bool contains (key_type const &key)
        Return whether an element is currently in the cache.
    bool empty () const
        Return whether the cache is currently empty, that is, has no elements loaded.
    iterator end ()
        Get an iterator to past-the-end of the cache, that is, one element after the least recently used one.
    const_iterator end () const
        Get a const iterator to past-the-end of the cache, that is, one element after the least recently used one.
    mapped_type & fetch (key_type const &key)
        Get an element.
    mapped_type fetch_copy (key_type const &key)
        Get an element by copy.
    MruCache & operator= (MruCache &&) = default
    MruCache & operator= (MruCache const &) = delete
    size_type size () const
        Get the current count of elements being loaded in the cache.
    void touch (key_type const &key)
        Bring an element to the front, and load it if needed.

Public Types

    using const_iterator = typename std::list< value_type >::const_iterator
    using iterator = typename std::list< value_type >::iterator
    using key_type = Key
    using mapped_type = T
    using size_type = size_t
    using value_type = std::pair< const key_type, mapped_type >

Public Attributes

    std::function< mapped_type(key_type const &) > load_function
        Function to load an element into the cache if it is being fetched but not there yet.
    std::function< void(key_type const &, mapped_type &) > release_function
        Function to be called when an element is removed from the cache.
MruCache () = default
Default constructor. Uses a capacity() of 0, that is, limitless.
A capacity of 0 means limitless, meaning that no elements are ever removed.
MruCache (size_t capacity)  [inline, explicit]
Construct a cache with a given capacity.
A capacity of 0 means limitless, meaning that no elements are ever removed.
Definition at line 135 of file mru_cache.hpp.
~MruCache ()  [inline]
Definition at line 139 of file mru_cache.hpp.
iterator begin ()  [inline]
Get an iterator to the begin of the cache, that is, the most recently used element.
Definition at line 158 of file mru_cache.hpp.
const_iterator begin () const  [inline]
Get a const iterator to the begin of the cache, that is, the most recently used element.
Definition at line 166 of file mru_cache.hpp.
size_t cache_hits () const  [inline]
Number of cache hits.
Returns the number of times that an element could be retrieved without calling the load function.
Definition at line 243 of file mru_cache.hpp.
size_t cache_misses () const  [inline]
Number of cache misses.
This includes the (probably rare) cases where a multithreaded access led to the same element being loaded multiple times. That is, this counter returns the number of times that the load function was called.
Definition at line 255 of file mru_cache.hpp.
size_type capacity () const  [inline]
Return the currently set capacity of the cache, i.e., its maximal size.
A capacity of 0 means limitless, that is, no elements are ever removed.
Definition at line 223 of file mru_cache.hpp.
void capacity (size_t value)  [inline]
Set the capacity of the cache, i.e., its maximal size.
Setting the capacity to 0 means limitless, that is, no elements are ever removed.
Definition at line 445 of file mru_cache.hpp.
const_iterator cbegin ()  [inline]
Get a const iterator to the begin of the cache, that is, the most recently used element.
Definition at line 192 of file mru_cache.hpp.
const_iterator cend ()  [inline]
Get a const iterator to past-the-end of the cache, that is, one element after the least recently used one.
Definition at line 200 of file mru_cache.hpp.
void clear ()  [inline]
Clear the cache.
Definition at line 460 of file mru_cache.hpp.
bool contains (key_type const &key)  [inline]
Return whether an element is currently in the cache.
Thread safety: Safe. But the element might be removed by other threads soon.
Definition at line 415 of file mru_cache.hpp.
bool empty () const  [inline]
Return whether the cache is currently empty, that is, has no elements loaded.
Definition at line 232 of file mru_cache.hpp.
iterator end ()  [inline]
Get an iterator to past-the-end of the cache, that is, one element after the least recently used one.
Definition at line 175 of file mru_cache.hpp.
const_iterator end () const  [inline]
Get a const iterator to past-the-end of the cache, that is, one element after the least recently used one.
Definition at line 184 of file mru_cache.hpp.
mapped_type & fetch (key_type const &key)  [inline]
Get an element.
This is the main function of the class. It gets an element given its key, either by retrieving it from the cache, or loading it into the cache first, if needed.
If loading an element leads to the capacity of the cache being exceeded, the least recently used element is removed. The removal is done after loading the new element. This means that the memory usage can be one more element than the capacity() allows. This is done to make sure that an exception thrown when loading the new element does not lead to the cache being altered.
Thread safety: Not thread safe, because it does not use a guard, and because it returns a reference, which can become dangling if other threads fetch new elements, leading to the referenced one being removed. For multi-threaded use, see fetch_copy().
Caveat: Even in single-threaded use, a variable storing a reference obtained from fetch() can become dangling if more new elements are fetched or touched than the capacity allows. Thus, the variable needs to go out of scope before this happens. For example, a loop over keys that fetches an element at the beginning of each loop body and uses the reference only within that body, without calling fetch() again, is fine.
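A sketch of this safe single-threaded pattern, using an assumed toy cache from integers to strings (the key type, capacity, and load function are only for illustration):

    #include <genesis/utils/containers/mru_cache.hpp>

    #include <iostream>
    #include <string>
    #include <vector>

    int main()
    {
        using namespace genesis::utils;

        // Toy cache that maps numbers to their textual representation.
        MruCache<int, std::string> cache{ 3 };
        cache.load_function = []( int const& key ){
            return std::to_string( key );
        };

        // Fetch at the beginning of the loop body, and use the reference only
        // within that iteration, so that it cannot dangle when later
        // iterations evict elements.
        std::vector<int> keys{ 1, 2, 3, 4, 5, 1 };
        for( auto key : keys ) {
            auto const& element = cache.fetch( key );
            std::cout << element << "\n";
        }
    }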
Definition at line 286 of file mru_cache.hpp.
mapped_type fetch_copy (key_type const &key)  [inline]
Get an element by copy.
This works exactly the same as fetch(), but is thread safe and returns a copy. See fetch() for details.
Because the loading is not part of the mutex that makes this function thread safe, it is possible to load elements in parallel in different threads. However, when two threads need to load the same element at the same time, the loading may happen twice. Then, only the first thread that finishes loading stores the element in the cache, while the other result is discarded. This is done in order to allow parallel loading without the hassle of per-element locks.
If the cache is used in a multi-threaded environment and holds large elements, making actual copies might be too expensive. In that case, a neat trick is to store shared pointers to the elements instead:
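A sketch of this shared-pointer trick; the LargeData type and the file name are hypothetical placeholders for whatever expensive-to-load elements are being cached:

    #include <genesis/utils/containers/mru_cache.hpp>

    #include <memory>
    #include <string>

    // Stand-in for some large, expensive-to-load type; only for illustration.
    struct LargeData
    {
        explicit LargeData( std::string const& file_name )
            : source( file_name )
        {
            // A real class would load and hold the actual data here.
        }

        std::string source;
    };

    int main()
    {
        using namespace genesis::utils;

        // Store shared pointers instead of the elements themselves, so that
        // fetch_copy() only copies a pointer rather than the large data.
        MruCache<std::string, std::shared_ptr<LargeData>> cache{ 5 };
        cache.load_function = []( std::string const& file_name ){
            return std::make_shared<LargeData>( file_name );
        };

        // The returned shared pointer copy stays valid in the calling thread,
        // even if the element is later evicted from the cache by other threads.
        auto data = cache.fetch_copy( "large_file.dat" );
        (void) data;
    }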
As the control block of std::shared_ptr is thread safe, these shared pointer copies can stay alive in a thread that still needs the element, even if the element was removed from the cache by other threads in the meantime.
Definition at line 352 of file mru_cache.hpp.
size_type size () const  [inline]
Get the current count of elements being loaded in the cache.
Definition at line 212 of file mru_cache.hpp.
void touch (key_type const &key)  [inline]
Bring an element to the front, and load it if needed.
The function behaves just like fetch_copy(), but without returning the element. Useful to pre-load the cache.
Be aware, however, that having touched an element in multi-threaded use does not guarantee that it stays in the cache for long. Other threads might have fetched other elements in the meantime, leading to the removal of the touched one. In that case, it has to be loaded again when fetched later.
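A sketch of such pre-loading; the placeholder load function and the key list are only for illustration:

    #include <genesis/utils/containers/mru_cache.hpp>

    #include <string>
    #include <vector>

    int main()
    {
        using namespace genesis::utils;

        MruCache<std::string, std::string> cache{ 10 };
        cache.load_function = []( std::string const& key ){
            // Placeholder loader, only for illustration.
            return "contents of " + key;
        };

        // Warm up the cache for a known set of keys without using the elements yet.
        // Later fetch() calls for these keys are cache hits, as long as the
        // elements have not been evicted in the meantime.
        std::vector<std::string> keys{ "a.txt", "b.txt", "c.txt" };
        for( auto const& key : keys ) {
            cache.touch( key );
        }
    }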
Definition at line 431 of file mru_cache.hpp.
using const_iterator = typename std::list< value_type >::const_iterator
Definition at line 113 of file mru_cache.hpp.
using iterator = typename std::list< value_type >::iterator
Definition at line 112 of file mru_cache.hpp.
using key_type = Key
Definition at line 108 of file mru_cache.hpp.
using mapped_type = T
Definition at line 109 of file mru_cache.hpp.
using size_type = size_t
Definition at line 115 of file mru_cache.hpp.
using value_type = std::pair<const key_type, mapped_type>
Definition at line 110 of file mru_cache.hpp.
std::function< mapped_type( key_type const& )> load_function
Function to load an element into the cache if it is being fetched but not there yet.
Has to be set before calling fetch(), fetch_copy() or touch(), otherwise an exception is thrown.
Definition at line 543 of file mru_cache.hpp.
std::function< void( key_type const&, mapped_type& )> release_function
Function to be called when an element is removed from the cache.
This function is called whenever elements are removed from the cache, e.g., due to being the least recently used one, due to a call to clear(), or when the destructor is called.
Can be empty. In that case, nothing is called when elements are removed.
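A sketch of a possible release function that merely logs evictions; the keys, the placeholder load function, and the logging are only for illustration:

    #include <genesis/utils/containers/mru_cache.hpp>

    #include <iostream>
    #include <string>

    int main()
    {
        using namespace genesis::utils;

        MruCache<std::string, std::string> cache{ 2 };
        cache.load_function = []( std::string const& key ){
            // Placeholder loader, only for illustration.
            return "contents of " + key;
        };

        // Called right before an element is removed; here we just log the eviction.
        // A real use case might close file handles or free associated resources.
        cache.release_function = []( std::string const& key, std::string& value ){
            std::cout << "Removing " << key << " (" << value.size() << " bytes)\n";
        };

        // The third fetch exceeds the capacity of 2, so the least recently used
        // element ("a.txt") is released and removed.
        cache.fetch( "a.txt" );
        cache.fetch( "b.txt" );
        cache.fetch( "c.txt" );
    }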
Definition at line 553 of file mru_cache.hpp.