c++ algorithm data-structures lru

LRU cache design


A Least Recently Used (LRU) cache discards the least recently used items first. How do you design and implement such a cache class? The design requirements are as follows:

1) Find an item as fast as we can.

2) Once the cache misses and the cache is full, we need to evict the least recently used item as fast as possible.

How would you analyze and implement this in terms of design patterns and algorithm design?


Solution

  • A linked list + a hashtable of pointers to the linked list nodes is the usual way to implement LRU caches. This gives O(1) operations (assuming a decent hash function). An advantage of it being O(1) is that you can build a multithreaded version by simply locking the whole structure; you don't have to worry about granular locking.

    Briefly, the way it works (a sketch in code follows at the end of this answer):

    When a value is accessed, you move its node in the linked list to the head.

    When you need to remove a value from the cache, you remove from the tail end.

    When you add a value to the cache, you just place it at the head of the linked list.

    Thanks to doublep, here is a site with a C++ implementation: Miscellaneous Container Templates.
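
    For illustration, here is a minimal single-threaded sketch of that scheme using std::list and std::unordered_map. The class name LruCache and its get/put interface are placeholders of my own choosing (not taken from the library above), and it assumes the capacity is greater than zero:

        #include <cstddef>
        #include <list>
        #include <optional>
        #include <unordered_map>
        #include <utility>

        // Doubly linked list keeps entries in recency order (most recent at the
        // head); the hash map points each key at its list node, so lookups and
        // "move to head" are O(1) on average.
        template <typename K, typename V>
        class LruCache {
        public:
            explicit LruCache(std::size_t capacity) : capacity_(capacity) {}  // assumes capacity > 0

            // Returns the value if present and marks it as most recently used.
            std::optional<V> get(const K& key) {
                auto it = index_.find(key);
                if (it == index_.end()) return std::nullopt;        // cache miss
                items_.splice(items_.begin(), items_, it->second);  // move node to head
                return it->second->second;
            }

            // Inserts or updates a value; evicts the least recently used entry
            // (the list tail) when the cache is full.
            void put(const K& key, V value) {
                auto it = index_.find(key);
                if (it != index_.end()) {
                    it->second->second = std::move(value);
                    items_.splice(items_.begin(), items_, it->second);
                    return;
                }
                if (items_.size() == capacity_) {
                    index_.erase(items_.back().first);  // drop LRU key from the map
                    items_.pop_back();                  // ...and its node from the list tail
                }
                items_.emplace_front(key, std::move(value));
                index_[key] = items_.begin();
            }

        private:
            std::size_t capacity_;
            std::list<std::pair<K, V>> items_;  // head = most recently used
            std::unordered_map<K, typename std::list<std::pair<K, V>>::iterator> index_;
        };

    Note that std::list::splice relinks the existing node rather than copying it, so iterators stored in the map stay valid when an entry is moved to the head.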