I am solving the LeetCode LRU cache design problem - LeetCode LRU Cache:

Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.

Implement the LRUCache class:

LRUCache(int capacity) Initialize the LRU cache with positive size capacity.
int get(int key) Return the value of the key if the key exists, otherwise return -1.
void put(int key, int value) Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.
I designed it using a Queue and a HashMap, and I was able to pass 20 out of 22 test cases; the remaining test cases are timing out.
On searching, I found that a doubly linked list is the best way to implement it. I am curious why the queue and hash map approach times out and why a doubly linked list is the best way to solve this.
Below is my implementation:
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

class LRUCache {
    int capacity = 0;
    BlockingQueue<Integer> queue;                 // keys ordered from least to most recently used
    Map<Integer, Integer> map = new HashMap<>();  // key -> value

    public LRUCache(int capacity) {
        this.capacity = capacity;
        queue = new ArrayBlockingQueue<Integer>(capacity);
    }

    public int get(int key) {
        if (queue.contains(key)) {
            // move the key to the tail to mark it as most recently used
            queue.remove(key);
            queue.add(key);
            return map.get(key);
        }
        else
            return -1;
    }

    public void put(int key, int value) {
        if (queue.contains(key)) {
            // existing key: refresh its position and update the value
            queue.remove(key);
            queue.add(key);
            map.put(key, value);
        }
        else if (queue.size() < capacity) {
            // there is room left: just insert
            queue.add(key);
            map.put(key, value);
        }
        else {
            // cache is full: evict the least recently used key (head of the queue)
            int oldKey = queue.remove();
            map.remove(oldKey);
            queue.add(key);
            map.put(key, value);
        }
    }
}
The method calls queue.remove(key) and queue.contains(key) do not have O(1) time complexity. See the documentation on ArrayBlockingQueue, which mentions that this queue is "...backed by an array", i.e. it needs to scan the array to locate a given value, which is an O(n) operation. That alone is enough reason not to use it for this challenge. On top of that, the operations use locking to avoid concurrency problems, which makes them even slower. The documentation on remove has:

public boolean remove(Object o)
[...] Removal of interior elements in circular array based queues is an intrinsically slow and disruptive operation, so should be undertaken only in exceptional circumstances, ideally only when the queue is known not to be accessible by other threads.
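This is where a doubly linked list helps: if the hash map stores, for each key, a reference to the list node that holds it, an existing node can be unlinked and re-appended at the most-recently-used end in O(1), and eviction simply drops the node at the least-recently-used end, with no scanning at all. The following is only a rough sketch of that idea, with class and field names of my own choosing:

import java.util.HashMap;
import java.util.Map;

class HandRolledLRUCache {
    // Doubly linked list node; head side = least recently used, tail side = most recently used
    private static class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, Node> map = new HashMap<>(); // key -> node, O(1) lookup
    private final Node head = new Node(0, 0); // sentinel before the LRU end
    private final Node tail = new Node(0, 0); // sentinel after the MRU end

    HandRolledLRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;
        moveToTail(node); // mark as most recently used, O(1)
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {           // existing key: update the value and refresh its position
            node.value = value;
            moveToTail(node);
            return;
        }
        if (map.size() == capacity) { // cache full: evict the node right after the head sentinel
            Node lru = head.next;
            unlink(lru);
            map.remove(lru.key);
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        linkBeforeTail(fresh);
    }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkBeforeTail(Node node) {
        node.prev = tail.prev;
        node.next = tail;
        tail.prev.next = node;
        tail.prev = node;
    }

    private void moveToTail(Node node) {
        unlink(node);
        linkBeforeTail(node);
    }
}

Every operation touches only a constant number of node references plus one hash map lookup, which is what makes O(1) get and put possible.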
You mention doubly linked lists: Java even has a doubly linked list implementation that is combined with a hash map: LinkedHashMap.
The documentation mentions:
A special constructor is provided to create a linked hash map whose order of iteration is the order in which its entries were last accessed, from least-recently accessed to most-recently (access-order). This kind of map is well-suited to building LRU caches.
And that is exactly what you need here. The implementation could be:
import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache extends LinkedHashMap<Integer, Integer> {
    int capacity;

    public LRUCache(int capacity) {
        // Provision one slot more than the desired capacity, so no rehash is needed
        // when we allow a temporary overrun before deleting the eldest entry
        super(capacity + 1, 1f, true); // true enables access-order iteration, i.e. LRU behavior
        this.capacity = capacity;
    }

    // This method is called internally by put (and putAll) after a new entry
    // has been inserted. See the LinkedHashMap documentation.
    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return this.size() > this.capacity; // overrun detected: ask for removal
    }

    public int get(int key) {
        return getOrDefault(key, -1); // getOrDefault counts as an access in access-order mode
    }
}
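To illustrate the behavior, here is a small driver with made-up values; note that eviction follows access order, not insertion order:

class Demo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);  // capacity 2
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1));  // 1  -> key 1 becomes most recently used
        cache.put(3, 3);                   // evicts key 2, the least recently used
        System.out.println(cache.get(2));  // -1
        cache.put(4, 4);                   // evicts key 1
        System.out.println(cache.get(1));  // -1
        System.out.println(cache.get(3));  // 3
        System.out.println(cache.get(4));  // 4
    }
}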