Introduction
The Least Recently Used (LRU) cache is a classic problem that combines fast lookup with an eviction policy. You are asked to design a data structure that behaves like a cache:
- Retrieve an item in O(1) time.
- Insert or update an item in O(1) time.
- Evict the least recently used item when the cache exceeds capacity.
This problem tests your knowledge of data structure combinations, especially hash maps and doubly linked lists.
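Before looking at a concrete design, it helps to pin down the required operations. The sketch below is illustrative only; the `Cache` interface name and method shapes are assumptions, not part of the original problem statement.

```go
// Cache is a hypothetical interface capturing the required operations.
type Cache interface {
	Get(key int) int        // O(1): value for key, or -1 if absent
	Put(key int, value int) // O(1): insert or update; evict the LRU entry at capacity
}
```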
Problem Statement
Design a data structure that follows the LRU (Least Recently Used) caching strategy.
Implement the `LRUCache` type with the following operations:
```go
Constructor(capacity int) LRUCache // Initialize the LRU cache with a positive capacity
Get(key int) int                   // Return the value for key if it exists, otherwise -1
Put(key int, value int)            // Update the value if key exists, otherwise insert the key-value pair; if the cache is at capacity, evict the least recently used entry
```
Solution Design
We use two data structures:
- Hash Map: For O(1) access to cache entries.
- Doubly Linked List: To track the usage order (most recently used to least recently used).
How It Works
- New items are added to the front (most recently used).
- When an item is accessed via `Get`, its node is moved to the front.
- When the cache is full and a new item is inserted, the node at the back of the list (the least recently used) is evicted.
- Dummy head and tail sentinel nodes bracket the list, so insertions and removals never need nil checks.
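A short trace makes the ordering concrete. It uses a capacity of 2 and the `Constructor`, `Get`, and `Put` defined in the implementation that follows; the list is shown from front (most recently used) to back (least recently used).

```go
lru := Constructor(2)
lru.Put(1, 1) // list: [1]       map: {1}
lru.Put(2, 2) // list: [2, 1]    map: {1, 2}
lru.Get(1)    // list: [1, 2]    key 1 moved to the front
lru.Put(3, 3) // list: [3, 1]    key 2 was least recently used and is evicted
lru.Get(2)    // returns -1      key 2 is no longer cached
```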
Go Implementation
```go
package main

// Node is a doubly linked list node holding one cache entry.
type Node struct {
	key, value int
	prev, next *Node
}

// LRUCache combines a hash map (for O(1) lookup) with a doubly linked
// list ordered from most recently used (front) to least recently used (back).
type LRUCache struct {
	capacity int
	cache    map[int]*Node
	head     *Node // sentinel: head.next is the most recently used node
	tail     *Node // sentinel: tail.prev is the least recently used node
}

// Constructor initializes the cache with the given positive capacity.
func Constructor(capacity int) LRUCache {
	head := &Node{}
	tail := &Node{}
	head.next = tail
	tail.prev = head
	return LRUCache{
		capacity: capacity,
		cache:    make(map[int]*Node),
		head:     head,
		tail:     tail,
	}
}

// Get returns the value for key and marks it as most recently used,
// or -1 if the key is not present.
func (c *LRUCache) Get(key int) int {
	if node, exists := c.cache[key]; exists {
		c.moveToFront(node)
		return node.value
	}
	return -1
}

// Put inserts or updates the key-value pair, evicting the least recently
// used entry first if the cache is already at capacity.
func (c *LRUCache) Put(key int, value int) {
	if node, exists := c.cache[key]; exists {
		node.value = value
		c.moveToFront(node)
		return
	}
	if len(c.cache) == c.capacity {
		c.removeLRU()
	}
	newNode := &Node{key: key, value: value}
	c.cache[key] = newNode
	c.addToFront(newNode)
}

// --- Helper Methods ---

// moveToFront detaches node and reinserts it right after the head sentinel.
func (c *LRUCache) moveToFront(node *Node) {
	c.removeNode(node)
	c.addToFront(node)
}

// addToFront links node directly after the head sentinel.
func (c *LRUCache) addToFront(node *Node) {
	node.prev = c.head
	node.next = c.head.next
	c.head.next.prev = node
	c.head.next = node
}

// removeNode unlinks node from the list.
func (c *LRUCache) removeNode(node *Node) {
	prev := node.prev
	next := node.next
	prev.next = next
	next.prev = prev
}

// removeLRU evicts the node just before the tail sentinel (the least recently used).
func (c *LRUCache) removeLRU() {
	lru := c.tail.prev
	c.removeNode(lru)
	delete(c.cache, lru.key)
}
```
Time and Space Complexity
| Operation | Time | Space |
|-----------|------|-------------|
| `Get`     | O(1) | O(capacity) |
| `Put`     | O(1) | O(capacity) |
Example Usage
```go
lru := Constructor(2)
lru.Put(1, 1)
lru.Put(2, 2)
lru.Get(1)    // returns 1
lru.Put(3, 3) // evicts key 2
lru.Get(2)    // returns -1
lru.Put(4, 4) // evicts key 1
lru.Get(1)    // returns -1
lru.Get(3)    // returns 3
lru.Get(4)    // returns 4
```
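As a point of comparison, a similar cache can be sketched on top of Go's standard `container/list` package, which already provides a doubly linked list with `PushFront`, `MoveToFront`, and `Back`. The `listLRU`, `entry`, and `newListLRU` names below are illustrative assumptions; the trade-off versus the hand-rolled list above is the type assertion needed on `Element.Value`.

```go
package main

import (
	"container/list"
	"fmt"
)

// entry is the payload stored in each list element.
type entry struct {
	key, value int
}

// listLRU keeps usage order in a container/list.List: front = most recently used.
type listLRU struct {
	capacity int
	order    *list.List
	items    map[int]*list.Element // key -> element in order
}

func newListLRU(capacity int) *listLRU {
	return &listLRU{
		capacity: capacity,
		order:    list.New(),
		items:    make(map[int]*list.Element),
	}
}

func (l *listLRU) Get(key int) int {
	if el, ok := l.items[key]; ok {
		l.order.MoveToFront(el)
		return el.Value.(*entry).value
	}
	return -1
}

func (l *listLRU) Put(key, value int) {
	if el, ok := l.items[key]; ok {
		el.Value.(*entry).value = value
		l.order.MoveToFront(el)
		return
	}
	if l.order.Len() == l.capacity {
		oldest := l.order.Back() // least recently used
		l.order.Remove(oldest)
		delete(l.items, oldest.Value.(*entry).key)
	}
	l.items[key] = l.order.PushFront(&entry{key: key, value: value})
}

func main() {
	c := newListLRU(2)
	c.Put(1, 1)
	c.Put(2, 2)
	c.Put(3, 3)           // evicts key 1
	fmt.Println(c.Get(1)) // -1
	fmt.Println(c.Get(3)) // 3
}
```

Both versions give O(1) `Get` and `Put`; the standard-library version trades the explicit pointer bookkeeping for a type assertion on each access.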
Conclusion
The LRU Cache problem demonstrates how combining two simple data structures, a hash map and a doubly linked list, yields constant-time access under a fixed memory budget. Understanding this implementation is valuable for performance-sensitive applications such as browser caches, in-memory caches, and system-level design.