Implement an LRU Cache

Data Structures
Hard
Google

Design and implement a Least Recently Used (LRU) cache. It should support `get` and `put` operations in $O(1)$ time complexity. This typically requires a Doubly Linked List and a Hash Map.

Why Interviewers Ask This

Google asks this to evaluate your mastery of hybrid data structures and your ability to meet strict time-complexity constraints. Interviewers specifically test whether you understand how to combine a Hash Map for O(1) lookups with a Doubly Linked List for O(1) reordering, ensuring you can design systems that handle high-frequency cache eviction efficiently.

How to Answer This Question

1. Clarify requirements: Confirm capacity limits, behavior on a full cache (eviction), and thread-safety needs.
2. Propose the architecture: Explicitly state you will use a Hash Map storing keys and pointers to nodes in a Doubly Linked List.
3. Define the node structure: Explain that each node holds a key, a value, and prev/next pointers to maintain order.
4. Walk through 'get': Describe moving accessed items to the head via pointer manipulation and returning -1 if the key is missing.
5. Walk through 'put': Detail updating existing values or inserting new nodes at the head, evicting the tail node if capacity is exceeded.
6. Analyze complexity: Conclude by confirming both operations are O(1) due to direct hashing and constant-time list adjustments.
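The steps above can be sketched in Python. This is a minimal illustration, not a definitive implementation: the helper names (`_remove`, `_add_front`) and the use of sentinel (dummy) head/tail nodes are one common convention, chosen here to keep the pointer logic free of null checks.

```python
class Node:
    """Doubly-linked-list node holding a key/value pair."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}  # key -> Node, for O(1) lookup
        # Sentinel nodes: head side = most recently used,
        # tail side = least recently used.
        self.head = Node(None, None)
        self.tail = Node(None, None)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node in O(1) by rewiring both neighbors.
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        # Insert a node right after the head sentinel (MRU position).
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_front(node)  # mark as most recently used
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._remove(node)
            self._add_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev  # least recently used node
            self._remove(lru)
            del self.map[lru.key]  # keep the map in sync
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
```

Storing the key inside each node matters: on eviction, the tail node's key is needed to delete the matching Hash Map entry.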

Key Points to Cover

  • Explicitly mention using a Hash Map for O(1) key lookup
  • Describe using a Doubly Linked List to maintain access order
  • Explain the logic for moving accessed nodes to the head
  • Detail the eviction strategy removing the tail node when capacity is reached
  • Confirm both get and put operations achieve strict O(1) time complexity

Sample Answer

To implement an LRU Cache with O(1) get and put operations, I will combine a Hash Map with a Doubly Linked List. The Hash Map will store keys mapping directly to the corresponding nodes in the linked list, allowing instant access. The Doubly Linked List will maintain the usage order, where the most recently used item sits at the head and the least recently used item sits at the tail.

For the 'get' operation, if the key exists in the map, I retrieve the node, move it to the front of the list to mark it as recently used, and return its value. If the key is missing, I return -1. For 'put', if the key already exists, I update the value and move the node to the front. If the key is new, I create a new node and insert it at the head. If the cache exceeds its capacity after insertion, I remove the node at the tail, which represents the least recently used item, and delete its entry from the map.

This approach ensures that every operation involves only constant-time pointer updates and hash lookups. This pattern is common in Google's infrastructure, where low-latency access to frequently used data is critical, demonstrating my ability to balance memory efficiency with performance constraints.
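If the interview allows standard-library shortcuts, the same behavior can be sketched with Python's `collections.OrderedDict`, whose `move_to_end` and `popitem` operations are O(1). Interviewers usually still expect the hand-rolled list version, so treat this as a concise variant rather than the expected answer:

```python
from collections import OrderedDict


class LRUCache:
    """LRU cache backed by OrderedDict; insertion order tracks recency."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```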

Common Mistakes to Avoid

  • Using a standard array or list for storage, resulting in O(n) search times instead of O(1)
  • Failing to update the doubly linked list pointers correctly during moves, causing broken chains
  • Neglecting to remove the old node from the Hash Map when overwriting a value
  • Ignoring edge cases like initializing an empty cache or handling single-element capacity limits
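The first two mistakes above come down to relinking discipline. One hedged sketch, assuming sentinel head/tail nodes so every real node always has both neighbors, shows why sentinels also neutralize the empty-cache and capacity-1 edge cases:

```python
class Node:
    def __init__(self, label):
        self.label = label
        self.prev = None
        self.next = None


def unlink(node):
    # Both neighbors must be rewired; forgetting either
    # assignment leaves a broken chain.
    node.prev.next = node.next
    node.next.prev = node.prev


# With sentinels, unlinking the only real node needs no special case:
head, only, tail = Node("head"), Node("only"), Node("tail")
head.next, only.prev = only, head
only.next, tail.prev = tail, only

unlink(only)  # list is now "empty" but still well-formed
```

After the unlink, `head.next is tail` and `tail.prev is head`, so the next insertion works with the same code path as any other.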
