LRU Cache Implementation

Algorithms
Hard
Uber

Design and implement a Least Recently Used (LRU) cache. It should support `get` and `put` operations in $O(1)$ time complexity using a HashMap and a Doubly Linked List.

Why Interviewers Ask This

Uber asks this to evaluate a candidate's ability to balance memory constraints with performance requirements in high-throughput systems. They specifically test mastery of pointer manipulation, data structure composition, and the rigorous application of O(1) complexity constraints. This problem reveals whether you can architect solutions that handle real-time ride-matching latency without degrading system stability under heavy load.

How to Answer This Question

  1. Clarify requirements: confirm the cache capacity and the expected behavior when `put` is called with an existing key.
  2. Propose the dual-structure architecture: a HashMap for O(1) key lookup and a Doubly Linked List to maintain access order efficiently.
  3. Outline the logic for both operations separately. For `get`, explain moving the accessed node to the head and returning its value. For `put`, describe adding new nodes at the head or updating existing ones, followed by evicting the tail node if capacity is exceeded.
  4. Walk through a specific scenario, such as inserting three items into a size-two cache, to demonstrate the eviction logic visually.
  5. Analyze the time and space complexity, explicitly confirming O(1) for both operations, and discuss edge cases like zero capacity or null inputs.
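The dual-structure architecture can be sketched in Python, with a dict standing in for the HashMap. This is a minimal setup sketch; the class and field names are illustrative, not from any specific library:

```python
class Node:
    """Doubly linked list node. It stores the key as well as the value,
    so that evicting a node tells us which map entry to delete."""
    def __init__(self, key=0, value=0):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}  # key -> Node, giving O(1) lookup
        # Sentinel nodes avoid None checks when splicing at either end:
        # head.next is the most recently used, tail.prev the least.
        self.head, self.tail = Node(), Node()
        self.head.next = self.tail
        self.tail.prev = self.head
```

Storing the key inside each node matters: when the tail node is evicted, the node itself identifies which map entry to remove.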

Key Points to Cover

  • Explicitly state the O(1) time complexity requirement before writing code
  • Demonstrate clear understanding of how the HashMap and Doubly Linked List interact
  • Correctly handle the edge case of updating an existing key versus inserting a new one
  • Show precise pointer manipulation when moving nodes to the head or removing the tail
  • Discuss space complexity and how the solution scales with cache capacity
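The pointer manipulation in the points above reduces to two O(1) splice helpers. A minimal sketch, assuming sentinel head and tail nodes (function and variable names are illustrative):

```python
class Node:
    def __init__(self, key=None):
        self.key = key
        self.prev = self.next = None


def remove(node):
    """Unlink a node from its current position in O(1)."""
    node.prev.next = node.next
    node.next.prev = node.prev


def add_to_head(head, node):
    """Insert a node right after the head sentinel (most recently used)."""
    node.next = head.next
    node.prev = head
    head.next.prev = node
    head.next = node


# Demo: build head <-> a <-> b <-> tail, then mark b most recently used.
head, tail = Node("head"), Node("tail")
head.next, tail.prev = tail, head
a, b = Node("a"), Node("b")
add_to_head(head, b)   # head <-> b <-> tail
add_to_head(head, a)   # head <-> a <-> b <-> tail
remove(b)
add_to_head(head, b)   # head <-> b <-> a <-> tail
print(head.next.key, head.next.next.key)  # b a
```

Note that each helper touches exactly four pointers; forgetting any one of them is the list-corruption mistake discussed below.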

Sample Answer

To implement an LRU cache with O(1) operations, I will combine a HashMap and a Doubly Linked List. The HashMap maps keys directly to their corresponding list nodes, enabling instant access. The Doubly Linked List maintains usage order: the most recently used item sits at the head and the least recently used item sits at the tail.

For the `get` operation, if the key exists, I retrieve the node from the map, move it to the head to mark it as recently used, and return its value; if not found, I return -1. For the `put` operation, if the key exists, I update the value and move the node to the head. If it's a new key, I create a new node, add it to the head, and insert it into the map. Crucially, after every insertion I check whether the cache size exceeds capacity. If it does, I remove the tail node from the list and delete its entry from the map, which ensures the least recently used item is always discarded first.

Let's trace this with a capacity of 2:

  • Put (1, 1): list is [1]
  • Put (2, 2): list is [2, 1]
  • Get (1): returns 1 and moves 1 to the front; list becomes [1, 2]
  • Put (3, 3): capacity exceeded, so 2 is evicted; list becomes [3, 1]

This approach guarantees constant-time lookups and updates, which is critical for Uber's real-time dispatching algorithms.
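The sample answer above can be sketched as a complete Python implementation. This is one way to realize the design, with a dict standing in for the HashMap; all names are illustrative:

```python
class Node:
    """Doubly linked list node; stores the key so eviction can clean up the map."""
    def __init__(self, key=0, value=0):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}  # key -> Node, O(1) lookup
        # Sentinels: head.next is most recently used, tail.prev least.
        self.head, self.tail = Node(), Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_head(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_to_head(node)  # mark as most recently used
        return node.value

    def put(self, key, value):
        if key in self.map:  # existing key: update value, refresh position
            node = self.map[key]
            node.value = value
            self._remove(node)
            self._add_to_head(node)
            return
        if self.capacity <= 0:  # guard the zero-capacity edge case
            return
        node = Node(key, value)
        self.map[key] = node
        self._add_to_head(node)
        if len(self.map) > self.capacity:  # evict the least recently used
            lru = self.tail.prev
            self._remove(lru)
            del self.map[lru.key]


# The trace from the sample answer, capacity 2:
cache = LRUCache(2)
cache.put(1, 1)        # list: [1]
cache.put(2, 2)        # list: [2, 1]
print(cache.get(1))    # 1; list becomes [1, 2]
cache.put(3, 3)        # evicts 2; list: [3, 1]
print(cache.get(2))    # -1
```

In an interview, it is worth noting that Python's `collections.OrderedDict` (or Java's `LinkedHashMap`) can implement the same behavior, but interviewers typically want to see the explicit pointer work shown here.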

Common Mistakes to Avoid

  • Using a standard List or Array instead of a Doubly Linked List, resulting in O(n) cost to locate and reposition elements
  • Failing to update all four pointers correctly when moving a node to the head, corrupting the list
  • Not evicting when the cache exceeds capacity after an insertion, leading to unbounded memory growth
  • Ignoring the scenario where the input capacity is zero or negative, causing runtime errors
