LRU Cache Implementation
Design and implement a Least Recently Used (LRU) cache. It should support `get` and `put` operations in O(1) time complexity using a HashMap and a Doubly Linked List.
Why Interviewers Ask This
Uber asks this to evaluate how candidates balance memory constraints against performance requirements in high-throughput systems. The problem tests pointer manipulation, data-structure composition, and the ability to reason rigorously about O(1) complexity constraints. It reveals whether you can design solutions that meet real-time ride-matching latency targets without degrading under heavy load.
How to Answer This Question
Key Points to Cover
- Explicitly state the O(1) time complexity requirement before writing code
- Demonstrate clear understanding of how the HashMap and Doubly Linked List interact
- Correctly handle the edge case of updating an existing key versus inserting a new one
- Show precise pointer manipulation when moving nodes to the head or removing the tail
- Discuss space complexity and how the solution scales with cache capacity
Sample Answer
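One way to answer is sketched below in Python, using a built-in `dict` as the HashMap and a hand-rolled doubly linked list with sentinel head/tail nodes. This is an illustrative sketch, not the only valid structure; in a real interview you would narrate the invariant as you write it: the map gives O(1) lookup of nodes, and the list keeps them ordered from most to least recently used.

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        if capacity <= 0:
            raise ValueError("capacity must be positive")
        self.capacity = capacity
        self.map = {}            # key -> Node (the "HashMap")
        # Sentinel head/tail avoid None checks during pointer surgery.
        self.head = Node()       # most-recently-used side
        self.tail = Node()       # least-recently-used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node) -> None:
        # Splice the node out by rewiring both neighbors.
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node: Node) -> None:
        # Insert node right after the head sentinel.
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_to_front(node)  # mark as most recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            node = self.map[key]
            node.value = value    # existing key: update in place, then promote
            self._remove(node)
            self._add_to_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev  # evict the least recently used entry
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._add_to_front(node)


cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
cache.get(1)     # returns 1; key 1 is now most recently used
cache.put(3, 3)  # capacity reached: evicts key 2
cache.get(2)     # returns -1 (evicted)
```

Both operations touch a fixed number of pointers and one dictionary entry, so each is O(1); space is O(capacity) for the map plus the list nodes. (In production Python, `collections.OrderedDict` or `functools.lru_cache` would be preferred, but interviewers usually want the structure built from scratch.)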
Common Mistakes to Avoid
- Using a standard List or Array instead of a Doubly Linked List, making removal and reordering of elements O(n) instead of O(1)
- Failing to update the pointers correctly when moving a node to the head, causing list corruption
- Not checking whether the cache has reached capacity before inserting a new element, leading to unbounded memory growth
- Ignoring the scenario where the input capacity is zero or negative, causing runtime errors
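The pointer mistakes above are easiest to avoid by using sentinel head/tail nodes and isolating the splice logic in two small helpers, shown in isolation below. This is a sketch; the helper names `unlink` and `link_after` are illustrative, not part of any standard API.

```python
class Node:
    """Minimal doubly linked list node."""
    def __init__(self, key=0):
        self.key = key
        self.prev = None
        self.next = None


def unlink(node):
    # Rewire both neighbors to splice the node out. With sentinel
    # head/tail nodes in place, neither neighbor is ever None.
    node.prev.next = node.next
    node.next.prev = node.prev


def link_after(head, node):
    # Insert node immediately after head. The order of these four
    # assignments matters: read head.next before overwriting it,
    # or the rest of the list is lost and the list is corrupted.
    node.next = head.next
    node.prev = head
    head.next.prev = node
    head.next = node
```

Keeping every pointer update inside these two helpers means "move to front" is just `unlink(node)` followed by `link_after(head, node)`, which is far harder to get wrong under interview pressure than inlined pointer surgery.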
Practice This Question with AI
Answer this question orally or via text and get instant AI-powered feedback on your response quality, structure, and delivery.