
LRU Cache

Number: 146

Difficulty: Medium

Paid? No

Companies: Amazon, Meta, Microsoft, Google, Bloomberg, Apple, Oracle, TikTok, ByteDance, Confluent, LinkedIn, PayPal, eBay, Uber, Visa, Palo Alto Networks, Cisco, Samsung, Walmart Labs, Nvidia, Netflix, Yandex, Coupang, Salesforce, Intuit, Goldman Sachs, Arista Networks, Nutanix, DoorDash, ServiceNow, Shopify, Reddit, Squarespace, Citadel, Snap, NetApp, SmartNews, Freecharge, KLA, Yahoo, SAP, Splunk, DocuSign, Shopee, Booking.com, Disney, Rivian, Roku, Freshworks, Myntra, Nordstrom, ThousandEyes, Whatfix, Verkada, Adobe, Rubrik, Tripadvisor, Qualcomm, Cruise, Tesla, CarWale, VMware, Morgan Stanley, MakeMyTrip, Swiggy, IBM, Zscaler, Cashfree, MongoDB, Nokia, General Motors, Optiver, Snowflake, Palantir Technologies, X, Zenefits


Problem Description

Design an LRU (Least Recently Used) Cache that supports constant time operations for getting and putting key-value pairs. The cache should evict the least recently used item when its capacity is exceeded.


Key Insights

  • Use a Hash Table (dictionary) to allow O(1) access to nodes by key.
  • Use a Doubly Linked List to record the usage order. The head represents the most recently used element while the tail represents the least recently used.
  • When a key is accessed or updated, move the corresponding node to the front (head) of the list.
  • When the cache exceeds capacity, remove the node at the tail end, which is the least recently used entry.

Space and Time Complexity

Time Complexity: O(1) per get and put operation.
Space Complexity: O(capacity), storing key-node pairs and linked list nodes.


Solution

The solution combines a hash table (or dictionary) and a doubly linked list:

  • The hash table maps each key to a reference to its node in the list, giving O(1) lookup.
  • The doubly linked list maintains the order of usage with the most recently used items near the head.
  • For get(key), check the dictionary. If found, move the node to the head and return its value; if not, return -1.
  • For put(key, value), if the key exists, update its value and move it to the head. If not, create a new node and add it right after the head. If the capacity is now exceeded, remove the tail node (least recently used) and delete its key from the dictionary.
  • Dummy head and tail nodes simplify insertion and removal by eliminating special cases for an empty list or operations at either end.

Code Solutions

# Define a node for the doubly linked list.
class Node:
    def __init__(self, key, value):
        self.key = key           # key of the node
        self.value = value       # value of the node
        self.prev = None         # pointer to previous node
        self.next = None         # pointer to next node

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity                # maximum capacity of the cache
        self.cache = {}                         # dictionary mapping key -> node
        # Create dummy head and tail nodes.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    # Helper function to remove a node from the linked list.
    def _remove(self, node):
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    # Helper function to add a node right after the head.
    def _add(self, node):
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    # Returns the value of the key if it exists; otherwise, returns -1.
    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            # Move the accessed node to the head (marking it as recently used)
            self._remove(node)
            self._add(node)
            return node.value
        return -1

    # Updates the value of the key if it exists; otherwise, adds the key-value pair.
    # Removes the least recently used key if the cache exceeds its capacity.
    def put(self, key, value):
        if key in self.cache:
            # Remove the old node.
            self._remove(self.cache[key])
        node = Node(key, value)
        self._add(node)
        self.cache[key] = node
        if len(self.cache) > self.capacity:
            # Remove the least recently used element.
            lru = self.tail.prev
            self._remove(lru)
            del self.cache[lru.key]
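
For comparison, Python's standard library offers a more compact route: collections.OrderedDict remembers insertion order, and its move_to_end and popitem methods run in O(1), which is exactly what the cache needs. The sketch below (class name LRUCacheOD is illustrative, not part of the problem) trades the explicit linked list for the one OrderedDict maintains internally:

```python
from collections import OrderedDict

class LRUCacheOD:
    def __init__(self, capacity):
        self.capacity = capacity        # maximum number of entries
        self.cache = OrderedDict()      # key -> value, ordered oldest to newest

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)     # mark as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key) # refresh existing key's recency
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
```

Both implementations satisfy the same interface; the hand-rolled doubly linked list above is the version usually expected in interviews, since it shows you can build the O(1) ordering structure yourself.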