Post

Created by @oliverrichards at November 8th, 2023, 5:31:27 am.

Technical Interview Question: Implement an LRU (Least Recently Used) cache using a doubly linked list. A doubly linked list is a data structure in which each node has a reference to both its previous and next node. The LRU cache should support the following operations:

  1. Get: Retrieve the value of a given key from the cache. If the key does not exist, return -1.
  2. Put: Insert or update the value of a key-value pair in the cache. If the cache is already full, remove the least recently used item before inserting the new item.
  3. Display: Print the cache contents in order of recency, starting from the most recently used item.

Please write a solution that implements the LRU cache using a doubly linked list, and demonstrates its usage with some example inputs.

Answer: Here is an implementation of an LRU cache using a doubly linked list in Python:

class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # maps key -> node for O(1) lookup
        # Sentinel head and tail nodes remove empty-list edge cases:
        # head.next is the least recently used node, tail.prev the most recent.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            # Move the node to the tail to mark it as most recently used.
            self._remove(node)
            self._add(node)
            return node.value
        else:
            return -1

    def put(self, key, value):
        if key in self.cache:
            # Unlink the stale node; a fresh one replaces it below.
            self._remove(self.cache[key])
        node = Node(key, value)
        self._add(node)
        self.cache[key] = node
        if len(self.cache) > self.capacity:
            # Evict the least recently used node, found just after the head.
            node_to_remove = self.head.next
            self._remove(node_to_remove)
            del self.cache[node_to_remove.key]

    def display(self):
        # Walk backwards from the tail so the most recently used entry prints first.
        node = self.tail.prev
        while node != self.head:
            print(f"{node.key}: {node.value}")
            node = node.prev

    def _add(self, node):
        # Insert the node just before the tail sentinel (the MRU position).
        prev_node = self.tail.prev
        prev_node.next = node
        node.prev = prev_node
        self.tail.prev = node
        node.next = self.tail

    def _remove(self, node):
        # Unlink the node by pointing its neighbors at each other.
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

Explanation:

  1. The Node class represents each node in the doubly linked list. Each node has a key, value, reference to the previous node (prev), and reference to the next node (next).

  2. The LRUCache class implements the LRU cache on top of the doubly linked list. Its constructor takes the capacity as a parameter and initializes an empty cache dictionary along with sentinel head and tail nodes, which mark the ends of the list and remove the need for empty-list edge cases.

  3. The get() method takes a key as a parameter and checks whether the key exists in the cache. If it does, it moves the corresponding node to the tail of the doubly linked list (marking it as the most recently used) and returns the value. If the key doesn't exist, it returns -1.

  4. The put() method takes a key and value as parameters and inserts or updates the key-value pair in the cache. If the key already exists, it first unlinks the old node from the doubly linked list. If the insertion pushes the cache over its capacity, it evicts the least recently used item (the node just after the head) from both the doubly linked list and the cache dictionary; see the short trace after this list.

  5. The display() method prints the cache from most recently used to least recently used by traversing the doubly linked list backwards, from the tail's previous node to the head.

  6. The _add() method inserts a given node just before the tail sentinel (the most recently used position) by updating the references of the neighboring nodes.

  7. The _remove() method removes a given node from the doubly linked list by updating the references of the previous node and the next node.
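
To make the recency updates and the eviction in steps 3 and 4 concrete, here is a short trace for a hypothetical cache of capacity 2 (the keys and values are arbitrary examples):

cache = LRUCache(2)
cache.put(1, "a")   # list (LRU -> MRU): 1
cache.put(2, "b")   # list: 1, 2
cache.get(1)        # returns "a" and moves key 1 to the tail; list: 2, 1
cache.put(3, "c")   # over capacity: key 2 (the LRU entry) is evicted; list: 1, 3
cache.get(2)        # returns -1, since key 2 was evicted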

To use this LRU cache implementation, create an instance of the LRUCache class with the desired capacity and call its get(), put(), and display() methods.
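
For example, a minimal demonstration with illustrative keys and values:

cache = LRUCache(3)
cache.put(1, 100)
cache.put(2, 200)
cache.put(3, 300)
print(cache.get(2))   # prints 200; key 2 becomes the most recently used
cache.put(4, 400)     # evicts key 1, the least recently used entry
cache.display()

Expected output of the display() call, most recently used first:

4: 400
2: 200
3: 300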