LRU Cache Implementation in Java

What is LRU Cache?

LRU Cache stands for Least Recently Used Cache. The size of the cache is fixed, and it supports get() and put() operations. When the cache is full, the put() operation evicts the least recently used entry. For example, in a cache of capacity 2, after put(1), put(2) and put(3) the entry for key 1 is evicted, because it is the one that was used least recently.

How to implement LRU Cache in Java?

The LRU cache can be implemented in Java using two data structures: a HashMap for constant-time lookups and a doubly linked list that keeps the entries in usage order.

The idea is to always keep the elements in the following order:

Least Recently Used (LRU) -> A <-> B <-> C <-> D <-> E <- Most Recently Used (MRU)
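
For example, calling get(B) on the list above moves B to the MRU end, while A stays at the LRU end as the next candidate for eviction:

Least Recently Used (LRU) -> A <-> C <-> D <-> E <-> B <- Most Recently Used (MRU)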

Here is the LRUCache implementation in Java.


package com.journaldev.examples;

import java.util.HashMap;
import java.util.Map;

public class LRUCache<K, V> {

	private Node<K, V> lru;
	private Node<K, V> mru;
	private Map<K, Node<K, V>> container;
	private int capacity;
	private int currentSize;

	public LRUCache(int capacity) {
		this.capacity = capacity;
		this.currentSize = 0;
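		// Start with a single placeholder node so the lru and mru pointers are never null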
		lru = new Node<K, V>(null, null, null, null);
		mru = lru;
		container = new HashMap<K, Node<K, V>>();
	}

	public V get(K key) {
		Node<K, V> tempNode = container.get(key);
		if (tempNode == null) {
			return null;
		}
		// If the element is already the MRU, leave the list as it is
		// (compare node references, not keys, since the generic keys are objects)
		else if (tempNode == mru) {
			return mru.value;
		}

		// Get the next and prev nodes
		Node<K, V> nextNode = tempNode.next;
		Node<K, V> prevNode = tempNode.prev;

		// If at the left-most position, update the LRU pointer
		if (tempNode == lru) {
			nextNode.prev = null;
			lru = nextNode;
		}

		// If we are in the middle, link the previous and next items to each other
		else {
			prevNode.next = nextNode;
			nextNode.prev = prevNode;
		}

		// Finally move our item to the MRU
		tempNode.prev = mru;
		mru.next = tempNode;
		mru = tempNode;
		mru.next = null;

		return tempNode.value;

	}

	public void put(K key, V value) {
		// If the key is already present, update its value and promote the entry to MRU
		if (container.containsKey(key)) {
			container.get(key).value = value;
			get(key);
			return;
		}

		// Put the new node at the right-most end of the linked-list
		Node<K, V> myNode = new Node<K, V>(mru, null, key, value);
		mru.next = myNode;
		container.put(key, myNode);
		mru = myNode;

		// Delete the left-most entry and update the LRU pointer
		if (currentSize == capacity) {
			container.remove(lru.key);
			lru = lru.next;
			lru.prev = null;
		}

		// Update container size, for the first added entry update the LRU pointer
		else if (currentSize < capacity) {
			if (currentSize == 0) {
				lru = myNode;
			}
			currentSize++;
		}
	}

	// Node for doubly linked list
	class Node<T, U> {
		T key;
		U value;
		Node<T, U> prev;
		Node<T, U> next;

		public Node(Node<T, U> prev, Node<T, U> next, T key, U value) {
			this.prev = prev;
			this.next = next;
			this.key = key;
			this.value = value;
		}
	}

}

Some important points about the implementation.

  • The Node class implements the doubly linked list structure; each node keeps references to the previous and next nodes.
  • When we read a value with the get() method, the element is moved to the rightmost position because it becomes the MRU element.
  • When we put an element into the cache, it's added at the rightmost side. If the key already exists, its value is updated and the entry becomes the MRU element. If the container is full, the LRU element is removed, as the usage sketch below shows.
  • We are using Generics so that we can use the implementation with any key and value types.
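
For example, here is a quick usage sketch of the class above with a capacity of 2 (the Integer keys, String values and printed output are just for illustration):

LRUCache<Integer, String> cache = new LRUCache<>(2);
cache.put(1, "one");
cache.put(2, "two");
cache.get(1);                     // key 1 becomes the MRU entry
cache.put(3, "three");            // the cache is full, so key 2 (the LRU entry) is evicted
System.out.println(cache.get(2)); // null
System.out.println(cache.get(1)); // one
System.out.println(cache.get(3)); // three

Reading key 1 before inserting key 3 is what causes key 2, rather than key 1, to be evicted.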

LRUCache Test Program

I am using JUnit 5 test cases to run some scenarios from the LeetCode LRU Cache question.


package com.journaldev.examples;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNull;

import org.junit.jupiter.api.Test;

class LRUCacheTest {

	private LRUCache<Integer, Integer> c;

	public LRUCacheTest() {
		this.c = new LRUCache<>(2);
	}

	@Test
	public void testCacheStartsEmpty() {
		assertNull(c.get(1));
	}

	@Test
	public void testSetBelowCapacity() {
		c.put(1, 1);
		assertEquals(Integer.valueOf(1), c.get(1));
		assertNull(c.get(2));
		c.put(2, 4);
		assertEquals(Integer.valueOf(1), c.get(1));
		assertEquals(Integer.valueOf(4), c.get(2));
	}

	@Test
	public void testCapacityReachedOldestRemoved() {
		c.put(1, 1);
		c.put(2, 4);
		c.put(3, 9);
		assertNull(c.get(1));
		assertEquals(Integer.valueOf(4), c.get(2));
		assertEquals(Integer.valueOf(9), c.get(3));
	}

	@Test
	public void testGetRenewsEntry() {
		c.put(1, 1);
		c.put(2, 4);
		assertEquals(Integer.valueOf(1), c.get(1));
		c.put(3, 9);
		assertEquals(Integer.valueOf(1), c.get(1));
		assertNull(c.get(2));
		assertEquals(Integer.valueOf(9), c.get(3));
	}

}

Conclusion

There are many ways to implement an LRU cache, but keeping the cache elements in the order of their usage, with a HashMap for lookups and a doubly linked list for ordering, gives constant-time performance for both the get() and put() operations.
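
For reference, one alternative that ships with the JDK is LinkedHashMap configured in access order: overriding its removeEldestEntry() method turns it into an LRU cache. Here is a minimal sketch of that approach (the class name LinkedHashMapLRUCache is just for illustration):

package com.journaldev.examples;

import java.util.LinkedHashMap;
import java.util.Map;

public class LinkedHashMapLRUCache<K, V> extends LinkedHashMap<K, V> {

	private final int capacity;

	public LinkedHashMapLRUCache(int capacity) {
		// accessOrder = true keeps the entries ordered by access instead of insertion
		super(capacity, 0.75f, true);
		this.capacity = capacity;
	}

	@Override
	protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
		// Evict the least recently used entry once the map grows past the capacity
		return size() > capacity;
	}
}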
