I hand-rolled an LRU cache, and the source code turned out to be surprisingly simple!

LRU Introduction

LRU stands for Least Recently Used. It is a commonly used page-replacement algorithm that evicts the page that has gone unused for the longest time.

Simply put, take an array such as int[] a = {1,2,3,4,5,6}. If the numbers 1 and 2 are accessed frequently, they move behind 3,4,5,6, and the array becomes int[] a = {3,4,5,6,1,2}. A number that is rarely used drifts to the front, where it is the first candidate for eviction!
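As a rough illustration of this reordering (not how LinkedHashMap works internally), here is a minimal sketch using a Deque: each accessed value is moved to the tail, so untouched values drift toward the head, the eviction end. The class and method names are made up for this example.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

public class LruOrderDemo {
    // Move an accessed value to the tail of the deque, so values that
    // are never accessed drift toward the head (the eviction end).
    static void access(Deque<Integer> deque, Integer value) {
        if (deque.remove(value)) {
            deque.addLast(value);
        }
    }

    public static void main(String[] args) {
        Deque<Integer> deque = new ArrayDeque<>(List.of(1, 2, 3, 4, 5, 6));
        access(deque, 1);
        access(deque, 2);
        System.out.println(deque); // [3, 4, 5, 6, 1, 2]
    }
}
```

Note that `Deque.remove(Object)` is O(n), which is exactly why real LRU caches pair a hash table with a linked list instead of scanning.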

The LRU algorithm is commonly used for caching hot data, such as news feeds: the more an article is viewed, the more likely other users are to view it, while rarely accessed news simply sinks without a trace!

In Java, there is already a collection class that implements this behavior: LinkedHashMap!

Introduction to LinkedHashMap

As we know, in the Java collections framework LinkedHashMap inherits from HashMap and additionally threads a doubly linked list through its entries. Unlike HashMap, LinkedHashMap takes an accessOrder parameter at construction time, which defaults to false.

 public class LinkedHashMap<K, V>
         extends HashMap<K, V>
         implements Map<K, V> {

     /** Head (eldest) node of the doubly linked list */
     transient LinkedHashMap.Entry<K, V> head;

     /** Tail (youngest) node of the doubly linked list */
     transient LinkedHashMap.Entry<K, V> tail;

     /**
      * 1. If accessOrder is true, accessed entries are moved to the
      *    tail of the list, so iteration follows access order.
      * 2. If accessOrder is false, iteration follows insertion order.
      */
     final boolean accessOrder;
 }

If true is passed in, the most recently accessed elements will be placed at the end of the linked list in the order of access. The test is as follows:

 public static void main(String[] args) {
     // accessOrder defaults to false: iteration follows insertion order
     Map<String, String> accessOrderFalse = new LinkedHashMap<>();
     accessOrderFalse.put("1", "1");
     accessOrderFalse.put("2", "2");
     accessOrderFalse.put("3", "3");
     accessOrderFalse.put("4", "4");
     System.out.println("accessOrderFalse:" + accessOrderFalse);

     // accessOrder set to true: iteration follows access order
     Map<String, String> accessOrderTrue = new LinkedHashMap<>(16, 0.75f, true);
     accessOrderTrue.put("1", "1");
     accessOrderTrue.put("2", "2");
     accessOrderTrue.put("3", "3");
     accessOrderTrue.put("4", "4");
     accessOrderTrue.get("2"); // access key 2
     accessOrderTrue.get("3"); // access key 3
     System.out.println("accessOrderTrue:" + accessOrderTrue);
 }

The output is as follows:

 accessOrderFalse:{1=1, 2=2, 3=3, 4=4}
 accessOrderTrue:{1=1, 4=4, 2=2, 3=3}

As you can see, when accessOrder is set to true, recently accessed elements are moved to the end of the iteration order, leaving the least recently used elements at the front!

Using this feature, we can implement an LRU cache on top of LinkedHashMap as follows:

  • Create a LinkedHashMap object and set accessOrder to true;
  • Cap the capacity of the LinkedHashMap at n, evicting the surplus entry whenever the size exceeds n;
  • Override the removeEldestEntry() method of LinkedHashMap.

Here, removeEldestEntry() controls eviction: if it returns true, the eldest (least recently used) entry is removed; if it returns false, nothing happens. This method is invoked after every put().

Create an LRU cache class with the following content:

 public class LRULinkedHashMap<K, V> extends LinkedHashMap<K, V> {

     // Maximum number of entries the cache may hold
     private static final int MAX_SIZE = 3;

     public LRULinkedHashMap() {
         // Initial capacity MAX_SIZE, load factor 0.75f, accessOrder true
         super(MAX_SIZE, 0.75f, true);
     }

     /**
      * Override removeEldestEntry from LinkedHashMap: once the map holds
      * more than MAX_SIZE entries, each insertion evicts the least
      * recently used entry.
      */
     @Override
     protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
         return size() > MAX_SIZE;
     }
 }

Test use:

 public static void main(String[] args) {
     LRULinkedHashMap<String, String> cache = new LRULinkedHashMap<>();
     cache.put("1", "a");
     cache.put("2", "b");
     cache.put("3", "c");
     System.out.println("Initial cache content: " + cache.toString());
     cache.get("2");
     System.out.println("After querying the element with key 2, the cache content is: " + cache.toString());
     cache.put("4", "d");
     System.out.println("After adding new elements, cache content: " + cache.toString());
 }

The output is as follows:

 Initial cache content: {1=a, 2=b, 3=c}
 After querying the element with key 2, the cache content is: {1=a, 3=c, 2=b}
 After adding new elements, cache content: {3=c, 2=b, 4=d}
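One caveat worth noting: LinkedHashMap is not thread-safe, and with accessOrder set to true even get() mutates the internal linked list. A common mitigation is to wrap the map with Collections.synchronizedMap. The sketch below is one way to do this, assuming a fixed capacity; the class name, factory method, and capacity are illustrative, and iteration would still need manual synchronization on the wrapper.

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLruDemo {
    // Build a synchronized LRU cache capped at maxSize entries
    static <K, V> Map<K, V> newCache(int maxSize) {
        return Collections.synchronizedMap(
            new LinkedHashMap<K, V>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                    return size() > maxSize;
                }
            });
    }

    public static void main(String[] args) {
        Map<String, String> cache = newCache(3);
        cache.put("1", "a");
        cache.put("2", "b");
        cache.put("3", "c");
        cache.get("1");      // touch "1" so it becomes most recently used
        cache.put("4", "d"); // evicts "2", the least recently used key
        System.out.println(cache); // {3=c, 1=a, 4=d}
    }
}
```

For heavily concurrent caches, a lock-striped or dedicated cache library is usually a better fit than a fully synchronized map, since every operation here contends on a single lock.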

