LRU (Least Recently Used) is a caching algorithm used in computer memory management, where the least recently used items are discarded first. In other words, it maintains a cache of a fixed size and when the cache is full, the least recently used item is removed to make room for new items.
LRU is a popular caching algorithm because it exploits temporal locality: recently accessed items are likely to be accessed again in the near future. By keeping the most recently accessed items in the cache, it reduces the number of cache misses and improves the performance of the system.
The implementation of LRU involves keeping track of the order in which items are accessed and using that order to determine which item to remove when the cache is full. This can be done using various data structures, such as linked lists, hash tables or priority queues.
Overall, LRU is a simple but effective caching algorithm that can improve the performance of memory-intensive applications and reduce the number of expensive memory accesses.
Here’s a basic algorithm for LRU (Least Recently Used) caching:
1. Initialize an empty cache of a fixed size.
2. When an item is requested, check if it is already in the cache. If it is, move it to the front of the cache (since it is now the most recently used item).
3. If the item is not in the cache, add it to the front of the cache and check if the cache has exceeded its capacity. If it has, remove the least recently used item (which is at the end of the cache).
4. Repeat steps 2-3 for each new item requested.
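The steps above can be sketched directly with a hash map for lookup and a deque for recency tracking. This is a minimal illustration (the class name `SimpleLruCache` is ours, not a standard API); note that moving a key inside an `ArrayDeque` is O(n), which is why production implementations use a doubly-linked list instead:

```kotlin
// A minimal sketch of the steps above: a HashMap for O(1) lookup and an
// ArrayDeque tracking recency, with the front holding the most recently used key.
class SimpleLruCache<K, V>(private val capacity: Int) {
    private val map = HashMap<K, V>()
    private val order = ArrayDeque<K>() // front = most recently used

    fun get(key: K): V? {
        val value = map[key] ?: return null
        // Step 2: the item is in the cache, so move it to the front.
        order.remove(key)
        order.addFirst(key)
        return value
    }

    fun put(key: K, value: V) {
        if (map.containsKey(key)) {
            order.remove(key) // updating an existing key also refreshes it
        } else if (map.size >= capacity) {
            // Step 3: cache is full, evict the least recently used key from the end.
            val eldest = order.removeLast()
            map.remove(eldest)
        }
        map[key] = value
        order.addFirst(key)
    }
}

fun main() {
    val cache = SimpleLruCache<String, Int>(2)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.get("a")          // "a" becomes the most recently used
    cache.put("c", 3)       // evicts "b", the least recently used
    println(cache.get("b")) // null
    println(cache.get("a")) // 1
}
```

The O(n) `order.remove(key)` calls are the reason real implementations pair the hash map with a doubly-linked list of nodes, which supports O(1) unlinking.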
Here’s how you can implement an LRU cache in Kotlin:
```kotlin
class LRUCache<K, V>(private val capacity: Int) : LinkedHashMap<K, V>(capacity, 0.75f, true) {
    override fun removeEldestEntry(eldest: Map.Entry<K, V>?): Boolean {
        return size > capacity
    }
}
```
In the code above, we extend the `LinkedHashMap` class, a hash-table-and-linked-list implementation of the `Map` interface with predictable iteration order. The third constructor argument (`accessOrder = true`) is what makes this an LRU cache: it orders entries from least to most recently accessed rather than by insertion. We override the `removeEldestEntry` method to remove the least recently used entry when the size of the map exceeds the capacity.
Here is the class again with a complete usage example:

```kotlin
class LRUCache<K, V>(private val capacity: Int) : LinkedHashMap<K, V>(capacity, 0.75f, true) {
    override fun removeEldestEntry(eldest: Map.Entry<K, V>?): Boolean {
        return size > capacity
    }
}

fun main() {
    val cache = LRUCache<String, Int>(3)
    cache["one"] = 1
    cache["two"] = 2
    cache["three"] = 3
    // The cache is full, so "one" is the least recently used entry and is removed
    cache["four"] = 4
    // The "one" entry is no longer in the cache
    println(cache["one"]) // null
    // The "two" entry is still in the cache
    println(cache["two"]) // 2
}
```
In the example above, we are creating an LRU cache with a capacity of 3 and adding four entries to the cache. When we add the fourth entry, “one” is removed because it was the least recently used entry. Finally, we are checking that “one” is no longer in the cache and “two” is still in the cache.
Output:

```
null
2
```
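Because the map was constructed with `accessOrder = true`, *reading* an entry also counts as using it. A small follow-up sketch (the `LRUCache` class is repeated so the snippet runs on its own) shows that a `get` can change which entry gets evicted:

```kotlin
// Repeated from above so this snippet is self-contained.
class LRUCache<K, V>(private val capacity: Int) : LinkedHashMap<K, V>(capacity, 0.75f, true) {
    override fun removeEldestEntry(eldest: Map.Entry<K, V>?): Boolean {
        return size > capacity
    }
}

fun main() {
    val cache = LRUCache<String, Int>(3)
    cache["one"] = 1
    cache["two"] = 2
    cache["three"] = 3
    // Reading "one" refreshes it, so "two" is now the least recently used entry.
    cache["one"]
    cache["four"] = 4
    println(cache["two"]) // null: "two" was evicted instead of "one"
    println(cache["one"]) // 1
}
```

Had we used `accessOrder = false` (insertion order), the `get` would have had no effect and "one" would have been evicted instead.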