/strat contains an approximate implementation of the Clock algorithm, which increments an entry's
frequency on cache hits and decrements the frequencies of entries in the 'clock' that a newer
entry seeks admission into, which helps avoid starvation. Another algorithm, O1, implements
a constant-time LFU-LRU hybrid cache; the flip side is that it consumes more memory than
Clock, though it is much simpler to reason about.
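As a rough illustration of the admission and eviction mechanism described above, here is a minimal, simplified Clock sketch. The class and field names (SimpleClock, cap, freq, hand) are illustrative only and are not the library's actual API; the real implementation in /strat differs in structure and detail.

```js
// Illustrative sketch only: not the repo's Clock.js, just the mechanism it describes.
class SimpleClock {
  constructor(cap) {
    this.cap = cap;
    this.slots = new Array(cap).fill(null); // each slot: { key, value, freq }
    this.index = new Map(); // key -> slot position
    this.hand = 0; // the clock hand
  }

  get(key) {
    const pos = this.index.get(key);
    if (pos === undefined) return undefined;
    this.slots[pos].freq += 1; // cache hit: bump frequency
    return this.slots[pos].value;
  }

  put(key, value) {
    const existing = this.index.get(key);
    if (existing !== undefined) {
      this.slots[existing].value = value;
      return;
    }
    // Sweep the clock, decrementing frequencies until a zero-frequency slot
    // is found; that slot's entry is evicted. Decrementing on every sweep is
    // what keeps long-resident entries from starving newer ones.
    while (this.slots[this.hand] && this.slots[this.hand].freq > 0) {
      this.slots[this.hand].freq -= 1;
      this.hand = (this.hand + 1) % this.cap;
    }
    const victim = this.slots[this.hand];
    if (victim) this.index.delete(victim.key);
    this.slots[this.hand] = { key, value, freq: 1 };
    this.index.set(key, this.hand);
    this.hand = (this.hand + 1) % this.cap;
  }
}
```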
/ds contains implementations of the underlying stores supporting the cache: a HashMap backed
by the native Map, and a restrictive RangeList backed by a Skip List.
That is, Clock.js, MultiClock.js, and O1.js instances can be backed either by HashMap
for point queries (~500ms for 1M point queries) or by RangeList for range queries
(~5000ms for 1M range queries; see the perf workflow).
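To show what a range-keyed store buys over a plain map, here is a simplified stand-in that keeps non-overlapping ranges sorted and answers a point lookup by binary search. The actual RangeList is backed by a Skip List and has a different interface; everything below (SortedRanges, set, get) is hypothetical and for illustration only.

```js
// Illustrative sketch only: a sorted-array stand-in for a skip-list-backed range store.
class SortedRanges {
  constructor() {
    this.ranges = []; // [{ lo, hi, value }], non-overlapping, sorted by lo
  }

  // Assumes [lo, hi] does not overlap any existing entry.
  set(lo, hi, value) {
    const i = this.ranges.findIndex((r) => r.lo > lo);
    const entry = { lo, hi, value };
    if (i === -1) this.ranges.push(entry);
    else this.ranges.splice(i, 0, entry);
  }

  // Point query: binary-search for the range whose [lo, hi] covers k.
  get(k) {
    let lo = 0, hi = this.ranges.length - 1;
    while (lo <= hi) {
      const mid = (lo + hi) >> 1;
      const r = this.ranges[mid];
      if (k < r.lo) hi = mid - 1;
      else if (k > r.hi) lo = mid + 1;
      else return r.value;
    }
    return undefined;
  }
}

const rl = new SortedRanges();
rl.set(1, 10, "a");
rl.set(20, 30, "b");
rl.get(5);  // "a"
rl.get(15); // undefined
```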
lfu.js serves as the entrypoint to construct and interact with these LFUs.
```js
import { LfuCache, RangeLfu } from "@serverless-dns/lfu.js";

const lfu = new LfuCache("L1", 10);
lfu.put(1, "a");           // 1 -> "a"
const v = lfu.get(1);      // v = "a"

const rgcache = new RangeLfu("R1", 10);
rgcache.put(1, 10, "a");   // (1, 10) -> "a"
const w = rgcache.get(5);  // w = "a"
```
All caches are magic. Knowing their mechanism is not enough to predict their outcome.