Advanced memoization with TTL, LRU, and WeakMap support
```sh
npm install @philiprehberger/ts-memo
```
```ts
import { memo, weakMemo } from '@philiprehberger/ts-memo';

const getUser = memo(fetchUser, { ttl: '5m', maxSize: 1000 });
const user = await getUser('123'); // cached for 5 minutes, up to 1000 entries
getUser.clear();

const getNodeData = weakMemo((node: HTMLElement) => expensiveCalc(node));
// Entry is garbage-collected once the node is unreferenced (e.g. removed from the DOM)
```
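A WeakMap-keyed memoizer can be modeled in a few lines (an illustrative sketch, not the library's actual source; `weakMemoSketch` is a hypothetical name):

```typescript
// Illustrative sketch: memoize a one-argument function keyed on an object.
// Because the cache is a WeakMap, entries are held weakly: once the key
// object is garbage-collected, the cached value becomes collectible too.
function weakMemoSketch<K extends object, V>(fn: (key: K) => V): (key: K) => V {
  const cache = new WeakMap<K, V>();
  return (key: K): V => {
    if (cache.has(key)) return cache.get(key)!;
    const value = fn(key);
    cache.set(key, value);
    return value;
  };
}
```

This is why `weakMemo` only accepts object keys: primitives cannot be WeakMap keys.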
Track cache performance with hits, misses, evictions, and current size:
```ts
const compute = memo(expensiveFn, { maxSize: 100 });

compute(1); // miss
compute(1); // hit
compute(2); // miss

const { hits, misses, evictions, size } = compute.stats();
// { hits: 1, misses: 2, evictions: 0, size: 2 }
```
Skip caching based on the result value using a shouldCache predicate:
```ts
const fetchData = memo(apiFetch, {
  shouldCache: (result) => result.status === 'ok',
});

// Only successful responses are cached; errors pass through uncached
const data = await fetchData('/endpoint');
```
Rejected promises are automatically evicted so the next call retries:
```ts
const loadConfig = memo(fetchConfig, { ttl: '10m' });

// If fetchConfig rejects, the failed promise is removed from cache
// so subsequent calls retry instead of returning the rejected promise
const config = await loadConfig();
```
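The retry-on-rejection behavior boils down to caching the in-flight promise eagerly and evicting it when it rejects. A minimal sketch of the technique (with a hypothetical `memoAsyncSketch`, not the library's implementation):

```typescript
// Minimal sketch: cache in-flight promises keyed on the serialized arguments;
// drop any entry whose promise rejects, so the next call re-invokes fn
// instead of replaying the stored failure.
function memoAsyncSketch<A extends unknown[], R>(
  fn: (...args: A) => Promise<R>,
): (...args: A) => Promise<R> {
  const cache = new Map<string, Promise<R>>();
  return (...args: A): Promise<R> => {
    const key = JSON.stringify(args);
    let p = cache.get(key);
    if (!p) {
      p = fn(...args);
      cache.set(key, p);
      // Evict on rejection so subsequent calls retry.
      p.catch(() => cache.delete(key));
    }
    return p;
  };
}
```

Caching the promise (rather than the resolved value) also deduplicates concurrent calls with the same arguments.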
When maxSize is set, the least-recently-used entry is evicted to make room:
```ts
const cached = memo(expensiveFn, { maxSize: 3 });

cached('a'); // cache: [a]
cached('b'); // cache: [b, a]
cached('c'); // cache: [c, b, a]
cached('a'); // hit, promotes a -> cache: [a, c, b]
cached('d'); // evicts 'b' (LRU) -> cache: [d, a, c]
```
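JavaScript's `Map` preserves insertion order, which makes the promotion and eviction above easy to model. An illustrative sketch (`lruSketch` is a hypothetical name, not part of the library):

```typescript
// Minimal LRU sketch exploiting Map's insertion order: re-inserting a key on
// a hit moves it to the "most recent" end, and the first key in iteration
// order is always the least recently used.
function lruSketch<A extends unknown[], R>(
  fn: (...args: A) => R,
  maxSize: number,
): (...args: A) => R {
  const cache = new Map<string, R>();
  return (...args: A): R => {
    const key = JSON.stringify(args);
    if (cache.has(key)) {
      const value = cache.get(key)!;
      cache.delete(key); // promote: re-insert at the most-recent end
      cache.set(key, value);
      return value;
    }
    if (cache.size >= maxSize) {
      // The first key in iteration order is the least recently used.
      const lruKey = cache.keys().next().value!;
      cache.delete(lruKey);
    }
    const value = fn(...args);
    cache.set(key, value);
    return value;
  };
}
```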
Inspect the cache without affecting recency or statistics:

```ts
const cached = memo(expensiveFn, { maxSize: 3 });

cached('a');
cached('b');

cached.peek('a'); // returns the cached value, but does not promote 'a' or count as a hit
cached.has('b');  // true
```
| Function | Description |
|---|---|
| `memo(fn, options?)` | Memoize with optional TTL and LRU |
| `weakMemo(fn)` | WeakMap-based memoization |
| `.clear()` | Clear all cached entries |
| `.delete(...args)` | Remove a specific cache entry |
| `.peek(...args)` | Read a cached value without affecting LRU recency or stats |
| `.has(...args)` | Check whether a value is cached without affecting LRU recency |
| `.stats()` | Return `{ hits, misses, evictions, size }` |
| `.size` | Number of cached entries |
| Option | Type | Description |
|---|---|---|
| `ttl` | `number \| string` | Time-to-live (`5000`, `'5m'`, `'1h'`) |
| `maxSize` | `number` | Max entries before LRU eviction |
| `key` | `(...args) => string` | Custom cache key function |
| `shouldCache` | `(result) => boolean` | Predicate to conditionally skip caching |
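String TTLs like `'5m'` or `'1h'` ultimately reduce to milliseconds. A parser along these lines would cover the forms shown above (the accepted grammar is an assumption; `parseTtlSketch` is illustrative, not the library's parser):

```typescript
// Illustrative TTL parser: accepts a plain millisecond number, or a string
// with a ms/s/m/h/d suffix. The exact grammar the library accepts may
// differ; this only models the documented examples (5000, '5m', '1h').
function parseTtlSketch(ttl: number | string): number {
  if (typeof ttl === "number") return ttl;
  const match = /^(\d+(?:\.\d+)?)(ms|s|m|h|d)$/.exec(ttl.trim());
  if (!match) throw new Error(`Invalid ttl: ${ttl}`);
  const unitMs: Record<string, number> = {
    ms: 1, s: 1_000, m: 60_000, h: 3_600_000, d: 86_400_000,
  };
  return Number(match[1]) * unitMs[match[2]];
}
```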
```sh
npm install
npm run build
npm test
```
If you find this project useful: