A TypeScript library for managing data fetching and caching from remote sources. Handles the complexity of lazy loading, request deduplication, caching, and background refresh - regardless of how many consumers request the same data.
When building applications that fetch data from APIs, databases, or other remote sources, you often face these challenges:
- Multiple consumers requesting the same data - leads to duplicate network calls
- Managing cache state - knowing when data is fresh, stale, or being fetched
- Graceful refresh - updating data without blocking consumers (stale-while-revalidate)
- Request batching - combining many individual lookups into efficient batch requests
- Synchronous access to cached data - avoiding unnecessary async overhead for cached values
This library provides three utilities that solve these problems with a clean, type-safe API.
```sh
npm i projected
```

- Sync returns when cached - all methods return `T | Promise<T>`, avoiding Promise overhead for cached values
- Request deduplication - multiple consumers share the same in-flight request (see the sketch after this list)
- Background refresh - `refresh()` returns a promise for fresh data (or error), keeps stale value in cache on error
- Request batching - `ProjectedLazyMap` batches individual lookups into single batch requests
- Pluggable cache - use built-in `Map`, LRU cache, or any custom implementation
- Deep freeze protection - optionally freeze returned objects to prevent accidental mutations
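A minimal sketch of the deduplication behavior, assuming a `ProjectedValue` wrapping a counted fetch (the `/api/session` endpoint and the `fetchCount` counter are made up for illustration):

```ts
import { ProjectedValue } from 'projected';

// hypothetical counter, only here to make the deduplication visible
let fetchCount = 0;

const session = new ProjectedValue({
  value: async () => {
    fetchCount++;
    const response = await fetch('/api/session'); // made-up endpoint
    return response.json();
  },
});

// both consumers call get() before the first fetch resolves,
// so they share the same in-flight request
const [a, b] = await Promise.all([session.get(), session.get()]);
console.log(fetchCount); // 1 - only one network call was made
```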
| Class | Use Case | Fetches |
|---|---|---|
| `ProjectedValue<V>` | Single value (e.g., config, user session) | Once, on first access |
| `ProjectedMap<K, V>` | Small collections (e.g., categories, countries) | All items at once |
| `ProjectedLazyMap<K, V>` | Large collections (e.g., users, products) | Only requested items, batched |
Caches a single value fetched from a remote source.
```ts
import { ProjectedValue } from 'projected';
const config = new ProjectedValue({
value: async () => {
const response = await fetch('/api/config');
return response.json();
},
});
// first call - fetches from remote
const result1 = await config.get();
// second call - returns T (not Promise), but await still works
const result2 = await config.get();
```

| Method | Return Type | Description |
|---|---|---|
| `get()` | `T \| Promise<T>` | Returns cached value (sync) or fetches and caches (async) |
| `refresh()` | `Promise<T>` | Fetches fresh value, updates cache, rejects on error |
| `clear()` | `void` | Clears cache, next `get()` will fetch fresh |
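A short sketch of `refresh()` and `clear()`, continuing the `config` example above (the error handler is illustrative):

```ts
// force a fetch now; the cache keeps the previous value if this rejects
await config.refresh().catch((err) => console.error('config refresh failed', err));

// drop the cached value; the next get() fetches fresh data again
config.clear();
const latest = await config.get();
```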
Methods return T | Promise<T>. You can always use await - it works on both:
```ts
// always works, regardless of cache state
const value = await config.get();
```

For performance-critical code, check with `instanceof Promise` to avoid async overhead:
```ts
const result = config.get();
if (result instanceof Promise) {
// first call or after clear() - need to await
const value = await result;
} else {
// cached - use directly, no async overhead
console.log(result);
}
```

Caches an entire collection, fetched all at once. Unlike `ProjectedValue`, it allows accessing individual items by key. Unlike `ProjectedLazyMap`, it provides access to all items at once. Best for small, frequently-accessed collections where you need both.
```ts
import { ProjectedMap } from 'projected';
type Country = { code: string; name: string };
const countries = new ProjectedMap<string, Country>({
key: (country) => country.code,
values: async () => {
const response = await fetch('/api/countries');
return response.json();
},
});
// fetches all countries, caches them
await countries.getByKey('US'); // { code: 'US', name: 'United States' }
// all subsequent calls return T (not Promise) - can still use await
await countries.getByKey('DE');
await countries.getByKeys(['FR', 'IT']);
await countries.getAll();
await countries.getAllAsMap();
```

| Method | Return Type | Description |
|---|---|---|
| `getByKey(key)` | `T \| undefined \| Promise<...>` | Get single item by key |
| `getByKeys(keys)` | `T[] \| Promise<T[]>` | Get multiple items (skips missing) |
| `getByKeysSparse(keys)` | `(T \| undefined)[] \| Promise<...>` | Get multiple items (keeps order, undefined for missing) |
| `getAll()` | `T[] \| Promise<T[]>` | Get all items as array |
| `getAllAsMap()` | `Map<K, T> \| Promise<...>` | Get all items as Map |
| `get(keyOrKeys)` | mixed | Shorthand for `getByKey` or `getByKeys` |
| `refresh()` | `Promise<Map<K, T>>` | Fetches fresh map, updates cache, rejects on error |
| `clear()` | `void` | Clears cache |
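For example, `getByKeysSparse()` preserves input order and the `get()` shorthand dispatches on its argument. A sketch continuing the countries example above (the 'XX' key is made up to show a missing entry):

```ts
// keeps order, with undefined for missing keys
await countries.getByKeysSparse(['US', 'XX', 'DE']);
// => [{ code: 'US', ... }, undefined, { code: 'DE', ... }]

// get() forwards to getByKey for a single key...
await countries.get('US');
// ...and to getByKeys for an array of keys
await countries.get(['US', 'DE']);
```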
Fetches items on-demand with automatic request batching. Best for large collections where you only need specific items.
```ts
import { ProjectedLazyMap } from 'projected';
type User = { id: string; name: string };
const users = new ProjectedLazyMap<string, User>({
key: (user) => user.id,
values: async (ids) => {
// called with batched ids, e.g., ['user1', 'user2', 'user3']
const response = await fetch(`/api/users?ids=${ids.join(',')}`);
return response.json();
},
delay: 50, // batch requests within 50ms window (default)
maxChunkSize: 1000, // max items per batch (default)
cache: true, // use built-in Map cache (default)
});
// these three calls within 50ms get batched into one request
const [user1, user2, user3] = await Promise.all([
users.getByKey('user1'),
users.getByKey('user2'),
users.getByKey('user3'),
]);
// subsequent calls for cached users return T (not Promise)
await users.getByKey('user1');
```

When multiple `getByKey()` calls happen within the `delay` window, they're combined into a single `values()` call:
```ts
// all these calls within 50ms...
users.getByKey('a');
users.getByKey('b');
users.getByKey('c');
// ...result in one values() call with ['a', 'b', 'c']
```

Use any cache implementing the `ProjectedMapCache` interface (compatible with `Map`, `lru-cache`, etc.):
```ts
import { LRUCache } from 'lru-cache';
const users = new ProjectedLazyMap<string, User>({
key: (user) => user.id,
values: async (ids) => fetchUsers(ids),
cache: new LRUCache({ max: 1000 }),
});
```

Disable caching:
```ts
const users = new ProjectedLazyMap<string, User>({
key: (user) => user.id,
values: async (ids) => fetchUsers(ids),
cache: false,
});
```

| Method | Return Type | Description |
|---|---|---|
| `getByKey(key)` | `T \| undefined \| Promise<...>` | Get single item (sync if cached) |
| `getByKeys(keys)` | `T[] \| Promise<T[]>` | Get multiple items (sync if all cached) |
| `getByKeysSparse(keys)` | `(T \| undefined)[] \| Promise<...>` | Get multiple items preserving order |
| `get(keyOrKeys)` | mixed | Shorthand for `getByKey` or `getByKeys` |
| `refresh(keyOrKeys)` | `Promise<T \| undefined \| (T \| undefined)[]>` | Fetches fresh value(s), updates cache on success |
| `delete(keyOrKeys)` | `void` | Removes item(s) from cache |
| `clear()` | `void` | Clears entire cache |
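A sketch of per-key cache maintenance with `refresh()` and `delete()`, continuing the users example above (the error handler is illustrative):

```ts
// refetch specific entries; stale values stay cached if the fetch fails
await users.refresh(['user1', 'user2']).catch((err) => console.error('user refresh failed', err));

// evict entries; the next getByKey() for these ids fetches again
users.delete(['user1', 'user2']);
users.delete('user3');
```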
All three classes support these options:
| Option | Type | Default | Description |
|---|---|---|---|
| `protection` | `'freeze' \| 'none'` | `'none'` | Deep freeze returned objects |
| `cache` | `boolean \| ProjectedMapCache` | `true` | Enable/disable or provide custom cache |
Enable `protection: 'freeze'` to prevent accidental mutations:

```ts
const config = new ProjectedValue({
value: async () => ({ setting: 'value' }),
protection: 'freeze',
});
const result = await config.get();
result.setting = 'new'; // throws TypeError in strict mode
```

All classes implement `refresh()` for cache updates with error visibility:
```ts
// triggers fetch and returns promise
const freshValue = await users.refresh('user1');
// or fire-and-forget with error handling
users.refresh('user1').catch((err) => logger.error('refresh failed', err));
// stale-while-revalidate: get cached value, then refresh in background
const stale = users.getByKey('user1'); // sync if cached
users.refresh('user1').catch(handleError); // background refresh
```

Key behaviors:
- Returns a Promise that resolves to the fresh value
- On error: rejects the promise, but keeps stale value in cache
- Multiple `refresh()` calls during a fetch share the same promise (see the sketch below)
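A sketch of those behaviors, continuing the users example above (the `.catch` handlers are illustrative):

```ts
// concurrent refresh() calls for the same key share one in-flight request
const [r1, r2] = await Promise.all([
  users.refresh('user1').catch(() => undefined),
  users.refresh('user1').catch(() => undefined),
]);

// even if the refresh rejected, the previously cached value is still served
const current = users.getByKey('user1'); // sync if it was already cached
```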
For server applications that prefetch data at startup, you can guarantee sync access by:
- Prefetching data during initialization
- Using `clear()` + background refetch instead of blocking refresh
- Asserting sync returns in your getters
```ts
const map = new ProjectedMap<string, Category>({
protection: 'freeze',
key: (v) => v.id,
values: () => fetchCategories(),
});
// on startup - prefetch and wait
export async function init() {
await map.getAll();
// subscribe to changes - clear triggers lazy refetch on next access
// or use refresh() for immediate background refetch
dataChanges$.subscribe(() => map.clear());
}
// after init, these always return sync - use assertion for pure return types
export function getAllCategories(): Category[] {
const result = map.getAll();
if (result instanceof Promise) {
throw new Error('Categories not initialized');
}
return result;
}
export function getCategory(id: string): Category | undefined {
const result = map.getByKey(id);
if (result instanceof Promise) {
throw new Error('Categories not initialized');
}
return result;
}
```

This gives you pure sync return types with no MaybePromise wrapper, while still benefiting from the caching and refresh infrastructure.
Full type inference is supported:
```ts
const map = new ProjectedMap({
key: (item: { id: string }) => item.id,
values: async () => [{ id: '1', name: 'test' }],
});
// result is { id: string; name: string } | undefined
const result = await map.getByKey('1');
```