
ts-cache-mongoose

Cache query and aggregate in mongoose using in-memory or redis


Motivation

ts-cache-mongoose is a plugin for mongoose that caches queries and aggregations to improve application performance. It supports both in-memory and Redis cache engines, works with all major Node.js frameworks, and is easy to use via a simple .cache() method on queries and aggregations.

Supported and tested with

{
  "node": "20.x || 22.x || 24.x",
  "mongoose": ">=6.6.0 <10"
}

CI tests against mongoose 6.12.2, 7.6.4, 8.23.0, and 9.4.1.

Features

  • In-memory caching
  • Redis caching
  • Cache expiration
  • Cache invalidation
  • Cache key generation
  • Cache key prefix
  • Query caching
  • Aggregate caching
  • Supports ESM and CommonJS

Installation

mongoose is a peer dependency — install it alongside ts-cache-mongoose.

npm install ts-cache-mongoose mongoose
pnpm add ts-cache-mongoose mongoose
yarn add ts-cache-mongoose mongoose
bun add ts-cache-mongoose mongoose

Example

Works with any Node.js framework — Express, Fastify, Koa, Hono, etc.:

import mongoose from 'mongoose'
import cache from 'ts-cache-mongoose'

// In-memory
cache.init(mongoose, {
  defaultTTL: '60 seconds',
  engine: 'memory',
})

// Or Redis
cache.init(mongoose, {
  defaultTTL: '60 seconds',
  engine: 'redis',
  engineOptions: {
    host: 'localhost',
    port: 6379,
  },
})

mongoose.connect('mongodb://localhost:27017/my-database')
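TTL values like '60 seconds' and '1 hour' are human-readable duration strings. As a rough sketch of how such a string maps to milliseconds (assuming a simple "number unit" grammar; this is illustrative, not the library's actual parser, which may accept more forms):

```typescript
// Illustrative only: converts strings like "60 seconds" or "1 hour"
// to milliseconds. Assumes "<number> <unit>" input.
const UNIT_MS: Record<string, number> = {
  second: 1_000,
  minute: 60_000,
  hour: 3_600_000,
  day: 86_400_000,
}

function ttlToMs(ttl: string): number {
  const match = /^(\d+)\s*(second|minute|hour|day)s?$/.exec(ttl.trim())
  if (match === null) throw new Error(`Unparseable TTL: ${ttl}`)
  return Number(match[1]) * UNIT_MS[match[2]]
}
```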

Query caching

const users = await User.find({ role: 'user' }).cache('10 seconds').exec()
const book = await Book.findById(id).cache('1 hour').exec()
const count = await Book.countDocuments().cache('1 minute').exec()
const authors = await Book.distinct('author').cache('30 seconds').exec()

Aggregate caching

const books = await Book.aggregate([
  { $match: { genre: 'fantasy' } },
  { $group: { _id: '$author', count: { $sum: 1 } } },
]).cache('1 minute').exec()

Bounded in-memory cache

The in-memory engine is unbounded by default. For workloads where query keys are driven by user input (search, filters, pagination), cap the cache so a caller generating unique cache keys cannot grow the map without limit. Two bounds are available and can be combined — eviction is LRU, and whichever bound is hit first triggers it:

cache.init(mongoose, {
  engine: 'memory',
  defaultTTL: '60 seconds',
  maxEntries: 10_000,          // cap by entry count
  maxBytes: 50 * 1024 * 1024,  // cap by serialized bytes (50 MB)
})

By default, maxBytes measures entry size via node:v8.serialize(value).byteLength: it handles circular references (e.g. mongoose populate parent-refs), costs a single native call per set, and works on Node, Bun, and Deno. Provide your own sizeCalculation callback if you prefer an O(1) estimate instead:

cache.init(mongoose, {
  engine: 'memory',
  maxBytes: 50 * 1024 * 1024,
  sizeCalculation: (value) => {
    if (Array.isArray(value)) return value.length * 512
    return 512
  },
})

Eviction is soft: the just-written entry is never dropped, even if its own size exceeds maxBytes. Everything older gets evicted until both bounds are satisfied (or only the new entry remains).
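The eviction policy described above can be sketched with a plain Map, which iterates in insertion order and so can act as a simple LRU. This is an illustration of the semantics, not the plugin's internals:

```typescript
// Illustrative LRU with soft entry/byte bounds: the newest entry is
// always kept; older entries are evicted oldest-first until both caps
// hold (or only the new entry remains). Not the plugin's actual code.
class BoundedCache {
  private map = new Map<string, { value: unknown; bytes: number }>()
  private totalBytes = 0

  constructor(
    private maxEntries: number,
    private maxBytes: number,
    private sizeOf: (v: unknown) => number,
  ) {}

  set(key: string, value: unknown): void {
    const bytes = this.sizeOf(value)
    const old = this.map.get(key)
    if (old !== undefined) this.totalBytes -= old.bytes
    this.map.delete(key)
    this.map.set(key, { value, bytes }) // newest = last in iteration order
    this.totalBytes += bytes
    for (const [k, entry] of this.map) {
      const withinBounds =
        this.map.size <= this.maxEntries && this.totalBytes <= this.maxBytes
      if (withinBounds || this.map.size === 1 || k === key) break
      this.map.delete(k) // evict oldest, never the just-written entry
      this.totalBytes -= entry.bytes
    }
  }

  get(key: string): unknown {
    const entry = this.map.get(key)
    if (entry === undefined) return undefined
    this.map.delete(key) // refresh recency: move to the end
    this.map.set(key, entry)
    return entry.value
  }

  get size(): number {
    return this.map.size
  }
}
```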

Both options are ignored for the Redis engine — use Redis's own maxmemory + maxmemory-policy instead.
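For the Redis engine, the equivalent bound lives in Redis itself. A redis.conf sketch (the values here are examples, not recommendations):

```
maxmemory 50mb
maxmemory-policy allkeys-lru
```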

Custom error handling

By default, cache engine failures (Redis disconnects, serialization errors, etc.) are logged via console.error and the query falls through to the database. Pass an onError callback to route them somewhere else — e.g. a structured logger, Sentry, or a metric counter:

cache.init(mongoose, {
  engine: 'redis',
  defaultTTL: '60 seconds',
  engineOptions: { host: 'localhost', port: 6379 },
  onError: (error) => {
    logger.warn({ err: error }, 'cache engine failure')
  },
})

The callback receives the raw Error. Cache reads and writes never throw — a failing engine degrades to a cache miss.
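The degrade-to-miss behavior can be sketched as follows. Everything here is hypothetical naming (engineGet, safeGet) for illustration, not the plugin's internals:

```typescript
// Illustrative read path: an engine failure is reported via onError
// and treated as a cache miss, so the caller falls through to MongoDB.
type OnError = (error: Error) => void

async function safeGet<T>(
  engineGet: (key: string) => Promise<T | undefined>,
  key: string,
  onError: OnError = (error) => console.error(error),
): Promise<T | undefined> {
  try {
    return await engineGet(key)
  } catch (error) {
    onError(error as Error)
    return undefined // miss: the query runs against the database
  }
}
```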

Cache invalidation

const instance = cache.init(mongoose, { engine: 'memory', defaultTTL: '60 seconds' })

// Clear all cache
await instance.clear()

// Or use custom cache key
const user = await User.findById(id).cache('1 minute', 'user-key').exec()
await instance.clear('user-key')

NestJS (because it's special)

Import CacheModule from ts-cache-mongoose/nest:

import { CacheModule } from 'ts-cache-mongoose/nest'

@Module({
  imports: [
    MongooseModule.forRoot(process.env.MONGO_URI),
    CacheModule.forRoot({
      engine: 'memory',
      defaultTTL: '60 seconds',
    }),
  ],
})
export class AppModule {}

With ConfigService:

CacheModule.forRootAsync({
  inject: [ConfigService],
  useFactory: (config: ConfigService) => ({
    engine: config.get('CACHE_ENGINE', 'memory'),
    defaultTTL: config.get('CACHE_TTL', '60 seconds'),
  }),
})

Inject CacheService for programmatic cache clearing:

import { CacheService } from 'ts-cache-mongoose/nest'

@Injectable()
export class SomeService {
  constructor(private readonly cacheService: CacheService) {}

  async clearUserCache() {
    await this.cacheService.clear('user-cache-key')
  }
}

Contributing

Check CONTRIBUTING.md

License

This project is licensed under the MIT License - see the LICENSE file for details

Check my other projects