Module Caching
Kempo Server includes a module caching system that improves performance by keeping JavaScript route modules in memory, so route files are not re-read and re-evaluated on every request.
How It Works
The cache system combines multiple strategies to optimize performance while preventing memory issues:
- LRU (Least Recently Used) - Evicts the least recently used modules when the cache is full
- Time-based expiration - Modules expire after configurable TTL
- Memory monitoring - Automatically clears cache if memory usage gets too high
- File watching - Instantly invalidates cache when files change (development)
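Combined, the first two strategies can be sketched as a small LRU map with TTL checks. The class below is a hypothetical illustration (the names `ModuleCache`, `maxSize`, and `ttlMs` mirror the configuration options but not necessarily Kempo Server's internals):

```javascript
// Minimal sketch of LRU + TTL caching using a Map's insertion order.
// The class shape is illustrative, not Kempo Server's actual API.
class ModuleCache {
  constructor({ maxSize = 100, ttlMs = 300000 } = {}) {
    this.maxSize = maxSize;
    this.ttlMs = ttlMs;
    this.entries = new Map(); // path -> { module, cachedAt }
  }

  get(path) {
    const entry = this.entries.get(path);
    if (!entry) return undefined;
    // Time-based expiration: drop entries older than the TTL.
    if (Date.now() - entry.cachedAt > this.ttlMs) {
      this.entries.delete(path);
      return undefined;
    }
    // LRU bookkeeping: re-insert so this key becomes most recently used.
    this.entries.delete(path);
    this.entries.set(path, entry);
    return entry.module;
  }

  set(path, module) {
    this.entries.delete(path);
    this.entries.set(path, { module, cachedAt: Date.now() });
    if (this.entries.size > this.maxSize) {
      // Evict the least recently used entry (first key in insertion order).
      const lru = this.entries.keys().next().value;
      this.entries.delete(lru);
    }
  }
}
```

Because a JavaScript `Map` iterates in insertion order, re-inserting on every hit makes the first key the least recently used one, which keeps eviction O(1).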
Basic Configuration
Add a cache object to your .config.json file:
```json
{
  "cache": {
    "enabled": true,
    "maxSize": 100,
    "maxMemoryMB": 50,
    "ttlMs": 300000,
    "watchFiles": true
  }
}
```
Configuration Options
| Option | Type | Default | Description |
|---|---|---|---|
| `enabled` | boolean | `true` | Enable/disable the entire caching system |
| `maxSize` | number | `100` | Maximum number of modules to cache |
| `maxMemoryMB` | number | `50` | Memory limit for the cache in megabytes |
| `ttlMs` | number | `300000` | Time to live in milliseconds (5 minutes) |
| `maxHeapUsagePercent` | number | `70` | Clear the cache when heap usage exceeds this percentage |
| `memoryCheckInterval` | number | `30000` | How often to check memory usage (milliseconds) |
| `watchFiles` | boolean | `true` | Auto-invalidate cache entries when files change |
| `enableMemoryMonitoring` | boolean | `true` | Enable automatic memory monitoring |
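A sketch of how these options might be resolved against their defaults when the config file is loaded (the loader function is hypothetical; the default values mirror the table above):

```javascript
// Hypothetical config resolution; default values mirror the options table.
const CACHE_DEFAULTS = {
  enabled: true,
  maxSize: 100,
  maxMemoryMB: 50,
  ttlMs: 300000,
  maxHeapUsagePercent: 70,
  memoryCheckInterval: 30000,
  watchFiles: true,
  enableMemoryMonitoring: true,
};

// Settings from the config file win; anything omitted falls back to a default.
function resolveCacheConfig(userConfig = {}) {
  return { ...CACHE_DEFAULTS, ...(userConfig.cache || {}) };
}
```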
Environment-Specific Configuration
Use separate configuration files for different environments instead of nested objects:
Development Configuration
Create dev.config.json with settings optimized for development:
```json
{
  "cache": {
    "enabled": true,
    "maxSize": 50,
    "maxMemoryMB": 25,
    "ttlMs": 300000,
    "watchFiles": true
  }
}
```
```shell
node src/index.js --config dev.config.json
```
Production Configuration
Create prod.config.json with settings optimized for production:
```json
{
  "cache": {
    "enabled": true,
    "maxSize": 1000,
    "maxMemoryMB": 200,
    "ttlMs": 3600000,
    "watchFiles": false
  }
}
```
```shell
node src/index.js --config prod.config.json
```
Cache Monitoring
Monitor cache performance and manage cached modules at runtime:
View Cache Statistics
```shell
curl http://localhost:3000/_admin/cache
```
Returns detailed statistics including:
- Cache size and memory usage
- Hit/miss rates
- Node.js memory usage
- List of cached files
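Hit rate is simply hits divided by total lookups. A sketch of how such statistics could be assembled (the field names are illustrative, not the endpoint's actual response schema):

```javascript
// Illustrative stats shape; the real /_admin/cache payload may differ.
function cacheStats({ hits, misses, files }) {
  const lookups = hits + misses;
  return {
    size: files.length,
    hits,
    misses,
    // Hit rate = hits / total lookups; 0 before any lookup has happened.
    hitRate: lookups === 0 ? 0 : hits / lookups,
    heapUsedMB: process.memoryUsage().heapUsed / (1024 * 1024),
    files,
  };
}
```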
Clear Cache
```shell
curl -X DELETE http://localhost:3000/_admin/cache
```
Immediately clears all cached modules.
Performance Tuning
Memory Settings
- Set `maxMemoryMB` to 10-20% of available server RAM
- Use a lower `maxHeapUsagePercent` (60-70%) for memory-constrained environments
- Monitor actual usage with the `/_admin/cache` endpoint
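The heap-usage guidance above can be checked programmatically with Node's `process.memoryUsage()`. Below is a sketch of a periodic monitor, assuming a cache object with a `clear()` method; the option names mirror the configuration table, but the wiring is an assumption, not Kempo Server's implementation:

```javascript
// Sketch of memory monitoring: clear the cache when heap usage is too high.
function heapUsagePercent() {
  const { heapUsed, heapTotal } = process.memoryUsage();
  return (heapUsed / heapTotal) * 100;
}

// `cache` is any object with a clear() method (e.g. a Map).
function startMemoryMonitor(
  cache,
  { maxHeapUsagePercent = 70, memoryCheckInterval = 30000 } = {}
) {
  const timer = setInterval(() => {
    // Clear everything once heap usage crosses the configured threshold.
    if (heapUsagePercent() > maxHeapUsagePercent) cache.clear();
  }, memoryCheckInterval);
  timer.unref(); // don't keep the process alive just for monitoring
  return () => clearInterval(timer); // call to stop monitoring
}
```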
Cache Size
- Start with `maxSize` = 10x your typical number of concurrent routes
- Increase it if you have many route files and sufficient memory
- Decrease for microservices with few routes
TTL Settings
- Development: Short TTL (5-10 minutes) for quick iteration
- Production: Longer TTL (30-60 minutes) for stability
- High-change environments: Shorter TTL or rely on file watching
File Watching
- Enable file watching (`watchFiles: true`) in development for instant invalidation
- Disable it (`watchFiles: false`) in production to reduce overhead
Configuration Examples
Minimal Configuration
Just enable caching with defaults:
```json
{
  "cache": {
    "enabled": true
  }
}
```
High-Traffic Configuration
For servers with lots of routes and traffic:
```json
{
  "cache": {
    "enabled": true,
    "maxSize": 2000,
    "maxMemoryMB": 500,
    "ttlMs": 7200000,
    "watchFiles": false
  }
}
```
Memory-Constrained Configuration
For servers with limited RAM:
```json
{
  "cache": {
    "enabled": true,
    "maxSize": 25,
    "maxMemoryMB": 10,
    "ttlMs": 120000,
    "maxHeapUsagePercent": 60
  }
}
```
Troubleshooting
Cache Not Working
- Verify `cache.enabled` is `true`
- Check file permissions for file watching
- Review server logs for cache statistics
Memory Issues
- Reduce `maxMemoryMB` and `maxSize`
- Lower `maxHeapUsagePercent`
- Enable more aggressive memory monitoring (a shorter `memoryCheckInterval`)
Performance Issues
- Increase cache limits if you have available memory
- Extend `ttlMs` for stable route files
- Monitor hit rates with the admin endpoints