Overview
The caching layer provides performance optimization for expensive operations in the documentation generation pipeline. It significantly improves generation speed by caching type analysis results and API resolution operations.

Performance Improvements:
- Type Analysis: 30-50% faster for complex type parsing
- API Resolution: 20-40% faster for cross-reference resolution
- Overall: Significant speedup for large codebases with repetitive patterns
Architecture
Cache Types
TypeAnalysisCache
Purpose: Caches expensive TypeScript type parsing operations
Location: src/cache/TypeAnalysisCache.ts
Caches: Complex type string parsing, object property extraction, union/intersection analysis
Key Benefit: Eliminates redundant parsing of identical type strings across multiple API items

ApiResolutionCache
Purpose: Caches API model cross-reference resolution
Location: src/cache/ApiResolutionCache.ts
Caches: Declaration reference resolution, symbol lookups, cross-file references
Key Benefit: Prevents repeated expensive API model traversals for the same references

CacheManager
Purpose: Centralized coordination of all caching operations
Location: src/cache/CacheManager.ts
Manages: Multiple cache instances, statistics collection, global enable/disable
Key Benefit: Unified interface for cache operations and performance monitoring

TypeAnalysisCache
LRU Cache Implementation
Uses a Least Recently Used (LRU) eviction strategy:
- Default Size: 1000 cached items
- Eviction: Removes oldest items when cache is full
- Thread-Safe: Safe for concurrent access
- Hit Tracking: Monitors cache effectiveness
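The eviction strategy above can be sketched with a `Map`, whose insertion-order iteration makes LRU bookkeeping cheap. This is a minimal illustration, not the actual implementation; the class and field names are assumptions:

```typescript
// Minimal LRU cache sketch: a Map preserves insertion order, so the
// first key in iteration order is always the least recently used one.
class LruCache<K, V> {
  private readonly items = new Map<K, V>();
  hits = 0;    // hit tracking for effectiveness monitoring
  misses = 0;

  constructor(private readonly maxSize: number = 1000) {}

  get(key: K): V | undefined {
    if (!this.items.has(key)) {
      this.misses++;
      return undefined;
    }
    this.hits++;
    // Re-insert to mark the entry as most recently used.
    const value = this.items.get(key)!;
    this.items.delete(key);
    this.items.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    this.items.delete(key);
    this.items.set(key, value);
    if (this.items.size > this.maxSize) {
      // Evict the least recently used entry (first in iteration order).
      const oldest = this.items.keys().next().value as K;
      this.items.delete(oldest);
    }
  }

  get size(): number {
    return this.items.size;
  }
}
```

Note that in a single-threaded Node.js process, "thread-safe" amounts to the cache never being mutated mid-operation, which a synchronous `Map`-based design gives for free.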
Cache Key Generation
Creates deterministic cache keys from type strings.

Examples:
- "string" → "string"
- "Promise<User>" → "Promise<User>"
- "{ name: string; age: number }" → "{ name: string; age: number }"
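The examples above are identity mappings. If the key generation additionally needs to tolerate formatting differences between otherwise identical type strings, a normalization step like the following could be used; this helper is purely illustrative and may differ from the real key strategy:

```typescript
// Hypothetical key normalization: collapse insignificant whitespace so
// "{ name: string }" and "{  name:  string }" share one cache entry.
function typeCacheKey(typeString: string): string {
  return typeString.replace(/\s+/g, " ").trim();
}
```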
Cached Function Wrapper
Provides a utility for caching any type analysis function.
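A wrapper of this kind can be sketched as a generic memoizer. The function name `withTypeCache` and the eviction details are assumptions for illustration; the wrapped function must be pure (same type string, same result) for caching to be sound:

```typescript
// Sketch of a generic memoizing wrapper for a type-analysis function.
function withTypeCache<R>(
  analyze: (typeString: string) => R,
  maxSize = 1000
): (typeString: string) => R {
  const cache = new Map<string, R>();
  return (typeString: string): R => {
    const cached = cache.get(typeString);
    if (cached !== undefined) return cached;
    const result = analyze(typeString);
    if (cache.size >= maxSize) {
      // Simple eviction: drop the oldest entry.
      cache.delete(cache.keys().next().value as string);
    }
    cache.set(typeString, result);
    return result;
  };
}
```

Usage would look like `const cachedParse = withTypeCache(parseTypeString);`, where `parseTypeString` stands in for whatever analysis function is being wrapped.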
ApiResolutionCache
Declaration Reference Caching
Caches API model resolution operations.
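One plausible cache key strategy is sketched below: a declaration reference alone can be ambiguous, so the key also encodes the context item it is resolved against. The function name and key format are assumptions, not the documented implementation:

```typescript
// Hypothetical key strategy: combine the context item's canonical
// reference with the declaration reference text being resolved.
function resolutionCacheKey(
  declarationReference: string,
  contextApiItem: { canonicalReference: string }
): string {
  return `${contextApiItem.canonicalReference}::${declarationReference}`;
}
```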
Cached Resolver Creation
Creates cached wrappers for API resolution functions.
CacheManager
Centralized Cache Coordination
Manages multiple cache instances with a unified interface.

Key Features:
- Global enable/disable for all caches
- Individual cache configuration
- Unified statistics collection
- Environment-specific presets
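The features above can be sketched as a small coordinator class. The interface and method names here are assumptions for illustration; the real `CacheManager` may expose a different API:

```typescript
// Sketch of a centralized manager coordinating several caches behind
// one enable/disable switch and a unified statistics snapshot.
interface ManagedCache {
  hits: number;
  misses: number;
  clear(): void;
}

class CacheManager {
  private enabled = true;
  private readonly caches = new Map<string, ManagedCache>();

  register(name: string, cache: ManagedCache): void {
    this.caches.set(name, cache);
  }

  setEnabled(enabled: boolean): void {
    this.enabled = enabled;
    // Disabling also clears, so stale entries never resurface.
    if (!enabled) this.caches.forEach((c) => c.clear());
  }

  isEnabled(): boolean {
    return this.enabled;
  }

  getStats(): Record<string, { hits: number; misses: number; hitRate: number }> {
    const stats: Record<string, { hits: number; misses: number; hitRate: number }> = {};
    this.caches.forEach((cache, name) => {
      const total = cache.hits + cache.misses;
      stats[name] = {
        hits: cache.hits,
        misses: cache.misses,
        hitRate: total === 0 ? 0 : cache.hits / total,
      };
    });
    return stats;
  }
}
```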
Environment-Specific Configurations
Provides optimized configurations for different environments.

Rationale:
- Development: Smaller caches, statistics enabled for debugging
- Production: Larger caches, statistics disabled for performance
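The rationale above might translate into presets like the following. The exact sizes and field names are illustrative assumptions, not the shipped defaults:

```typescript
// Hypothetical environment presets reflecting the rationale above.
interface CacheConfig {
  typeAnalysisCacheSize: number;
  apiResolutionCacheSize: number;
  collectStatistics: boolean;
}

const CACHE_PRESETS: Record<"development" | "production", CacheConfig> = {
  development: {
    typeAnalysisCacheSize: 500,  // smaller caches
    apiResolutionCacheSize: 250,
    collectStatistics: true,     // enabled for debugging
  },
  production: {
    typeAnalysisCacheSize: 2000, // larger caches
    apiResolutionCacheSize: 1000,
    collectStatistics: false,    // disabled for performance
  },
};
```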
Statistics and Monitoring
Comprehensive cache performance monitoring is built in.
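A statistics report could take a shape like the sketch below; the field names and report format are assumptions, and the real implementation may differ:

```typescript
// Hypothetical shape of a per-cache statistics snapshot.
interface CacheStats {
  hits: number;
  misses: number;
  hitRate: number; // hits / (hits + misses)
  size: number;    // current number of cached entries
  maxSize: number;
}

// Render one cache's stats as a human-readable log line.
function formatStats(name: string, s: CacheStats): string {
  return `${name}: ${s.hits} hits, ${s.misses} misses ` +
    `(${(s.hitRate * 100).toFixed(1)}% hit rate), ${s.size}/${s.maxSize} entries`;
}
```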
Integration Points
1. ObjectTypeAnalyzer Integration
ObjectTypeAnalyzer automatically uses TypeAnalysisCache.

2. CustomMarkdownEmitter Integration
CustomMarkdownEmitter uses ApiResolutionCache for link resolution.

3. MarkdownDocumenter Integration
MarkdownDocumenter coordinates all caching through CacheManager.

Performance Characteristics
Cache Hit Rates
Typical Performance:
- Type Analysis: 40-60% hit rate
- API Resolution: 30-50% hit rate
- Overall: Significant speedup for large codebases
Hit rates depend on:
- Codebase size and complexity
- Repetitive type patterns
- Cross-reference density
Memory Usage
Configurable Limits:
- TypeAnalysisCache: Default 1000 items
- ApiResolutionCache: Default 500 items
- Memory-efficient LRU eviction
Tuning Options:
- Environment-specific presets
- Adjustable cache sizes
- Automatic cleanup
Speed Improvements
Measured Benefits:
- Type Analysis: 30-50% faster
- API Resolution: 20-40% faster
- Large Projects: Up to 2x improvement
Biggest gains with:
- Complex generic types
- Repeated interface definitions
- Dense cross-references
Best Practices
Cache Size Tuning
Development:
- Smaller caches (500-1000 items)
- Enable statistics for monitoring
- Focus on debugging cache behavior
Production:
- Larger caches (1000-2000 items)
- Disable statistics for performance
- Monitor hit rates in logs
Large Projects:
- Increase TypeAnalysisCache size
- Consider separate cache instances
- Monitor memory usage
When to Disable Caching
Disable for:
- Testing cache behavior
- Debugging type analysis issues
- Memory-constrained environments
- Very small projects (minimal benefit)
Monitoring Performance
Enable statistics, then interpret the results:
- Hit Rate > 40%: Good cache effectiveness
- Hit Rate < 20%: Consider tuning or disabling
- Memory Usage: Monitor cache sizes vs. benefit
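The interpretation thresholds above can be captured in a small helper; this is a sketch of the guidance, not an API the library provides:

```typescript
// Encode the hit-rate guidance: > 40% is good, < 20% warrants tuning
// or disabling, anything in between is worth monitoring.
function assessHitRate(hitRate: number): string {
  if (hitRate > 0.4) return "good: keep caching enabled";
  if (hitRate < 0.2) return "poor: consider tuning cache sizes or disabling";
  return "moderate: monitor and tune cache sizes";
}
```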
Troubleshooting
Low Hit Rates
Symptoms: Cache statistics show < 20% hit rate

Causes:
- Too many unique type patterns
- Insufficient repetitive structures
- Cache size too small
Solutions:
- Increase cache size
- Analyze type diversity
- Consider disabling for small projects
Memory Issues
Symptoms: High memory usage during generation

Causes:
- Cache sizes too large
- Retaining too many cached items
- Memory leaks in cache implementation
Solutions:
- Reduce cache sizes
- Enable LRU eviction
- Monitor memory usage patterns
Cache Corruption
Symptoms: Incorrect type analysis or resolution results

Causes:
- Hash collisions in cache keys
- Stale cache data
- Race conditions (rare)
Solutions:
- Clear caches between runs
- Check cache key generation
- Validate cached data integrity

