
🟑 High: Add memory limits to query cache system (unbounded growth) #13

@jas88


Issue Summary

RDMP's query cache system has unbounded growth with no memory limits, eviction policies, or size management, causing memory leaks and performance degradation over time.

🚨 High Impact

  • Memory Leaks: Cache grows indefinitely without cleanup
  • Performance Degradation: Cache lookup performance degrades with size
  • OutOfMemory Risk: Long-running applications eventually crash
  • Resource Waste: Unused cached data consumes valuable memory

πŸ” Current Problems

1. Unbounded DataTable Caching

// CachedAggregateConfigurationResultsManager.cs - Lines 171-185
private readonly Dictionary<string, DataTable> _cache = new();

public void CacheResult(string key, DataTable result)
{
    _cache[key] = result; // Never expires!
    // No size limits, no eviction, no monitoring
}

Problem: DataTables are cached indefinitely, and a single table can be gigabytes in size

2. RowVerCache with Global Lock

// RowVerCache.cs
private readonly List<T> _cachedObjects = new();
private readonly object _oLockCachedObjects = new object();

public List<T> GetAllObjects()
{
    // Contention falls back to a full table scan
    if (Broken || !Monitor.TryEnter(_oLockCachedObjects))
        return _repository.GetAllObjectsNoCache<T>().ToList();

    try
    {
        // Cache keeps growing without limits -- nothing is ever removed
        _cachedObjects.AddRange(_repository.GetAllObjectsNoCache<T>());
        return _cachedObjects;
    }
    finally
    {
        Monitor.Exit(_oLockCachedObjects);
    }
}

Problem: A single global lock serializes access while the cache grows without limit

3. No Eviction Strategy

// Multiple locations cache without removal:
static readonly Dictionary<Type, Type[]> TypeCache = new();
static readonly Dictionary<string, Exception> badAssemblies = new();

Problem: These caches grow until the application restarts

4. No Memory Pressure Handling

// No monitoring of memory usage
// No cleanup when memory is low
// No size-based eviction

πŸ“Š Memory Growth Analysis

| Runtime  | Cache Size | Memory Usage | Performance Impact |
| -------- | ---------- | ------------ | ------------------ |
| 1 hour   | ~50 MB     | Acceptable   | βœ… Good            |
| 4 hours  | ~200 MB    | Moderate     | ⚠️ Slower          |
| 8 hours  | ~800 MB    | High         | ❌ Poor            |
| 24 hours | ~5 GB+     | Critical     | ❌ Unusable        |

πŸ›  Recommended Solution

1. Memory-Limited Cache with LRU Eviction

public class MemoryLimitedCache<TKey, TValue> where TValue : class
{
    private readonly long _maxMemoryBytes;
    private readonly ConcurrentDictionary<TKey, CacheItem> _cache = new();
    private long _currentMemoryUsage;
    private readonly object _evictionLock = new object();

    private sealed record CacheItem(TValue Value, long Size, DateTime LastAccessed);

    public MemoryLimitedCache(long maxMemoryBytes) => _maxMemoryBytes = maxMemoryBytes;

    public bool TryAdd(TKey key, TValue value)
    {
        var estimatedSize = EstimateSize(value); // rough size estimate (helper not shown)

        // Evict before admitting a value that would push us over the limit
        if (Interlocked.Read(ref _currentMemoryUsage) + estimatedSize > _maxMemoryBytes)
            EvictLeastRecentlyUsed(estimatedSize);

        var item = new CacheItem(value, estimatedSize, DateTime.UtcNow);
        if (_cache.TryAdd(key, item))
        {
            Interlocked.Add(ref _currentMemoryUsage, estimatedSize);
            return true;
        }
        return false;
    }

    private void EvictLeastRecentlyUsed(long spaceNeeded)
    {
        lock (_evictionLock)
        {
            // Remove least-recently-used entries until the new value fits
            foreach (var pair in _cache.OrderBy(kvp => kvp.Value.LastAccessed))
            {
                if (Interlocked.Read(ref _currentMemoryUsage) + spaceNeeded <= _maxMemoryBytes)
                    break;

                if (_cache.TryRemove(pair.Key, out var removed))
                {
                    Interlocked.Add(ref _currentMemoryUsage, -removed.Size);
                    // Dispose evicted values that own resources (e.g. DataTable)
                    (removed.Value as IDisposable)?.Dispose();
                }
            }
        }
    }
}
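
The `EstimateSize` helper the cache relies on is never defined in this proposal. A minimal sketch might look like the following; the per-cell and per-object constants are rough assumptions for illustration, not measured values, and should be tuned against profiler data:

```csharp
using System;
using System.Data;

public static class CacheSizeEstimator
{
    // Order-of-magnitude size estimate. Consistency matters more than
    // accuracy here: the cache only needs a comparable number per entry.
    public static long EstimateSize(object value) => value switch
    {
        DataTable dt => (long)dt.Rows.Count * dt.Columns.Count * 32, // ~32 bytes/cell (assumed)
        string s => 24 + (long)s.Length * sizeof(char),              // object header + UTF-16 chars
        Array arr => 24 + arr.LongLength * IntPtr.Size,              // header + reference slots
        _ => 64                                                      // fallback for small objects
    };
}
```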

2. Memory Pressure Monitoring

public class MemoryPressureMonitor
{
    private readonly Timer _monitoringTimer;
    private readonly long _warningThresholdBytes;
    private readonly long _criticalThresholdBytes;

    public MemoryPressureMonitor()
    {
        var totalAvailable = GC.GetGCMemoryInfo().TotalAvailableMemoryBytes;
        _warningThresholdBytes = (long)(totalAvailable * 0.7);
        _criticalThresholdBytes = (long)(totalAvailable * 0.85);

        _monitoringTimer = new Timer(MonitorMemoryPressure, null, TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(1));
    }

    private void MonitorMemoryPressure(object state)
    {
        var memoryUsage = GC.GetTotalMemory(false);

        if (memoryUsage > _criticalThresholdBytes) // Critical: >85% of available memory
        {
            ForceCacheEviction(0.5); // Evict 50% of cached data
            GC.Collect();
        }
        else if (memoryUsage > _warningThresholdBytes) // Warning: >70%
        {
            ForceCacheEviction(0.2); // Evict 20% of cached data
        }
    }
}
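
`ForceCacheEviction` is called above but never shown. Assuming caches register themselves with the monitor, one hypothetical shape (the `IEvictableCache` interface and `CacheEvictionRegistry` class are invented here for illustration, not existing RDMP types):

```csharp
using System;
using System.Collections.Generic;

public interface IEvictableCache
{
    int Count { get; }
    void EvictOldest(int count); // remove the n least-recently-used entries
}

public sealed class CacheEvictionRegistry
{
    private readonly List<IEvictableCache> _caches = new();

    public void Register(IEvictableCache cache) => _caches.Add(cache);

    // Evict roughly the given fraction (0..1) of entries from every registered cache
    public void ForceCacheEviction(double fraction)
    {
        foreach (var cache in _caches)
            cache.EvictOldest((int)Math.Ceiling(cache.Count * fraction));
    }
}
```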

3. Bounded Query Cache

public class BoundedQueryCache
{
    private readonly MemoryLimitedCache<string, DataTable> _dataTableCache;
    private readonly MemoryLimitedCache<string, Type[]> _typeCache;
    private readonly MemoryLimitedCache<string, Exception> _errorCache;

    public BoundedQueryCache(long maxCacheSizeBytes = 100 * 1024 * 1024) // 100MB default
    {
        // Allocate the byte budget proportionally across cache types
        var dataTableCacheSize = (long)(maxCacheSizeBytes * 0.6); // 60% for DataTables
        var typeCacheSize = (long)(maxCacheSizeBytes * 0.3); // 30% for type arrays
        var errorCacheSize = (long)(maxCacheSizeBytes * 0.1); // 10% for errors

        _dataTableCache = new MemoryLimitedCache<string, DataTable>(dataTableCacheSize);
        _typeCache = new MemoryLimitedCache<string, Type[]>(typeCacheSize);
        _errorCache = new MemoryLimitedCache<string, Exception>(errorCacheSize);
    }
}

4. Cache Statistics and Monitoring

public class CacheStatistics
{
    // Interlocked requires fields, not auto-properties
    private long _totalHits;
    private long _totalMisses;
    private long _evictions;

    public long TotalHits => Interlocked.Read(ref _totalHits);
    public long TotalMisses => Interlocked.Read(ref _totalMisses);
    public long Evictions => Interlocked.Read(ref _evictions);
    public long CurrentMemoryUsage { get; set; }
    public int CurrentItemCount { get; set; }

    public double HitRate => TotalHits + TotalMisses > 0 ? (double)TotalHits / (TotalHits + TotalMisses) : 0;

    public void RecordHit() => Interlocked.Increment(ref _totalHits);
    public void RecordMiss() => Interlocked.Increment(ref _totalMisses);
    public void RecordEviction() => Interlocked.Increment(ref _evictions);
}
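
One way to wire the statistics into lookups is a small wrapper around any dictionary-style cache. This is a hypothetical helper, not existing RDMP API; in practice `onHit`/`onMiss` would be `CacheStatistics.RecordHit`/`RecordMiss`:

```csharp
using System;
using System.Collections.Generic;

public static class StatsLookup
{
    // Wrap a dictionary-style lookup so every call is counted as a hit or a miss
    public static bool TryGetCounted<TKey, TValue>(
        IDictionary<TKey, TValue> cache, TKey key,
        Action onHit, Action onMiss, out TValue value)
    {
        if (cache.TryGetValue(key, out value))
        {
            onHit();
            return true;
        }
        onMiss();
        return false;
    }
}
```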

5. Smart Cache Key Management

public class SmartCacheKey
{
    public string Query { get; }
    public string[] ParameterHashes { get; }
    public DateTime Created { get; }
    public TimeSpan TimeToLive { get; }

    public bool IsExpired => DateTime.UtcNow - Created > TimeToLive;

    public SmartCacheKey(string query, object[] parameters, TimeSpan ttl)
    {
        Query = query;
        ParameterHashes = parameters?.Select(p => p?.GetHashCode().ToString() ?? "null").ToArray() ?? Array.Empty<string>();
        Created = DateTime.UtcNow;
        TimeToLive = ttl;
    }

    public override bool Equals(object obj)
    {
        return obj is SmartCacheKey other &&
               Query == other.Query &&
               ParameterHashes.SequenceEqual(other.ParameterHashes);
    }

    public override int GetHashCode()
    {
        return HashCode.Combine(Query, string.Join(",", ParameterHashes));
    }
}

🎯 Implementation Plan

Phase 1 (Week 1): Core Cache Infrastructure

  • Implement MemoryLimitedCache<TKey, TValue>
  • Add memory pressure monitoring
  • Create cache statistics tracking

Phase 2 (Week 2): Query Cache Replacement

  • Replace unbounded caches with bounded versions
  • Add LRU eviction policies
  • Implement time-based expiration

Phase 3 (Week 3): Integration and Monitoring

  • Update all cache usage throughout codebase
  • Add cache performance monitoring
  • Create cache management dashboard

Phase 4 (Week 4): Performance Optimization

  • Tune cache sizes and eviction policies
  • Add cache warming strategies
  • Implement cache partitioning by data type

βœ… Acceptance Criteria

  • All caches have memory limits and eviction policies
  • Memory usage stays constant over time (no unbounded growth)
  • Cache hit rate >80% for frequently accessed data
  • Automatic cleanup under memory pressure
  • Cache statistics and monitoring available
  • Performance tests confirm stable memory usage

πŸ” Areas Requiring Updates

High Priority:

  1. CachedAggregateConfigurationResultsManager - DataTable caching
  2. RowVerCache - Type and object caching
  3. MEF Type Cache - Assembly type caching
  4. Query Result Caches - General query caching

Medium Priority:

  1. Error Caches - Exception and error caching
  2. Configuration Caches - Settings and config caching
  3. Plugin Caches - Plugin discovery caching
  4. Metadata Caches - Database metadata caching

πŸ“ˆ Expected Impact

  • Memory Stability: Constant memory usage regardless of runtime
  • Performance: Consistent cache hit rates >80%
  • Reliability: No OutOfMemory exceptions from cache growth
  • Monitoring: Real-time cache performance metrics
  • Automatic Cleanup: Cache responds to memory pressure

πŸ”— Related Issues

  • DataTable memory optimization
  • Memory usage analysis
  • Performance monitoring
  • Resource management improvements
