FastCache

April 6, 2026

A 7x-10x faster alternative to MemoryCache. A high-performance, lightweight (8 KB DLL), thread-safe memory cache for .NET Core (.NET 6 and later).


TL;DR

Basically it's just a ConcurrentDictionary with expiration.
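To make that one-liner concrete, here is a minimal sketch of the idea — a `ConcurrentDictionary` whose values carry an expiration timestamp. This is an illustration of the concept, not FastCache's actual implementation:

```csharp
using System;
using System.Collections.Concurrent;

// Conceptual sketch: a concurrent dictionary plus a per-entry expiration time.
public class TinyCache<TKey, TValue>
{
    private readonly ConcurrentDictionary<TKey, (TValue Value, long ExpiresAt)> _dict = new();

    public void AddOrUpdate(TKey key, TValue value, TimeSpan ttl) =>
        _dict[key] = (value, Environment.TickCount64 + (long)ttl.TotalMilliseconds);

    public bool TryGet(TKey key, out TValue value)
    {
        // An entry counts as a hit only if it exists and has not expired yet.
        if (_dict.TryGetValue(key, out var entry) && Environment.TickCount64 < entry.ExpiresAt)
        {
            value = entry.Value;
            return true;
        }
        value = default;
        return false;
    }
}
```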

Benchmarks

Windows:

| Method               | Mean      | Error     | StdDev   | Gen0   | Allocated |
|----------------------|-----------|-----------|----------|--------|-----------|
| DictionaryLookup     | 65.38 ns  | 1.594 ns  | 0.087 ns | -      | -         |
| FastCacheLookup      | 67.15 ns  | 2.582 ns  | 0.142 ns | -      | -         |
| MemoryCacheLookup    | 426.60 ns | 60.162 ns | 3.298 ns | 0.0200 | 128 B     |
| FastCacheGetOrAdd    | 44.31 ns  | 1.170 ns  | 0.064 ns | -      | -         |
| MemoryCacheGetOrAdd  | 826.85 ns | 36.609 ns | 2.007 ns | 0.1879 | 1184 B    |
| FastCacheAddRemove   | 99.97 ns  | 12.040 ns | 0.660 ns | 0.0063 | 80 B      |
| MemoryCacheAddRemove | 710.70 ns | 32.415 ns | 1.777 ns | 0.0515 | 328 B     |

Linux (Ubuntu, Docker):

| Method               | Mean        | Error      | StdDev    | Gen0   | Allocated |
|----------------------|-------------|------------|-----------|--------|-----------|
| FastCacheLookup      | 94.97 ns    | 3.250 ns   | 0.178 ns  | -      | -         |
| MemoryCacheLookup    | 1,051.69 ns | 64.904 ns  | 3.558 ns  | 0.0191 | 128 B     |
| FastCacheAddRemove   | 148.32 ns   | 25.766 ns  | 1.412 ns  | 0.0076 | 80 B      |
| MemoryCacheAddRemove | 1,120.75 ns | 767.666 ns | 42.078 ns | 0.0515 | 328 B     |

How is FastCache better?

Compared to System.Runtime.Caching.MemoryCache and Microsoft.Extensions.Caching.MemoryCache, FastCache offers:

  • 7x faster reads (11x under Linux!)
  • 10x faster writes
  • Thread-safe and atomic operations
  • Generic (strongly typed keys and values) to avoid boxing/unboxing of primitive types

By contrast, MemoryCache:

  • supports string keys only, so it allocates a string for every key
  • ships with performance counters that can't be turned off
  • uses heuristics and black magic to evict keys under memory pressure
  • uses more memory and can crash during a key scan

Usage

Install via NuGet:

Install-Package Jitbit.FastCache

Then use:

var cache = new FastCache<string, int>();

cache.AddOrUpdate(
	key: "answer",
	value: 42,
	ttl: TimeSpan.FromMinutes(1));

cache.TryGet("answer", out int value); //value is 42

//factory pattern! calls the expensive factory only if not cached yet
cache.GetOrAdd(
	key: "answer",
	valueFactory: k => 42,
	ttl: TimeSpan.FromMilliseconds(100));

//handy overload to prevent captures/closures allocation
cache.GetOrAdd(
	key: "answer",
	valueFactory: (k, arg) => 42 + arg.Length,
	ttl: TimeSpan.FromMilliseconds(100),
	factoryArgument: "some state data");

Tradeoffs

FastCache uses Environment.TickCount to monitor items' TTL. Environment.TickCount is 104x faster than DateTime.Now and 26x faster than DateTime.UtcNow.

But Environment.TickCount is an Int32, so it wraps around to int.MinValue once it overflows. That by itself is handled by a workaround, but it also means you cannot cache items for more than 25 days (2.4 billion milliseconds).

The above is no longer valid: FastCache now targets .NET 6 and uses Environment.TickCount64, which is free of this problem.

Another tradeoff: MemoryCache watches memory usage and evicts items once it senses memory pressure. FastCache does none of that; it is up to you to keep your caches reasonably sized. After all, it's just a dictionary.
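Since FastCache never evicts under memory pressure, one way to keep it bounded is a simple size guard before inserts. This is a hypothetical pattern (the `MaxItems` limit and `AddBounded` helper are illustrative, not part of the library):

```csharp
using System;
using Jitbit.FastCache;

var cache = new FastCache<string, byte[]>();
const int MaxItems = 100_000; // illustrative limit, tune for your workload

void AddBounded(string key, byte[] value)
{
    if (cache.Count > MaxItems)
    {
        // Drop expired-but-uncollected entries first; Clear() would be the blunt fallback.
        cache.EvictExpired();
    }
    cache.AddOrUpdate(key, value, TimeSpan.FromMinutes(5));
}
```

Note that `Count` includes expired items that the background cleanup job has not collected yet, which is why calling `EvictExpired()` first can bring the count back under the limit without discarding live entries.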

API Reference

FastCache<TKey, TValue>

Implements IEnumerable<KeyValuePair<TKey, TValue>>, IDisposable.

Constructor

FastCache(int cleanupJobInterval = 10000, EvictionCallback itemEvicted = null)

Creates a new empty cache instance.

| Parameter          | Type             | Default | Description                                                   |
|--------------------|------------------|---------|---------------------------------------------------------------|
| cleanupJobInterval | int              | 10000   | Background cleanup interval in milliseconds                   |
| itemEvicted        | EvictionCallback | null    | Optional callback when an item is evicted (runs on thread pool) |

Methods

AddOrUpdate(TKey key, TValue value, TimeSpan ttl)

Adds an item to the cache or updates it if it already exists. Updating resets the TTL (sliding expiration).

AddOrUpdate(TKey key, Func<TKey, TValue> addValueFactory, Func<TKey, TValue, TValue> updateValueFactory, TimeSpan ttl)

Factory overload. Uses addValueFactory when the key is new, updateValueFactory when it exists.
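A hedged example of this overload — an atomic hit counter, where the add factory seeds the value and the update factory increments it (the key name is illustrative):

```csharp
using System;
using Jitbit.FastCache;

var hits = new FastCache<string, int>();

hits.AddOrUpdate(
    key: "page1",
    addValueFactory: k => 1,                  // key not cached yet: start at 1
    updateValueFactory: (k, old) => old + 1,  // key already cached: increment
    ttl: TimeSpan.FromHours(1));
```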

bool TryGet(TKey key, out TValue value)

Attempts to get a value by key. Returns true if found and not expired.

bool TryAdd(TKey key, TValue value, TimeSpan ttl)

Attempts to add a key/value item. Returns false if the key already exists (and is not expired).

TValue GetOrAdd(TKey key, Func<TKey, TValue> valueFactory, TimeSpan ttl)

Returns existing value if cached, otherwise calls the factory to create, cache, and return it.

TValue GetOrAdd(TKey key, TValue value, TimeSpan ttl)

Returns existing value if cached, otherwise adds the provided value and returns it.

TValue GetOrAdd<TArg>(TKey key, Func<TKey, TArg, TValue> valueFactory, TimeSpan ttl, TArg factoryArgument)

Same as GetOrAdd but accepts a factoryArgument to avoid closure allocations.

Touch(TKey key, TimeSpan ttl)

Resets the TTL for an existing (non-expired) item — sliding expiration.
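A sketch of the sliding-expiration pattern with Touch — refreshing the TTL on every successful read. The session key and value here are made-up illustrations, not part of the FastCache API:

```csharp
using System;
using Jitbit.FastCache;

var sessions = new FastCache<string, string>();
string sessionId = "abc123"; // hypothetical session key
sessions.AddOrUpdate(sessionId, "user-42", TimeSpan.FromMinutes(20));

// On each request: read the session and push its expiration 20 minutes out again.
if (sessions.TryGet(sessionId, out var user))
    sessions.Touch(sessionId, TimeSpan.FromMinutes(20));
```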

Remove(TKey key)

Removes the item with the specified key.

bool TryRemove(TKey key, out TValue value)

Removes the item and returns the removed value. Returns false if not found or expired.
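For instance, TryRemove can hand the removed value back for cleanup or logging (a small illustrative snippet):

```csharp
using System;
using Jitbit.FastCache;

var cache = new FastCache<string, int>();
cache.AddOrUpdate("answer", 42, TimeSpan.FromMinutes(1));

// Remove the entry and get the value that was stored under it.
if (cache.TryRemove("answer", out int removed))
    Console.WriteLine($"removed {removed}"); // prints "removed 42"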

EvictExpired()

Manually triggers cleanup of expired items. Rarely needed since TryGet checks TTL anyway.

Clear()

Removes all items from the cache.

Properties

| Property | Type | Description                                               |
|----------|------|-----------------------------------------------------------|
| Count    | int  | Total item count, including expired items not yet cleaned up |

Delegate

delegate void EvictionCallback(TKey key, TValue value)

Callback invoked (on thread pool) when an item is evicted from the cache.
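One plausible use is disposing values as they leave the cache. This sketch assumes only the constructor signature documented above; since the callback runs on the thread pool, it should be thread-safe:

```csharp
using System;
using System.IO;
using Jitbit.FastCache;

// A cache that disposes its values when they are evicted.
var cache = new FastCache<string, MemoryStream>(
    cleanupJobInterval: 5000,                        // run cleanup every 5 seconds
    itemEvicted: (key, stream) => stream.Dispose()); // release the evicted value
```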