Redis Implementation in .NET 10

Muhammad Rizwan
2026-03-11
27 min read

Let me paint a picture that will sound familiar if you have worked on any production .NET application. Your API is working perfectly during development. Response times are snappy, everything feels fast, and the team is happy. Then you deploy to production, real users show up, and suddenly your database is drowning under the weight of thousands of identical queries per second. Your product listing page, which seemed harmless, is hammering the database with the same SELECT statement every time someone opens the homepage. Your response times climb from 20 milliseconds to 800 milliseconds. The infrastructure team starts talking about vertical scaling and bigger database instances. The monthly bill starts climbing.

This is exactly the kind of problem Redis was designed to solve. It is not a silver bullet and it is not the right choice for every application, but when your bottleneck is repeated reads against a relational database, Redis can change everything.

In this article, we are going to build a complete Redis implementation in .NET 10. Not a toy example with a single GET and SET. We are going to build the kind of caching infrastructure that actually works in production: cache aside patterns, distributed sessions, Pub/Sub messaging, health checks, and monitoring. Everything will use real C# code that you can drop into your own projects.


Why Redis and Why Now

Before we write a single line of code, it is worth understanding what makes Redis special compared to other caching options.

Redis is an in memory data store that operates entirely in RAM. When your application reads from Redis, the data does not touch a disk. It does not go through a query optimizer. There is no parsing SQL, no building execution plans, no scanning indexes. The data is sitting in memory, addressed by a key, and Redis hands it back in sub millisecond time. We are talking about response times measured in microseconds, not milliseconds.

That speed difference is not a rounding error. In a typical .NET application hitting SQL Server or PostgreSQL, a well optimized query returns in 5 to 50 milliseconds. The same data served from Redis comes back in less than 1 millisecond. When you multiply that difference across thousands of requests per second, the cumulative effect on user experience and infrastructure cost is dramatic.

Redis also gives you data structures that go far beyond simple key value pairs. You get strings, hashes, lists, sets, sorted sets, streams, and more. This means Redis is not just a dumb cache. You can build leaderboards with sorted sets, implement rate limiters with atomic counters, run Pub/Sub messaging between microservices, and manage distributed sessions across a server farm. All from the same Redis instance.
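As a taste of those data structures, here is a minimal sketch using the StackExchange.Redis client. The connection string, the `leaderboard` key, and the player names are illustrative; in a real application the multiplexer would be the shared singleton configured later in this article.

```csharp
using StackExchange.Redis;

// Connect once (in a real app this is the shared singleton).
var redis = await ConnectionMultiplexer.ConnectAsync("localhost:6379");
var db = redis.GetDatabase();

// A leaderboard with a sorted set: the score is the player's points.
await db.SortedSetAddAsync("leaderboard", "alice", 4200);
await db.SortedSetAddAsync("leaderboard", "bob", 3100);
await db.SortedSetIncrementAsync("leaderboard", "alice", 150);

// Top 10 players, highest score first.
var top = await db.SortedSetRangeByRankWithScoresAsync(
    "leaderboard", 0, 9, Order.Descending);

// An atomic counter: INCR is race-free even under concurrent load.
var pageViews = await db.StringIncrementAsync("views:homepage");
```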

The .NET ecosystem has excellent Redis support through the StackExchange.Redis library, which is maintained by the Stack Overflow team (the same team that runs one of the highest traffic .NET applications on the planet). If Redis works for Stack Overflow at scale, it will work for your application too.

Redis Architecture Overview


Setting Up Redis in a .NET 10 Project

Let us start from the beginning and build this up properly. First, you need a Redis server running somewhere. For local development, the easiest option is Docker.


Running Redis Locally with Docker

```bash
docker run --name redis-dev -p 6379:6379 -d redis:latest
```

This gives you a Redis instance on localhost port 6379. For production, you would use a managed service like Azure Cache for Redis or Amazon ElastiCache, but for development and learning, a local Docker container is perfect.

Installing the NuGet Packages

Your .NET 10 project needs two packages. The first is the core StackExchange.Redis library. The second is the Microsoft caching bridge that integrates Redis with the built in IDistributedCache interface.

```bash
dotnet add package StackExchange.Redis
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
```

Configuring Redis in Program.cs

The connection to Redis should be configured once at application startup and shared across the entire application through dependency injection. The IConnectionMultiplexer is designed to be a long lived singleton. Creating a new connection for every request is a serious mistake that will exhaust your connection pool and bring Redis to its knees.

```csharp
using StackExchange.Redis;

var builder = WebApplication.CreateBuilder(args);

// Register the Redis connection multiplexer as a singleton
builder.Services.AddSingleton<IConnectionMultiplexer>(sp =>
{
    var configuration = ConfigurationOptions.Parse(
        builder.Configuration.GetConnectionString("Redis")!);
    configuration.AbortOnConnectFail = false;
    configuration.ConnectRetry = 3;
    configuration.ConnectTimeout = 5000;
    configuration.SyncTimeout = 5000;
    return ConnectionMultiplexer.Connect(configuration);
});

// Register the distributed cache backed by Redis
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration
        .GetConnectionString("Redis");
    options.InstanceName = "MyApp:";
});

var app = builder.Build();
```

And the corresponding configuration in appsettings.json:

```json
{
  "ConnectionStrings": {
    "Redis": "localhost:6379,abortConnect=false,connectRetry=3"
  }
}
```

A few things to notice here. The AbortOnConnectFail is set to false, which means the application will not crash if Redis is temporarily unavailable at startup. Instead, it will retry the connection in the background. The InstanceName prefix ensures that if multiple applications share the same Redis server, their keys do not collide. Every key your application writes will be automatically prefixed with "MyApp:" so you get clean namespacing for free.


Building the Cache Service Layer

Injecting IConnectionMultiplexer directly into your controllers and services is possible, but it is not a good idea. The same way you would not scatter raw SQL queries throughout your codebase, you should not scatter raw Redis commands everywhere. Let us build a proper cache service abstraction.

The Cache Service Interface

```csharp
public interface ICacheService
{
    Task<T?> GetAsync<T>(string key,
        CancellationToken cancellationToken = default);

    Task SetAsync<T>(string key, T value, TimeSpan? expiry = null,
        CancellationToken cancellationToken = default);

    Task<T> GetOrSetAsync<T>(string key, Func<Task<T>> factory,
        TimeSpan? expiry = null,
        CancellationToken cancellationToken = default);

    Task RemoveAsync(string key,
        CancellationToken cancellationToken = default);

    Task RemoveByPrefixAsync(string prefix,
        CancellationToken cancellationToken = default);

    Task<bool> ExistsAsync(string key,
        CancellationToken cancellationToken = default);
}
```

This interface gives you everything you need for day to day caching. The GetOrSetAsync method is particularly important because it implements the cache aside pattern in a single call: check the cache, return if found, otherwise execute the factory function, cache the result, and return it. This pattern eliminates an entire class of bugs where developers forget to populate the cache after a miss.

The Redis Cache Service Implementation

```csharp
using System.Text.Json;
using StackExchange.Redis;

public class RedisCacheService : ICacheService
{
    private readonly IConnectionMultiplexer _redis;
    private readonly IDatabase _database;
    private readonly JsonSerializerOptions _jsonOptions;

    public RedisCacheService(IConnectionMultiplexer redis)
    {
        _redis = redis;
        _database = redis.GetDatabase();
        _jsonOptions = new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
            WriteIndented = false
        };
    }

    public async Task<T?> GetAsync<T>(string key,
        CancellationToken cancellationToken = default)
    {
        var value = await _database.StringGetAsync(key);
        if (value.IsNullOrEmpty) return default;
        return JsonSerializer.Deserialize<T>(value!, _jsonOptions);
    }

    public async Task SetAsync<T>(string key, T value,
        TimeSpan? expiry = null,
        CancellationToken cancellationToken = default)
    {
        var serialized = JsonSerializer.Serialize(value, _jsonOptions);
        await _database.StringSetAsync(key, serialized, expiry);
    }

    public async Task<T> GetOrSetAsync<T>(string key,
        Func<Task<T>> factory, TimeSpan? expiry = null,
        CancellationToken cancellationToken = default)
    {
        var cached = await GetAsync<T>(key, cancellationToken);
        if (cached is not null) return cached;

        var value = await factory();
        await SetAsync(key, value, expiry, cancellationToken);
        return value;
    }

    public async Task RemoveAsync(string key,
        CancellationToken cancellationToken = default)
    {
        await _database.KeyDeleteAsync(key);
    }

    public async Task RemoveByPrefixAsync(string prefix,
        CancellationToken cancellationToken = default)
    {
        var endpoints = _redis.GetEndPoints();
        var server = _redis.GetServer(endpoints[0]);
        var keys = server.Keys(pattern: $"{prefix}*").ToArray();
        if (keys.Length > 0)
        {
            await _database.KeyDeleteAsync(keys);
        }
    }

    public async Task<bool> ExistsAsync(string key,
        CancellationToken cancellationToken = default)
    {
        return await _database.KeyExistsAsync(key);
    }
}
```

Register this in your DI container:

```csharp
builder.Services.AddSingleton<ICacheService, RedisCacheService>();
```

Notice that the cache service is registered as a singleton. Since IConnectionMultiplexer is thread safe and designed to be shared, the cache service can safely be shared across all requests without any concurrency issues.


Understanding Caching Strategies

Not all caching is the same. The strategy you choose has a significant impact on data consistency, performance, and complexity. Let us break down the three main approaches.

Caching Strategies Compared

Cache Aside (Lazy Loading)

This is the most common caching pattern and the one you should reach for first. The application checks the cache before querying the database. If the data is in the cache (a cache hit), it returns immediately. If the data is not in the cache (a cache miss), the application queries the database, stores the result in the cache, and then returns it.

Here is what cache aside looks like in a real service:

```csharp
public class ProductService
{
    private readonly ICacheService _cache;
    private readonly AppDbContext _dbContext;
    private static readonly TimeSpan CacheExpiry =
        TimeSpan.FromMinutes(15);

    public ProductService(ICacheService cache, AppDbContext dbContext)
    {
        _cache = cache;
        _dbContext = dbContext;
    }

    public async Task<ProductDto?> GetProductByIdAsync(int productId)
    {
        var cacheKey = $"product:{productId}";

        return await _cache.GetOrSetAsync(cacheKey, async () =>
        {
            var product = await _dbContext.Products
                .AsNoTracking()
                .Where(p => p.Id == productId)
                .Select(p => new ProductDto
                {
                    Id = p.Id,
                    Name = p.Name,
                    Price = p.Price,
                    Category = p.Category.Name
                })
                .FirstOrDefaultAsync();
            return product;
        }, CacheExpiry);
    }

    public async Task<List<ProductDto>> GetProductsByCategoryAsync(
        string category)
    {
        var cacheKey =
            $"products:category:{category.ToLowerInvariant()}";

        return await _cache.GetOrSetAsync(cacheKey, async () =>
        {
            return await _dbContext.Products
                .AsNoTracking()
                .Where(p => p.Category.Name == category)
                .Select(p => new ProductDto
                {
                    Id = p.Id,
                    Name = p.Name,
                    Price = p.Price,
                    Category = category
                })
                .ToListAsync();
        }, CacheExpiry);
    }
}
```

The beauty of this pattern is simplicity. You only cache data that is actually requested, so you do not waste memory caching records nobody ever reads. The downside is the cold start problem: the first request for any piece of data always hits the database, and there is a window where the cache data might be stale.

Write Through

In a write through pattern, every write operation updates both the cache and the database. The cache is always consistent with the database, which means reads from the cache never return stale data.

```csharp
public async Task UpdateProductAsync(int productId,
    UpdateProductRequest request)
{
    var product = await _dbContext.Products.FindAsync(productId);
    if (product is null)
        throw new NotFoundException($"Product {productId} not found");

    product.Name = request.Name;
    product.Price = request.Price;
    await _dbContext.SaveChangesAsync();

    // Update the cache immediately after the database write
    var dto = new ProductDto
    {
        Id = product.Id,
        Name = product.Name,
        Price = product.Price,
        Category = product.Category?.Name ?? string.Empty
    };
    await _cache.SetAsync($"product:{productId}", dto,
        TimeSpan.FromMinutes(15));

    // Also invalidate the category list cache since it might be stale
    await _cache.RemoveByPrefixAsync("products:category:");
}
```

Write through keeps data fresh, but it adds latency to every write operation because you are now writing to two places. It also adds complexity because you need to handle failure scenarios. What happens if the database write succeeds but the Redis write fails? You need to decide how to handle that.

Write Behind (Write Back)

Write behind flips the order. The application writes to the cache first and returns immediately. A separate background process then persists the data to the database asynchronously. This gives you the lowest write latency, but it comes with the risk of data loss if Redis fails before the data is flushed to the database.

Write behind is best suited for scenarios where temporary data loss is acceptable, such as analytics counters, view counts, or non critical metrics. For business critical data like orders or payments, you should stick with cache aside or write through.
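To make the pattern concrete, here is a sketch of write behind for view counts: the hot path only increments a Redis counter, and a background service periodically flushes the total to the database. `IViewCountStore`, its `AddViewsAsync` method, the product ID, and the 30 second interval are all hypothetical names and choices for illustration, not part of any library.

```csharp
// Write behind sketch: the hot path writes only to Redis; this
// background service drains the pending count into the database.
public class ViewCountFlusher : BackgroundService
{
    private readonly IConnectionMultiplexer _redis;
    private readonly IServiceProvider _services;

    public ViewCountFlusher(IConnectionMultiplexer redis,
        IServiceProvider services)
    {
        _redis = redis;
        _services = services;
    }

    protected override async Task ExecuteAsync(
        CancellationToken stoppingToken)
    {
        var db = _redis.GetDatabase();
        while (!stoppingToken.IsCancellationRequested)
        {
            await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);

            // Atomically read and reset the pending counter (GETSET),
            // then persist whatever accumulated since the last flush.
            var pending = await db.StringGetSetAsync(
                "views:product:42", 0);
            if (pending.TryParse(out long count) && count > 0)
            {
                using var scope = _services.CreateScope();
                var store = scope.ServiceProvider
                    .GetRequiredService<IViewCountStore>();
                await store.AddViewsAsync(productId: 42, count);
            }
        }
    }
}
```

If the process dies between the GETSET and the database write, those views are lost, which is exactly the trade-off described above.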


Project Structure

Before we go deeper, let us look at how all these pieces fit together in a well organized solution.

Project Folder Structure

The key principle here is that the cache service lives in the Infrastructure layer, not in the API layer. Your controllers and services depend on the ICacheService interface, which means you can swap Redis for an in memory cache during testing without changing any business logic.
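That swap is straightforward because ICacheService is small. Here is a minimal in memory stand-in suitable for unit tests, so business logic can run without a Redis server. It is a sketch only: it ignores expiry and cancellation tokens and is not meant for production.

```csharp
using System.Collections.Concurrent;

// Test double for ICacheService backed by a thread safe dictionary.
public class InMemoryCacheService : ICacheService
{
    private readonly ConcurrentDictionary<string, object?> _store = new();

    public Task<T?> GetAsync<T>(string key,
        CancellationToken cancellationToken = default)
        => Task.FromResult(_store.TryGetValue(key, out var v)
            ? (T?)v : default);

    public Task SetAsync<T>(string key, T value, TimeSpan? expiry = null,
        CancellationToken cancellationToken = default)
    {
        _store[key] = value;
        return Task.CompletedTask;
    }

    public async Task<T> GetOrSetAsync<T>(string key,
        Func<Task<T>> factory, TimeSpan? expiry = null,
        CancellationToken cancellationToken = default)
    {
        var cached = await GetAsync<T>(key, cancellationToken);
        if (cached is not null) return cached;
        var value = await factory();
        await SetAsync(key, value, expiry, cancellationToken);
        return value;
    }

    public Task RemoveAsync(string key,
        CancellationToken cancellationToken = default)
    {
        _store.TryRemove(key, out _);
        return Task.CompletedTask;
    }

    public Task RemoveByPrefixAsync(string prefix,
        CancellationToken cancellationToken = default)
    {
        foreach (var k in _store.Keys.Where(k => k.StartsWith(prefix)))
            _store.TryRemove(k, out _);
        return Task.CompletedTask;
    }

    public Task<bool> ExistsAsync(string key,
        CancellationToken cancellationToken = default)
        => Task.FromResult(_store.ContainsKey(key));
}
```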


Distributed Session Management

One of the most underrated use cases for Redis in .NET applications is distributed session management. If you are running multiple instances of your application behind a load balancer, in memory sessions are a recipe for disaster.

Session Management with Redis

Here is the problem. A user hits Server A and logs in. The session is stored in Server A's memory. On the next request, the load balancer sends the user to Server B. Server B has no idea who this user is because the session lives on Server A. The user sees a login page again and has a terrible experience.

Redis solves this elegantly. All servers in your farm read and write sessions from the same Redis instance. It does not matter which server handles the request because the session data is always available.

Configuring Redis Sessions in .NET 10

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration
        .GetConnectionString("Redis");
    options.InstanceName = "MyApp:Sessions:";
});

builder.Services.AddSession(options =>
{
    options.IdleTimeout = TimeSpan.FromMinutes(30);
    options.Cookie.HttpOnly = true;
    options.Cookie.IsEssential = true;
    options.Cookie.SecurePolicy = CookieSecurePolicy.Always;
    options.Cookie.SameSite = SameSiteMode.Strict;
});

var app = builder.Build();
app.UseSession();
```

Using Sessions in Your Application

```csharp
public class CartController : ControllerBase
{
    [HttpPost("add")]
    public IActionResult AddToCart([FromBody] CartItemRequest request)
    {
        var cart = HttpContext.Session.GetString("cart");
        var cartItems = cart is not null
            ? JsonSerializer.Deserialize<List<CartItem>>(cart)
            : new List<CartItem>();

        cartItems!.Add(new CartItem
        {
            ProductId = request.ProductId,
            Quantity = request.Quantity
        });

        HttpContext.Session.SetString("cart",
            JsonSerializer.Serialize(cartItems));

        return Ok(new { ItemCount = cartItems.Count });
    }

    [HttpGet]
    public IActionResult GetCart()
    {
        var cart = HttpContext.Session.GetString("cart");
        if (cart is null)
            return Ok(new { Items = Array.Empty<CartItem>() });

        var cartItems = JsonSerializer.Deserialize<List<CartItem>>(cart);
        return Ok(new { Items = cartItems });
    }
}
```

The session middleware automatically handles serialization and deserialization with Redis. The session ID is stored in a cookie, and the actual session data lives in Redis. This means your web servers stay completely stateless, which is exactly what you want for horizontal scaling.


Pub/Sub Messaging with Redis

Redis Pub/Sub is a messaging system where publishers send messages to channels without knowing who is listening, and subscribers listen to channels without knowing who is publishing. It is not as robust as RabbitMQ or Azure Service Bus for guaranteed delivery, but for fire and forget notifications between microservices, it is incredibly fast and remarkably simple to implement.

Redis Pub/Sub Messaging

Publishing Messages

```csharp
public interface IMessagePublisher
{
    Task PublishAsync<T>(string channel, T message,
        CancellationToken cancellationToken = default);
}

public class RedisMessagePublisher : IMessagePublisher
{
    private readonly IConnectionMultiplexer _redis;
    private readonly JsonSerializerOptions _jsonOptions;

    public RedisMessagePublisher(IConnectionMultiplexer redis)
    {
        _redis = redis;
        _jsonOptions = new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase
        };
    }

    public async Task PublishAsync<T>(string channel, T message,
        CancellationToken cancellationToken = default)
    {
        var subscriber = _redis.GetSubscriber();
        var serialized = JsonSerializer.Serialize(message, _jsonOptions);
        await subscriber.PublishAsync(
            RedisChannel.Literal(channel), serialized);
    }
}
```

Subscribing to Messages with a Background Service

```csharp
public class OrderNotificationSubscriber : BackgroundService
{
    private readonly IConnectionMultiplexer _redis;
    private readonly IServiceProvider _serviceProvider;
    private readonly ILogger<OrderNotificationSubscriber> _logger;

    public OrderNotificationSubscriber(
        IConnectionMultiplexer redis,
        IServiceProvider serviceProvider,
        ILogger<OrderNotificationSubscriber> logger)
    {
        _redis = redis;
        _serviceProvider = serviceProvider;
        _logger = logger;
    }

    protected override async Task ExecuteAsync(
        CancellationToken stoppingToken)
    {
        var subscriber = _redis.GetSubscriber();

        await subscriber.SubscribeAsync(
            RedisChannel.Literal("order.created"),
            async (channel, message) =>
            {
                try
                {
                    var order = JsonSerializer
                        .Deserialize<OrderCreatedEvent>(message!);

                    using var scope = _serviceProvider.CreateScope();
                    var notificationService = scope.ServiceProvider
                        .GetRequiredService<INotificationService>();

                    await notificationService
                        .SendOrderConfirmationAsync(order!);
                }
                catch (Exception ex)
                {
                    _logger.LogError(ex,
                        "Failed to process order notification");
                }
            });

        // Keep the service running until cancellation
        await Task.Delay(Timeout.Infinite, stoppingToken);
    }
}
```

Using Pub/Sub in Practice

```csharp
public class OrderService
{
    private readonly AppDbContext _dbContext;
    private readonly IMessagePublisher _publisher;
    private readonly ICacheService _cache;

    public OrderService(AppDbContext dbContext,
        IMessagePublisher publisher, ICacheService cache)
    {
        _dbContext = dbContext;
        _publisher = publisher;
        _cache = cache;
    }

    public async Task<OrderDto> CreateOrderAsync(
        CreateOrderRequest request)
    {
        var order = new Order
        {
            CustomerId = request.CustomerId,
            Items = request.Items.Select(i => new OrderItem
            {
                ProductId = i.ProductId,
                Quantity = i.Quantity,
                UnitPrice = i.UnitPrice
            }).ToList(),
            TotalAmount = request.Items
                .Sum(i => i.Quantity * i.UnitPrice),
            CreatedAt = DateTime.UtcNow
        };

        _dbContext.Orders.Add(order);
        await _dbContext.SaveChangesAsync();

        // Publish the event for other services to consume
        await _publisher.PublishAsync("order.created",
            new OrderCreatedEvent
            {
                OrderId = order.Id,
                CustomerId = order.CustomerId,
                TotalAmount = order.TotalAmount,
                CreatedAt = order.CreatedAt
            });

        // Invalidate the orders cache
        await _cache.RemoveByPrefixAsync(
            $"orders:customer:{request.CustomerId}");

        return MapToDto(order);
    }
}
```

Register the publisher and subscriber:

```csharp
builder.Services.AddSingleton<IMessagePublisher, RedisMessagePublisher>();
builder.Services.AddHostedService<OrderNotificationSubscriber>();
```

One important thing about Redis Pub/Sub that you need to understand: messages are fire and forget. If a subscriber is offline when a message is published, that message is lost. If you need guaranteed delivery, use Redis Streams or a dedicated message broker like RabbitMQ. But for real time notifications, cache invalidation signals, and lightweight event broadcasting, Pub/Sub is perfect.
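For comparison, here is a brief sketch of the Streams alternative. Unlike Pub/Sub, stream entries are persisted, so a consumer that was offline can resume from the last ID it processed. The stream name, field names, and values below are illustrative, and `redis` is assumed to be the shared IConnectionMultiplexer.

```csharp
var db = redis.GetDatabase();

// Producer: append an entry to the stream. Entries persist in Redis
// until explicitly trimmed, unlike Pub/Sub messages.
await db.StreamAddAsync("orders:stream", new NameValueEntry[]
{
    new("orderId", "1001"),
    new("totalAmount", "59.99")
});

// Consumer: read everything after a known position.
// "0-0" means from the beginning on the first run; afterwards the
// consumer would pass the last entry.Id it successfully processed.
var entries = await db.StreamReadAsync("orders:stream", "0-0");
foreach (var entry in entries)
{
    foreach (var field in entry.Values)
        Console.WriteLine($"{field.Name} = {field.Value}");
}
```

For multiple competing consumers, Streams also support consumer groups with explicit acknowledgements, which is what you would reach for when delivery guarantees actually matter.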



Rate Limiting with Redis

Rate limiting is another area where Redis shines because of its atomic operations. The INCR command increments a value and returns the new count in a single atomic operation, which means there are no race conditions even under heavy concurrent load.
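The simplest form of that idea is a fixed window counter built directly on INCR. This sketch (the key format and parameter names are illustrative) allows up to maxRequests per window and relies on INCR's atomicity to stay correct under concurrency:

```csharp
// Fixed window rate limit: the first INCR in each window creates the
// key, the expiry ends the window. No race conditions because INCR
// is atomic on the server.
public async Task<bool> IsAllowedAsync(IDatabase database,
    string clientId, int maxRequests, TimeSpan window)
{
    var key = $"ratelimit:fixed:{clientId}";
    var count = await database.StringIncrementAsync(key);

    // Only set the expiry when the key is first created.
    if (count == 1)
        await database.KeyExpireAsync(key, window);

    return count <= maxRequests;
}
```

The weakness of a fixed window is the boundary: a client can burst at the end of one window and the start of the next. The sliding window limiter below fixes that.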

Implementing a Sliding Window Rate Limiter

```csharp
public class RedisRateLimiter
{
    private readonly IConnectionMultiplexer _redis;

    public RedisRateLimiter(IConnectionMultiplexer redis)
    {
        _redis = redis;
    }

    public async Task<RateLimitResult> CheckRateLimitAsync(
        string clientId, int maxRequests, TimeSpan window)
    {
        var database = _redis.GetDatabase();
        var key = $"ratelimit:{clientId}";
        var now = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
        var windowStart = now - (long)window.TotalMilliseconds;

        var transaction = database.CreateTransaction();

        // Remove entries outside the window
        _ = transaction.SortedSetRemoveRangeByScoreAsync(
            key, 0, windowStart);

        // Add the current request
        _ = transaction.SortedSetAddAsync(key, now.ToString(), now);

        // Count requests in the window
        var countTask = transaction.SortedSetLengthAsync(key);

        // Set expiry on the key
        _ = transaction.KeyExpireAsync(key, window);

        await transaction.ExecuteAsync();
        var requestCount = await countTask;

        return new RateLimitResult
        {
            IsAllowed = requestCount <= maxRequests,
            CurrentCount = (int)requestCount,
            MaxRequests = maxRequests,
            RetryAfter = requestCount > maxRequests ? window : null
        };
    }
}

public record RateLimitResult
{
    public bool IsAllowed { get; init; }
    public int CurrentCount { get; init; }
    public int MaxRequests { get; init; }
    public TimeSpan? RetryAfter { get; init; }
}
```

Rate Limiting Middleware

```csharp
public class RateLimitingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly RedisRateLimiter _rateLimiter;

    public RateLimitingMiddleware(RequestDelegate next,
        RedisRateLimiter rateLimiter)
    {
        _next = next;
        _rateLimiter = rateLimiter;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        var clientId = context.Connection.RemoteIpAddress?.ToString()
            ?? "unknown";

        var result = await _rateLimiter.CheckRateLimitAsync(
            clientId, maxRequests: 100,
            window: TimeSpan.FromMinutes(1));

        context.Response.Headers["X-RateLimit-Limit"] =
            result.MaxRequests.ToString();
        context.Response.Headers["X-RateLimit-Remaining"] =
            Math.Max(0, result.MaxRequests - result.CurrentCount)
                .ToString();

        if (!result.IsAllowed)
        {
            context.Response.StatusCode =
                StatusCodes.Status429TooManyRequests;

            if (result.RetryAfter.HasValue)
            {
                context.Response.Headers["Retry-After"] =
                    ((int)result.RetryAfter.Value.TotalSeconds)
                        .ToString();
            }

            await context.Response.WriteAsJsonAsync(new
            {
                Error = "Too many requests. Please try again later."
            });
            return;
        }

        await _next(context);
    }
}
```

This sliding window approach is more accurate than a simple fixed window counter because it smooths out burst traffic at the window boundaries. The sorted set stores each request with a timestamp as the score, and before counting, we remove any entries that fall outside the current window.


Performance: Before and After

Numbers tell the story better than words. Here is what a typical mid sized .NET application sees after implementing Redis caching properly.

Performance Before and After Redis

These numbers come from real world observations across multiple production systems. The exact improvements will vary depending on your data access patterns, cache hit ratio, and workload characteristics, but the magnitude of improvement is consistently significant.

The key insight is that caching does not just make your application faster. It also reduces the load on your database server, which means you can defer expensive database scaling decisions. Many teams find that adding a Redis cache layer lets them serve 10 to 30 times more traffic on the same database hardware they already have.


Health Checks and Monitoring

Running Redis in production without proper health checks and monitoring is like driving a car without a dashboard. Everything seems fine until it suddenly is not, and you have no warning.

Redis Health Check and Monitoring

Implementing Redis Health Checks

.NET 10 has excellent built in support for health checks. Let us add a Redis health check that goes beyond a simple ping.

```csharp
using Microsoft.Extensions.Diagnostics.HealthChecks;
using StackExchange.Redis;

public class RedisHealthCheck : IHealthCheck
{
    private readonly IConnectionMultiplexer _redis;

    public RedisHealthCheck(IConnectionMultiplexer redis)
    {
        _redis = redis;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context,
        CancellationToken cancellationToken = default)
    {
        try
        {
            var database = _redis.GetDatabase();

            // Measure latency with a PING
            var stopwatch = System.Diagnostics.Stopwatch.StartNew();
            await database.PingAsync();
            stopwatch.Stop();
            var latency = stopwatch.ElapsedMilliseconds;

            // Check server info for memory usage
            var server = _redis.GetServer(
                _redis.GetEndPoints()[0]);
            var info = await server.InfoAsync("memory");
            var memorySection = info.FirstOrDefault();

            var data = new Dictionary<string, object>
            {
                { "latency_ms", latency },
                { "connected_clients",
                    _redis.GetCounters().TotalOutstanding },
                { "is_connected", _redis.IsConnected }
            };

            if (memorySection is not null)
            {
                foreach (var pair in memorySection)
                {
                    data[pair.Key] = pair.Value;
                }
            }

            if (!_redis.IsConnected)
            {
                return HealthCheckResult.Unhealthy(
                    "Redis connection is not established", null, data);
            }

            if (latency > 200)
            {
                return HealthCheckResult.Unhealthy(
                    $"Redis latency is {latency}ms (threshold: 200ms)",
                    null, data);
            }

            if (latency > 50)
            {
                return HealthCheckResult.Degraded(
                    $"Redis latency is {latency}ms (threshold: 50ms)",
                    null, data);
            }

            return HealthCheckResult.Healthy(
                $"Redis is healthy. Latency: {latency}ms", data);
        }
        catch (Exception ex)
        {
            return HealthCheckResult.Unhealthy(
                "Redis health check failed", ex);
        }
    }
}
```

Registering Health Checks

```csharp
builder.Services.AddHealthChecks()
    .AddCheck<RedisHealthCheck>(
        "redis",
        failureStatus: HealthStatus.Unhealthy,
        tags: new[] { "infrastructure", "cache" });

var app = builder.Build();

app.MapHealthChecks("/health", new HealthCheckOptions
{
    ResponseWriter = async (context, report) =>
    {
        context.Response.ContentType = "application/json";
        var result = JsonSerializer.Serialize(new
        {
            Status = report.Status.ToString(),
            Duration = report.TotalDuration.TotalMilliseconds,
            Checks = report.Entries.Select(e => new
            {
                Name = e.Key,
                Status = e.Value.Status.ToString(),
                Description = e.Value.Description,
                Duration = e.Value.Duration.TotalMilliseconds,
                Data = e.Value.Data
            })
        });
        await context.Response.WriteAsync(result);
    }
});
```

Cache Hit Ratio Tracking

The single most important metric for your caching layer is the cache hit ratio. If your cache is not being hit, it is not helping and you are just adding complexity. Let us build a simple tracker.

```csharp
public class CacheMetrics
{
    private long _hits;
    private long _misses;

    public void RecordHit() => Interlocked.Increment(ref _hits);
    public void RecordMiss() => Interlocked.Increment(ref _misses);

    public double HitRatio
    {
        get
        {
            var total = _hits + _misses;
            return total == 0 ? 0 : (double)_hits / total * 100;
        }
    }

    public long Hits => _hits;
    public long Misses => _misses;
}
```

Then integrate it into your cache service:

```csharp
public async Task<T> GetOrSetAsync<T>(string key, Func<Task<T>> factory,
    TimeSpan? expiry = null,
    CancellationToken cancellationToken = default)
{
    var cached = await GetAsync<T>(key, cancellationToken);
    if (cached is not null)
    {
        _metrics.RecordHit();
        return cached;
    }

    _metrics.RecordMiss();
    var value = await factory();
    await SetAsync(key, value, expiry, cancellationToken);
    return value;
}
```

Cache Key Management

One of the most overlooked aspects of Redis caching is key management. Bad key naming leads to key collisions, difficulty debugging, and stale data that is impossible to invalidate cleanly.

Building a Cache Key Generator

```csharp
public static class CacheKeys
{
    private const string Prefix = "myapp";

    public static string Product(int id) =>
        $"{Prefix}:product:{id}";

    public static string ProductsByCategory(string category) =>
        $"{Prefix}:products:category:{category.ToLowerInvariant()}";

    public static string UserProfile(string userId) =>
        $"{Prefix}:user:profile:{userId}";

    public static string UserOrders(string userId, int page) =>
        $"{Prefix}:user:{userId}:orders:page:{page}";

    public static string RateLimit(string clientId) =>
        $"{Prefix}:ratelimit:{clientId}";

    public static string Session(string sessionId) =>
        $"{Prefix}:session:{sessionId}";
}
```

There are three rules for good cache keys. First, they should be predictable. Given the same inputs, the key should always be the same. Second, they should be namespaced to prevent collisions between different data types. Third, they should include enough structure that you can invalidate groups of related keys using prefix matching.

Notice how using products:category: as a prefix lets you invalidate all category caches at once with a single RemoveByPrefixAsync call. This is hugely valuable when someone updates a product and you need to refresh all the category listings that might include it.


Common Mistakes and How to Avoid Them

After years of working with Redis in .NET applications, these are the mistakes I see most frequently.

Mistake 1: Creating New Connections Per Request

This is the most common and most damaging mistake. The IConnectionMultiplexer is designed to be a singleton. It internally manages a pool of connections and handles reconnection automatically. If you create a new ConnectionMultiplexer for every request, you will exhaust Redis connection limits within minutes under load.

```csharp
// WRONG: Do not do this
public class BadCacheService
{
    public async Task<string?> GetAsync(string key)
    {
        // This creates a new connection every single time
        using var redis = await ConnectionMultiplexer
            .ConnectAsync("localhost:6379");
        var db = redis.GetDatabase();
        return await db.StringGetAsync(key);
    }
}

// CORRECT: Inject the singleton multiplexer
public class GoodCacheService
{
    private readonly IDatabase _database;

    public GoodCacheService(IConnectionMultiplexer redis)
    {
        _database = redis.GetDatabase();
    }

    public async Task<string?> GetAsync(string key)
    {
        return await _database.StringGetAsync(key);
    }
}
```

Mistake 2: Caching Without Expiration

If you set cache entries without an expiration time, they will live in Redis forever until you explicitly delete them or Redis runs out of memory. This leads to stale data that never refreshes and steadily growing memory usage.

Always set an expiration. Even if you think the data is permanent, set a long expiration like 24 hours. This gives you a safety net against stale data.

```csharp
// WRONG: No expiration
await _cache.SetAsync("product:123", product);

// CORRECT: Always set an expiration
await _cache.SetAsync("product:123", product,
    TimeSpan.FromMinutes(15));
```

Mistake 3: Not Handling Redis Failures Gracefully

Redis is an external dependency and it can fail. Your application should treat Redis as a performance optimization, not a hard requirement. If Redis is down, the application should fall back to the database, not crash.

```csharp
public async Task<T?> GetAsync<T>(string key, CancellationToken cancellationToken = default)
{
    try
    {
        var value = await _database.StringGetAsync(key);
        if (value.IsNullOrEmpty)
            return default;

        return JsonSerializer.Deserialize<T>(value!, _jsonOptions);
    }
    catch (RedisConnectionException ex)
    {
        _logger.LogWarning(ex,
            "Redis connection failed for key {Key}. Falling back to database", key);
        return default;
    }
}
```

Mistake 4: Storing Large Objects in Redis

Redis stores everything in memory, and memory is expensive. Storing massive serialized objects in Redis negates its speed advantage because serialization and network transfer time dominate the response. Keep your cached values lean. Cache the specific data you need, not entire entity graphs with all their navigation properties.
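To make the difference concrete, here is a self-contained sketch. The `Product`, `Review`, `Category`, and `ProductSummaryDto` types are hypothetical stand-ins, not the article's earlier models; the point is simply how much smaller a projected DTO serializes than a full entity graph with navigation properties.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;

// A hypothetical entity graph, as an ORM might load it with navigation
// properties populated: the product drags 200 reviews along with it.
var product = new Product
{
    Id = 123,
    Name = "Mechanical Keyboard",
    Price = 89.99m,
    Category = new Category { Id = 7, Name = "Peripherals" },
    Reviews = Enumerable.Range(1, 200)
        .Select(i => new Review { Id = i, Body = new string('x', 500) })
        .ToList()
};

// WRONG: caching the whole graph pushes every review into Redis
var fullPayload = JsonSerializer.Serialize(product);

// CORRECT: project to a lean DTO with only the fields the page needs
var summary = new ProductSummaryDto(product.Id, product.Name, product.Price);
var leanPayload = JsonSerializer.Serialize(summary);

Console.WriteLine($"Full graph JSON: {fullPayload.Length:N0} bytes");
Console.WriteLine($"Lean DTO JSON:   {leanPayload.Length:N0} bytes");

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public decimal Price { get; set; }
    public Category? Category { get; set; }
    public List<Review> Reviews { get; set; } = new();
}

public class Category
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}

public class Review
{
    public int Id { get; set; }
    public string Body { get; set; } = "";
}

public record ProductSummaryDto(int Id, string Name, decimal Price);
```

The full graph serializes to over 100 KB; the DTO is well under 100 bytes. Every byte you cache is paid for three times: in Redis memory, on the network, and in serialization CPU on every hit.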

Mistake 5: Ignoring the Thundering Herd Problem

When a popular cache entry expires, hundreds of concurrent requests simultaneously discover the cache miss and all hit the database at once. This is called the thundering herd problem. The fix is to use a distributed lock so that only one request fetches from the database while others wait.

```csharp
public async Task<T> GetOrSetWithLockAsync<T>(string key, Func<Task<T>> factory, TimeSpan expiry)
{
    var cached = await GetAsync<T>(key);
    if (cached is not null)
        return cached;

    var lockKey = $"lock:{key}";
    var lockValue = Guid.NewGuid().ToString();

    // Try to acquire a distributed lock
    var acquired = await _database.StringSetAsync(
        lockKey, lockValue, TimeSpan.FromSeconds(10), When.NotExists);

    if (acquired)
    {
        try
        {
            // Double check after acquiring the lock
            cached = await GetAsync<T>(key);
            if (cached is not null)
                return cached;

            var value = await factory();
            await SetAsync(key, value, expiry);
            return value;
        }
        finally
        {
            // Only release if we still hold the lock
            var script = @"
                if redis.call('get', KEYS[1]) == ARGV[1] then
                    return redis.call('del', KEYS[1])
                else
                    return 0
                end";
            await _database.ScriptEvaluateAsync(script,
                new RedisKey[] { lockKey },
                new RedisValue[] { lockValue });
        }
    }

    // Another thread is fetching, wait briefly and retry
    await Task.Delay(100);
    return await GetOrSetAsync(key, factory, expiry);
}
```

When to Use Redis and When to Avoid It

Redis is not the answer to every performance problem. Be honest with yourself about whether your application actually needs it.

The honest answer is: if you are building a single server application with low to moderate traffic and your database queries are properly indexed and performing well, you probably do not need Redis. Adding Redis introduces operational complexity, another service to monitor, another point of failure, and another thing for your team to understand. That complexity is only worth it when you are actually hitting a scaling wall.

But if you are running multiple server instances, serving thousands of requests per second, or building real time features, Redis is one of the best tools in the .NET ecosystem. It is battle tested at enormous scale. Stack Overflow serves millions of requests per day with Redis. Twitter uses Redis for its timeline cache. GitHub uses Redis for caching and background job processing.


Production Configuration Tips

Before you deploy Redis to production, there are several configuration settings that deserve attention.

Connection String Best Practices

```json
{
  "ConnectionStrings": {
    "Redis": "your-redis-host:6380,password=your-secure-password,ssl=True,abortConnect=false,connectRetry=3,connectTimeout=5000,syncTimeout=5000,asyncTimeout=5000,defaultDatabase=0"
  }
}
```

For Azure Cache for Redis, always use SSL (port 6380) in production. The abortConnect=false setting is critical because it prevents the application from crashing if Redis is temporarily unreachable during startup. The retry and timeout settings give your application resilience against brief network interruptions.
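As a sketch of how that connection string might be wired up at startup, here is one way to register the singleton multiplexer in an ASP.NET Core app. It assumes the StackExchange.Redis package and the "Redis" connection string shown above; `ConfigurationOptions.Parse` applies all the comma-separated settings, and forcing `AbortOnConnectFail` to false in code guards against someone dropping the flag from configuration later.

```csharp
using StackExchange.Redis;

var builder = WebApplication.CreateBuilder(args);

// Register a single ConnectionMultiplexer for the whole application.
builder.Services.AddSingleton<IConnectionMultiplexer>(_ =>
{
    // Parse the connection string so ssl, retry, and timeout
    // settings from configuration are all applied.
    var options = ConfigurationOptions.Parse(
        builder.Configuration.GetConnectionString("Redis")!);

    // Belt and braces: never crash at startup if Redis is unreachable,
    // even if the flag is missing from the connection string.
    options.AbortOnConnectFail = false;

    return ConnectionMultiplexer.Connect(options);
});

var app = builder.Build();
app.Run();
```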

Connection Multiplexer Events

Monitor connection events to get early warning of Redis issues:

```csharp
var multiplexer = ConnectionMultiplexer.Connect(configuration);

multiplexer.ConnectionFailed += (sender, args) =>
{
    logger.LogError("Redis connection failed: {FailureType} - {Exception}",
        args.FailureType, args.Exception?.Message);
};

multiplexer.ConnectionRestored += (sender, args) =>
{
    logger.LogInformation("Redis connection restored: {EndPoint}", args.EndPoint);
};

multiplexer.ErrorMessage += (sender, args) =>
{
    logger.LogWarning("Redis error: {Message}", args.Message);
};
```

Testing Your Redis Implementation

Testing cache behavior is critical because caching bugs are some of the hardest to diagnose in production. Fortunately, you can test your cache service without a running Redis instance by mocking the interface.

```csharp
public class ProductServiceTests
{
    private readonly Mock<ICacheService> _cacheMock;
    private readonly Mock<AppDbContext> _dbContextMock;
    private readonly ProductService _service;

    public ProductServiceTests()
    {
        _cacheMock = new Mock<ICacheService>();
        _dbContextMock = new Mock<AppDbContext>();
        _service = new ProductService(_cacheMock.Object, _dbContextMock.Object);
    }

    [Fact]
    public async Task GetProductById_WhenCached_ReturnsFromCache()
    {
        var expected = new ProductDto { Id = 1, Name = "Test Product", Price = 29.99m };

        _cacheMock
            .Setup(c => c.GetOrSetAsync(
                "product:1",
                It.IsAny<Func<Task<ProductDto?>>>(),
                It.IsAny<TimeSpan?>(),
                It.IsAny<CancellationToken>()))
            .ReturnsAsync(expected);

        var result = await _service.GetProductByIdAsync(1);

        Assert.Equal(expected.Name, result!.Name);
        Assert.Equal(expected.Price, result.Price);
    }
}
```

For integration tests, you can use Testcontainers to spin up a real Redis instance in Docker:

```csharp
public class RedisIntegrationTests : IAsyncLifetime
{
    private readonly RedisContainer _container = new RedisBuilder().Build();

    public async Task InitializeAsync()
    {
        await _container.StartAsync();
    }

    public async Task DisposeAsync()
    {
        await _container.DisposeAsync();
    }

    [Fact]
    public async Task SetAndGet_RoundTrip_ReturnsOriginalValue()
    {
        var redis = await ConnectionMultiplexer
            .ConnectAsync(_container.GetConnectionString());
        var cache = new RedisCacheService(redis);

        var product = new ProductDto { Id = 1, Name = "Test", Price = 10.00m };
        await cache.SetAsync("test:product:1", product, TimeSpan.FromMinutes(5));

        var result = await cache.GetAsync<ProductDto>("test:product:1");

        Assert.NotNull(result);
        Assert.Equal("Test", result!.Name);
    }
}
```


Wrapping Up

Redis is one of those technologies that feels almost magical the first time you see it in action. A query that took 200 milliseconds against the database now takes less than 1 millisecond from Redis. But the magic is not in the speed alone. It is in the architectural possibilities that speed unlocks: distributed sessions that just work, real time messaging between services, rate limiting that handles thousands of concurrent users, and health monitoring that keeps your operations team informed.

The implementation we built in this article is production ready. The cache service abstraction, the caching strategies, the Pub/Sub messaging, the rate limiter, and the health checks are all patterns that work in real applications serving real users. The code is not theoretical. It is the same kind of code running in production systems today.

If you are just getting started with Redis, my recommendation is to start small. Pick one read heavy endpoint in your application, add a cache aside pattern with a 10 to 15 minute expiration, and measure the impact. You will see immediate improvement, and that success will give you the confidence and the business justification to expand Redis usage to other areas of your application.

If you found this article helpful, you might also want to read about Clean Architecture in .NET for structuring your overall application, or the Repository Pattern in .NET 10 for organizing your data access layer. Combining Redis caching with clean architecture and proper data access patterns gives your application a solid foundation that scales.


Further Reading and References

Here are the resources I referred to and recommend for diving deeper into Redis and .NET integration:

  1. Redis Official Documentation for comprehensive coverage of all Redis commands, data types, and configuration options.

  2. StackExchange.Redis GitHub Repository for the official .NET client documentation, configuration reference, and troubleshooting guides.

  3. Microsoft Learn: Distributed Caching in ASP.NET Core for understanding how the IDistributedCache abstraction works in the .NET ecosystem.

  4. Microsoft Learn: Azure Cache for Redis for production deployment guidance on Azure.

  5. Redis University for free courses on Redis data modeling, performance tuning, and advanced features like Redis Streams.

  6. ASP.NET Core Health Checks for the official documentation on implementing and customizing health check endpoints.

  7. Testcontainers for .NET for setting up real Redis instances in your integration test suite.


About the Author

Muhammad Rizwan

Software Engineer · .NET & Cloud Developer

A passionate software developer with expertise in .NET Core, C#, JavaScript, TypeScript, React and Azure. Loves building scalable web applications and sharing practical knowledge with the developer community.



