Redis Cache with .Net Part 1: Introduction To Redis Cache with .Net

Rashik Hasnat
4 min read · Apr 26, 2023

Redis Cache

Redis Cache is an open-source, in-memory data structure store that is used as a database, cache, or message broker. It is built with a key-value data model and can store data in various data structures such as strings, hashes, sets, sorted sets with range queries, bitmaps, and geospatial indexes with radius queries. Redis Cache is highly scalable and can be used to support a wide range of applications, from simple key-value stores to high-performance real-time applications.

Content of this blog

In this blog, I'll cover:

  1. Installing Redis cache on your local machine
  2. Connecting to a Redis cache server from a .Net 6 web API
  3. Basic operations using Redis cache
  4. Using Redis cache as an implementation of IDistributedCache

Code

The code can be found in my repository RedisCacheUsage.

Installing Redis cache locally

Windows

On a Windows machine, you can either download and install Redis directly by following this blog, or install it via WSL by following this blog.

Ubuntu

You can follow this blog to install Redis in Ubuntu.

Diving into the code

Enough with the theoretical talks. Let’s dive into the code now!

Bootstrapping the .Net Web API

We'll be using VS Code for this demo, but you can use any IDE of your choice. To create the boilerplate code, open an empty folder in VS Code and execute the command below in the PowerShell terminal:

dotnet new webapi --no-https --framework net6.0 -o RedisCacheUsage

The boilerplate code will be generated.

Boilerplate code

Installing the Redis cache package

The most commonly used package for communicating with a Redis cache server in .NET is StackExchange.Redis. Install it as a NuGet package in your startup project.
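The install step above can be done from the CLI as well (assuming the project folder name from the earlier `dotnet new` command):

```shell
cd RedisCacheUsage
dotnet add package StackExchange.Redis
```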

Connecting with Redis Cache

We'll use a connection multiplexer to connect to the Redis server. We'll add the Redis connection string to appsettings.json. The file will look something like this:

{
  "AllowedHosts": "*",
  "RedisConnectionString": "localhost:6379"
}

In the startup file, we’ll connect with the Redis cache. In the code below, we’re reading the Redis connection string from appsettings.json and using the connection multiplexer to communicate with the Redis cache.
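The original snippet isn't reproduced here, but a minimal sketch of the startup wiring might look like the following. The method name `RegisterRedisCache` matches the one referenced later in this post, and the configuration key matches appsettings.json; the exact shape of the method is an assumption.

```csharp
// Program.cs
using StackExchange.Redis;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

RegisterRedisCache(builder);

var app = builder.Build();
app.UseSwagger();
app.UseSwaggerUI();
app.MapControllers();
app.Run();

// Reads the Redis connection string from appsettings.json and
// connects once at application startup.
static void RegisterRedisCache(WebApplicationBuilder builder)
{
    var connectionString = builder.Configuration.GetValue<string>("RedisConnectionString");
    var multiplexer = ConnectionMultiplexer.Connect(connectionString);
}
```

Connecting eagerly at startup means a misconfigured connection string fails fast rather than on the first request.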

Checking the Creation of Clients

Now if you run the API, the connection should be made on startup. You can verify this by running the PowerShell command redis-cli client list before and after starting your API. After the API starts, you'll notice two new clients created by the connection multiplexer. I won't go into detail here about why each connection multiplexer creates two clients; you can read the answers to this StackOverflow question if you're interested in the reason and in how to avoid the extra connections.

Check client list

To learn more about handy redis-cli commands, visit the redis-cli documentation.

Using Redis Cache From your Controller

Now that we've connected to the Redis server in the startup file, we'll use it from controller endpoints to run real operations against the Redis server. To access the connection multiplexer created at startup, we need to change the RegisterRedisCache method in Program.cs: we register the multiplexer we created as the implementation of IConnectionMultiplexer by adding the line below.

builder.Services.AddSingleton<IConnectionMultiplexer>(multiplexer);

The method will look like this:
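A sketch of the updated method, assuming the same `RegisterRedisCache` shape described earlier:

```csharp
static void RegisterRedisCache(WebApplicationBuilder builder)
{
    var connectionString = builder.Configuration.GetValue<string>("RedisConnectionString");
    var multiplexer = ConnectionMultiplexer.Connect(connectionString);

    // Expose the single multiplexer instance to controllers via DI.
    // ConnectionMultiplexer is designed to be shared and reused, so a
    // singleton registration is the recommended lifetime.
    builder.Services.AddSingleton<IConnectionMultiplexer>(multiplexer);
}
```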

After registering the multiplexer, we can inject it into any controller and use it to interact with the Redis server. We'll create a controller named CrudController.cs to perform CRUD operations against the Redis server. The code is fairly self-explanatory, and you can exercise the endpoints using Swagger.
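The actual controller lives in the RedisCacheUsage repository; as an illustration, a minimal version doing string CRUD against Redis could look like this (the route and endpoint shapes are assumptions, not the repo's exact code):

```csharp
// Controllers/CrudController.cs
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis;

namespace RedisCacheUsage.Controllers;

[ApiController]
[Route("[controller]")]
public class CrudController : ControllerBase
{
    private readonly IDatabase _database;

    public CrudController(IConnectionMultiplexer multiplexer)
    {
        // GetDatabase() is a cheap pass-through; the multiplexer itself
        // is the expensive, shared object.
        _database = multiplexer.GetDatabase();
    }

    // Read a string value by key.
    [HttpGet("{key}")]
    public async Task<IActionResult> Get(string key)
    {
        var value = await _database.StringGetAsync(key);
        return value.HasValue ? Ok(value.ToString()) : NotFound();
    }

    // Create or update a string value.
    [HttpPost]
    public async Task<IActionResult> Set(string key, string value)
    {
        await _database.StringSetAsync(key, value);
        return Ok();
    }

    // Delete a key.
    [HttpDelete("{key}")]
    public async Task<IActionResult> Delete(string key)
    {
        var removed = await _database.KeyDeleteAsync(key);
        return removed ? Ok() : NotFound();
    }
}
```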

Demo Video

What’s next

In the next part, we'll briefly discuss IDistributedCache and use Redis cache as its implementation. Then we'll design a concurrency guard that stops a single event from running concurrently, and improve it step by step.
