.NET Core Web API - Memory Caching

How to Use In-Memory Caching for .NET Core Web APIs

Caching is a common technique for making applications more performant and scalable. Once the application has computed a result, it is stored in a cache so that the next time the same request arrives, the cached result can be returned instead of processing the request again.

The Cache-Aside pattern is widely used nowadays, and some form of caching appears in almost every application. There are various caching stores that can be used to implement it.

Examples include a SQL Server database, Redis Cache, and others. A production environment often runs as a web farm consisting of multiple web servers, so using a distributed cache store is very common.

Even though distributed cache stores are common, an in-memory cache can be useful while setting up a new project. In-memory caching may also be all that is needed in some small to medium-sized applications.

In this article, we will have a look at the in-memory cache implementation and how it can be used with .NET Core Web APIs.

What is in-memory caching?

In-memory caching refers to caching computation results in the memory of the machine where the application is running. This means that if the web server is restarted, the cache store is flushed. It can be helpful in the early stages of development or on a development machine.

In-memory caching can also be used in a web farm if sticky sessions are configured. Sticky sessions mean that requests originating from a user session are always forwarded to the same server. For example, Azure Web Apps use Application Request Routing (ARR) to route all subsequent requests from a client to the same server.

How to use it with .NET Core APIs?

If we wish to use in-memory caching, the basic interface to use is IMemoryCache. This interface can be injected into API controllers. The injected instance can be used to check whether a computed result is already cached, and the same instance can be used to add new objects to the cache.

To use in-memory caching, follow the steps sketched below.
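First, the in-memory cache service is registered, and then IMemoryCache is injected into a controller. The following is a minimal sketch assuming the minimal hosting model; the ReportsController name and route are illustrative, not part of the original article.

```csharp
// Program.cs: register the default in-memory cache implementation
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMemoryCache();
builder.Services.AddControllers();

var app = builder.Build();
app.MapControllers();
app.Run();
```

```csharp
// ReportsController.cs: inject IMemoryCache via the constructor
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

[ApiController]
[Route("api/[controller]")]
public class ReportsController : ControllerBase
{
    private readonly IMemoryCache _cache;

    public ReportsController(IMemoryCache cache)
    {
        _cache = cache;
    }
}
```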

The IMemoryCache instance provides the following methods:

  • TryGetValue – to check if any value exists for a given key
  • Set – to set a value for a given key

While setting a key, cache options can optionally be provided to control the expiration of the item. This expiration can be sliding or absolute. The code example below shows the usage of these methods, with expiration set to sliding.
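A minimal sketch of such an action method is shown below (it would live in the controller sketched earlier); the cache key format and the computed string stand in for whatever expensive result the API actually produces.

```csharp
[HttpGet("{id}")]
public IActionResult Get(int id)
{
    string cacheKey = $"report-{id}";

    // Check whether a value already exists for this key
    if (!_cache.TryGetValue(cacheKey, out string report))
    {
        // Cache miss: compute the result (in a real app this would be the expensive work)
        report = $"Report {id} generated at {DateTime.UtcNow:O}";

        // Keep the entry alive as long as it is accessed at least once every 30 seconds
        var options = new MemoryCacheEntryOptions()
            .SetSlidingExpiration(TimeSpan.FromSeconds(30));

        // Set the value for the given key along with its expiration options
        _cache.Set(cacheKey, report, options);
    }

    return Ok(report);
}
```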

Expiration – Sliding vs Absolute

Every cached item may become stale after some time. When a cached object becomes stale, the cache no longer returns the previously computed result, which makes the application process the request again. The fresh result can then be cached.

An expiration setting can be applied to the cache store or to a cache item to decide when the item becomes stale. Absolute expiration means that, no matter how frequently a cached item is accessed, it will become stale after a fixed time. For example, if an item is set to expire after 30 seconds, it expires exactly 30 seconds after its insertion into the cache, no matter how many times the item was fetched from the cache in those 30 seconds.

Sliding expiration provides a way to remove cached items that are not frequently accessed. If a sliding expiration of 30 seconds is enabled on an item, the item expires only if it was not accessed in the last 30 seconds.

Combination of Sliding and Absolute Expiration

If an item is always accessed more frequently than its sliding expiration interval, there is a risk that the item will never expire. To avoid this, sliding expiration is generally combined with absolute expiration.

When absolute expiration is specified along with sliding expiration, a logical OR is used to decide whether an item should be marked as expired: if either the sliding expiration interval or the absolute expiration time has passed, the item is evicted from the cache.

While adding items to the memory cache, options can be set to mark an item with absolute and/or sliding expiration, as shown below.
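Continuing the earlier sketch (cacheKey and report are the same illustrative variables), combining both expirations on an entry might look like this; the 30-second and 5-minute values are arbitrary examples:

```csharp
var options = new MemoryCacheEntryOptions()
    // Expire the item if it is not accessed for 30 seconds...
    .SetSlidingExpiration(TimeSpan.FromSeconds(30))
    // ...but never keep it longer than 5 minutes in total
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(5));

_cache.Set(cacheKey, report, options);
```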

Limits

ASP.NET Core does not put any limit on the size of the cache. Also, the cache cannot calculate the size of the items it stores. As per the documentation, developers are recommended to set a cache size limit.

Generally, while calling AddMemoryCache, options can be set to provide a SizeLimit for the cache store. This size limit does not have any unit; it is up to the application to decide what a unit means. Then, while adding each individual item, SetSize can be used to specify the approximate size of the item being added.

For example, the code below shows a total cache size limit of 1024 units, where the item being added has a size of 1 unit.
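A sketch of this configuration, again reusing the illustrative cacheKey and report variables, is below. Note that once SizeLimit is set, every entry added to the cache must declare a size via SetSize, otherwise the insert throws.

```csharp
// Program.cs: cap the cache at 1024 units (the meaning of a unit is up to the app)
builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024;
});

// When adding an item, declare its size relative to the configured limit
var entryOptions = new MemoryCacheEntryOptions()
    .SetSize(1)
    .SetSlidingExpiration(TimeSpan.FromSeconds(30));

_cache.Set(cacheKey, report, entryOptions);
```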

Apps can also call Compact or Remove to evict entries when available memory is limited.
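For reference, Remove is available directly on IMemoryCache, while Compact is defined on the concrete MemoryCache type; the fraction passed to Compact is the percentage of entries to evict.

```csharp
// Remove a single entry by key
_cache.Remove(cacheKey);

// Compact evicts a percentage of entries (here roughly 25%),
// starting with expired, low-priority, and least recently used items
if (_cache is MemoryCache concreteCache)
{
    concreteCache.Compact(0.25);
}
```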

I hope this information was useful. Let me know your thoughts.
