Optimizing Performance with Caching in .NET REST APIs

In the world of web development, performance is a crucial factor that can make or break an application. One of the most effective strategies to enhance performance is caching. Through caching, we can reduce the load on our servers, decrease response times, and improve the overall user experience.

In this blog post, I will share my experience with optimizing .NET REST APIs using caching, including a couple of practical techniques with code snippets to demonstrate their implementation.

Introduction

Performance optimization is a key consideration when building REST APIs with .NET. Users expect fast and reliable responses, and slow APIs can lead to frustration and decreased usage.

Caching is a technique for storing frequently accessed data in a temporary storage location, reducing the need to fetch the same data repeatedly from the database or other sources. By leveraging caching, we can significantly improve the performance of our APIs.

How do we measure current performance?

You can use various Treblle tools to monitor API performance. To follow along, be sure to:

  1. Register or log in to the Treblle Web Monitoring App.

  2. Check the API Insights app.

  3. (Optional) Read this article on how to use the API Insights extension directly in Visual Studio Code.

API Score dashboard in Treblle

From the Dashboard in the Treblle Web Monitoring app, you can see your API score. One of the things that is measured is performance. The "Read more" button allows you to check all the details.


Okay, let's take a look at some recommendations for caching.

Cache Guidelines

Microsoft provides a helpful set of guidelines for implementing a cache correctly:

  • Code should always have a fallback path to fetch data and never depend on a cached value being available.

  • The cache uses a scarce resource, memory. Limit cache growth:

    • Do not insert external input into the cache. For example, using arbitrary user-provided input as a cache key is not recommended, since arbitrary input can consume an unpredictable amount of memory.

    • Use expirations to limit cache growth.

    • Use SetSize, Size, and SizeLimit to limit cache size. The ASP.NET Core runtime does not limit cache size based on memory pressure; it is up to the developer to do so (see the sketch below).
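
To make that last guideline concrete, here is a minimal sketch of how SizeLimit and Size work together. It assumes an injected IMemoryCache (_memoryCache) and a product to cache; the size units and key name are illustrative, not prescribed values:

builder.Services.AddMemoryCache(options =>
{
    // Total size budget for the cache. The units are arbitrary and
    // defined by the developer; here we count one unit per entry.
    options.SizeLimit = 1024;
});

// Once a SizeLimit is set, every entry must declare its Size;
// inserting an entry without one throws an InvalidOperationException.
var entryOptions = new MemoryCacheEntryOptions
{
    Size = 1,
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
};
_memoryCache.Set("Product_1", product, entryOptions);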

Let's take a look at some implementations.

Technique 1: In-Memory Caching

One of the simplest and most effective caching strategies is in-memory caching. In-memory caching stores data directly in the application server's memory, providing extremely fast access times. This technique is ideal for small to medium-sized datasets that are frequently accessed.

To implement in-memory caching in a .NET REST API, we can use the IMemoryCache interface provided by the Microsoft.Extensions.Caching.Memory package.

Here’s a step-by-step guide to implementing in-memory caching:

Step 1: Install the necessary package

First, we need to install the Microsoft.Extensions.Caching.Memory package:

dotnet add package Microsoft.Extensions.Caching.Memory

Step 2: Configure services in Program.cs

Next, we need to configure the caching services in our Program.cs file:

builder.Services.AddMemoryCache();

Step 3: Use IMemoryCache in your controller

Now, we can use the IMemoryCache interface in our controller to cache data:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;
using System;

namespace MyApi.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class ProductsController : ControllerBase
    {
        private readonly IMemoryCache _memoryCache;

        public ProductsController(IMemoryCache memoryCache)
        {
            _memoryCache = memoryCache;
        }

        [HttpGet("{id}")]
        public IActionResult GetProduct(int id)
        {
            string cacheKey = $"Product_{id}";
            if (!_memoryCache.TryGetValue(cacheKey, out Product product))
            {
                // Simulate fetching data from a database
                product = FetchProductFromDatabase(id);

                // Set cache options: the entry expires 10 minutes after it
                // is created, or after 2 minutes without being accessed,
                // whichever comes first.
                var cacheOptions = new MemoryCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
                    SlidingExpiration = TimeSpan.FromMinutes(2)
                };

                // Save data in cache
                _memoryCache.Set(cacheKey, product, cacheOptions);
            }

            return Ok(product);
        }

        private Product FetchProductFromDatabase(int id)
        {
            // Simulate database fetch
            return new Product { Id = id, Name = "Sample Product", Price = 99.99m };
        }
    }

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }
}

In this example, we first check if the product is already in the cache. If it is not, we fetch it from the database (simulated here) and then store it in the cache with specific cache options.
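
As a side note, IMemoryCache also ships with a GetOrCreateAsync extension method that collapses the check-fetch-store pattern into a single call. The sketch below is an alternative to the GetProduct action above (same controller fields assumed), not an addition to it:

[HttpGet("{id}")]
public async Task<IActionResult> GetProductAsync(int id)
{
    // GetOrCreateAsync runs the factory only on a cache miss and stores
    // the result using the options configured on the cache entry.
    var product = await _memoryCache.GetOrCreateAsync($"Product_{id}", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
        entry.SlidingExpiration = TimeSpan.FromMinutes(2);
        return Task.FromResult(FetchProductFromDatabase(id));
    });

    return Ok(product);
}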

My experience

From my experience, IMemoryCache in .NET is great for quickly boosting performance because it stores data in the application's own memory, making access extremely fast. However, it is limited by the amount of memory on the server, and it doesn't work well in distributed systems since the data is lost whenever the application restarts. I've found that a solution like Redis is a better option for larger applications or those that need to share cached data across multiple servers.

Technique 2: Distributed Caching with Redis

In-memory caching might not be sufficient for larger applications or distributed systems. In such cases, we can use distributed caching.

Redis is a popular choice for distributed caching due to its high performance and scalability.
Let's see how to implement it.

Step 1: Install the necessary packages

First, we need to install the Microsoft.Extensions.Caching.StackExchangeRedis package:

dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis

Step 2: Configure services in the Program.cs

builder.Services.AddStackExchangeRedisCache(options => { 
     options.Configuration = "localhost:6379"; 
     options.InstanceName = "SampleInstance"; 
});
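
In a real deployment you would typically avoid hardcoding the endpoint. A common refinement, assuming a "Redis" entry exists in the ConnectionStrings section of appsettings.json, is to read it from configuration:

builder.Services.AddStackExchangeRedisCache(options =>
{
    // Read the Redis endpoint from configuration instead of hardcoding it.
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
    options.InstanceName = "SampleInstance";
});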

Step 3: Use IDistributedCache in your controller

Now, we can use the IDistributedCache interface in our controller to cache data:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Distributed;
using System;
using System.Text.Json;
using System.Threading.Tasks;

namespace MyApi.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class ProductsController : ControllerBase
    {
        private readonly IDistributedCache _distributedCache;

        public ProductsController(IDistributedCache distributedCache)
        {
            _distributedCache = distributedCache;
        }

        [HttpGet("{id}")]
        public async Task<IActionResult> GetProduct(int id)
        {
            string cacheKey = $"Product_{id}";
            var productJson = await _distributedCache.GetStringAsync(cacheKey);
            Product product;

            if (string.IsNullOrEmpty(productJson))
            {
                // Simulate fetching data from a database
                product = FetchProductFromDatabase(id);

                // Save data in cache
                productJson = JsonSerializer.Serialize(product);
                var cacheOptions = new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
                    SlidingExpiration = TimeSpan.FromMinutes(2)
                };

                await _distributedCache.SetStringAsync(cacheKey, productJson, cacheOptions);
            }
            else
            {
                product = JsonSerializer.Deserialize<Product>(productJson);
            }

            return Ok(product);
        }

        private Product FetchProductFromDatabase(int id)
        {
            // Simulate database fetch
            return new Product { Id = id, Name = "Sample Product", Price = 99.99m };
        }
    }

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }
}

In this example, we use the IDistributedCache interface to cache the product data in Redis. We serialize the product to JSON before storing it in the cache and deserialize it when retrieving it from the cache.
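
Since every caller has to repeat this serialize/deserialize dance, it can be worth extracting it into a small helper. The extension methods below are a hypothetical sketch (they are not part of the framework, and the names are my own):

using Microsoft.Extensions.Caching.Distributed;
using System.Text.Json;
using System.Threading.Tasks;

public static class DistributedCacheExtensions
{
    // Hypothetical typed "get": returns default when the key is missing.
    public static async Task<T?> GetObjectAsync<T>(this IDistributedCache cache, string key)
    {
        var json = await cache.GetStringAsync(key);
        return json is null ? default : JsonSerializer.Deserialize<T>(json);
    }

    // Hypothetical typed "set": serializes the value to JSON before storing it.
    public static Task SetObjectAsync<T>(this IDistributedCache cache, string key, T value,
        DistributedCacheEntryOptions options)
    {
        return cache.SetStringAsync(key, JsonSerializer.Serialize(value), options);
    }
}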

From my experience, IDistributedCache is excellent for applications that need to share cached data across multiple servers. It provides better scalability and persistence compared to IMemoryCache. With IDistributedCache, you can use solutions like Redis, which stores data outside the application and keeps it available after a restart. Setting it up is more complex and may introduce additional overhead, but for larger, distributed systems the benefits make it worth the effort.

Pros of Caching

  1. Improved Performance:

    • Faster Data Retrieval: Caching stores frequently accessed data in a fast storage medium, reducing the time needed to fetch data from slower primary sources like databases or external APIs.

    • Reduced Latency: Since the data is stored closer to the application (in-memory or nearby cache servers), the latency for data retrieval is significantly reduced.

  2. Reduced Load on Primary Sources:

    • Database Offloading: By serving requests from the cache, the load on the primary data source, such as a database, is reduced, leading to better performance and scalability.

    • Resource Efficiency: This offloading allows for more efficient use of server resources, which can be allocated to other tasks.

  3. Scalability:

    • Horizontal Scalability: Distributed caching solutions like Redis or Memcached can be scaled horizontally, allowing the application to handle more concurrent users and larger datasets.

    • Improved Availability: Caches can provide high data availability, even if the primary data source experiences downtime.

  4. Cost Efficiency:

    • Reduced Infrastructure Costs: By reducing the load on primary sources, caching can help reduce the need for expensive database infrastructure scaling.

    • Improved Resource Utilization: Efficient caching strategies can better use existing resources, delaying the need for additional investments.

Cons of Caching

  1. Cache Invalidation:

    • Stale Data: One of the main challenges of caching is ensuring that the cached data is up-to-date. Incorrectly managed cache invalidation can serve users outdated or stale data (see the invalidation sketch after this list).

    • Complexity: Managing cache expiration policies and ensuring data consistency adds complexity to the application.

  2. Memory Overhead:

    • Resource Consumption: Caching consumes memory or disk space. For in-memory caches, this can lead to increased memory usage, which can be a concern in resource-constrained environments.

    • Management Overhead: Distributed caches require additional infrastructure and management, which can introduce additional overhead.

  3. Cache Misses:

    • Performance Penalty: When a cache miss occurs (data is not found in the cache), the primary data source must handle the request, potentially leading to higher latency.

    • Cold Starts: After a cache is first deployed or cleared, it may experience many misses until it is sufficiently populated with frequently accessed data.

  4. Complexity in Multi-Tier Architectures:

    • Synchronization: In multi-tier architectures, keeping caches in sync across different layers or services can be challenging, especially when dealing with distributed systems.

    • Consistency: Ensuring strong consistency between cached data and the primary data source requires careful design and may involve trade-offs between performance and consistency.

  5. Security Risks:

    • Sensitive Data: Storing sensitive data in caches without proper encryption or access controls can expose it to unauthorized access or data breaches.

    • Data Leakage: Misconfigured caches can inadvertently leak data, especially in shared caching environments.
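
To make the invalidation point above concrete, here is a minimal sketch of explicit eviction when the underlying data changes. It assumes a controller with both IMemoryCache and IDistributedCache injected as in the earlier examples, and the SaveProductToDatabase helper is hypothetical:

[HttpPut("{id}")]
public async Task<IActionResult> UpdateProduct(int id, Product updated)
{
    // Hypothetical persistence call: write the new state first.
    SaveProductToDatabase(updated);

    // Evict the stale entries so subsequent reads repopulate the cache.
    string cacheKey = $"Product_{id}";
    _memoryCache.Remove(cacheKey);                  // in-memory cache
    await _distributedCache.RemoveAsync(cacheKey);  // distributed (Redis) cache

    return NoContent();
}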

Conclusion

Caching offers significant performance, scalability, and cost-efficiency benefits.

However, it also introduces challenges related to cache invalidation, memory overhead, and complexity in maintaining consistency. By carefully designing and managing caching strategies, developers can harness the power of caching while mitigating its drawbacks.
Also, if you need to optimize your .NET API from another perspective, be sure to read my article at https://blog.treblle.com/performance-optimization-techniques-for-net-apis/.

And yes, remember to monitor your APIs. That's the only way you'll find out why and what you need to optimize.

Frequently Asked Questions (FAQ)

What is caching in the context of .NET REST APIs?

Caching in .NET REST APIs is a technique for storing frequently accessed data in a temporary storage location. This reduces the need to fetch the same data repeatedly from the database, improving API performance by decreasing server load and response times.

How can you implement in-memory caching in a .NET REST API?

To implement in-memory caching in a .NET REST API, use the IMemoryCache interface from the Microsoft.Extensions.Caching.Memory package. Install the package, configure the caching services in Program.cs, and use IMemoryCache in the controller to store and retrieve cached data.

What are the pros and cons of caching?

Caching has many advantages, including improved performance, reduced latency, offloaded database load, scalability, and cost efficiency. However, it also has disadvantages, including challenges with cache invalidation, memory overhead, cache misses, complexity in multi-tier architectures, and security risks.

What is the difference between in-memory caching and distributed caching?

In-memory caching stores data in the memory of a single application server, offering fast access but limited scalability and persistence. Distributed caching, such as Redis, stores data across multiple servers, providing better scalability and persistence, making it suitable for larger applications and distributed systems.