Lesson 24 of 40 Performance Advanced 50 min

Caching Strategies

In this lesson, you will learn how caching reduces repeated work, improves application speed, and introduces design trade-offs around freshness, invalidation, and storage location.


What you will learn

- Why caching improves performance and what it costs
- How in-memory and distributed caching differ
- How to reason about expiration and invalidation
- When caching is worth adding and when it is not

Why this matters: Caching can improve performance dramatically, but poorly designed caching can also create stale results, hidden bugs, and wasted memory.

Part 1: Why caching helps

Caching stores previously computed or retrieved data so the application does not need to perform the same expensive work every time. This can reduce database calls, external API usage, and rendering costs.

Typical things worth caching include:

- Database query results that rarely change
- Responses from external APIs
- Rendered page fragments or view models
- Results of expensive computations

Part 2: In-memory caching

In-memory caching is simple and fast because cached data stays inside the application process. It works well for single-instance applications or when per-instance cache duplication is acceptable.

```csharp
if (!_cache.TryGetValue("products", out List<Product> products))
{
    products = await repository.GetProductsAsync();
    _cache.Set("products", products, TimeSpan.FromMinutes(5));
}
```

This is easy to implement, but each application instance will maintain its own copy.
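To make the mechanics concrete, here is a minimal sketch of what an in-memory cache with absolute expiration does under the hood. The `TtlCache` class is a hypothetical stand-in built on `ConcurrentDictionary`; in a real ASP.NET Core app you would use `IMemoryCache` from `Microsoft.Extensions.Caching.Memory` instead.

```csharp
using System;
using System.Collections.Concurrent;

// A stand-in illustrating in-memory caching with absolute expiration.
// Each application instance holds its own copy of these entries.
public sealed class TtlCache<TKey, TValue> where TKey : notnull
{
    private readonly ConcurrentDictionary<TKey, (TValue Value, DateTime ExpiresAt)> _entries = new();

    public void Set(TKey key, TValue value, TimeSpan ttl) =>
        _entries[key] = (value, DateTime.UtcNow + ttl);

    public bool TryGetValue(TKey key, out TValue value)
    {
        if (_entries.TryGetValue(key, out var entry) && entry.ExpiresAt > DateTime.UtcNow)
        {
            value = entry.Value;   // cache hit: entry exists and has not expired
            return true;
        }
        _entries.TryRemove(key, out _);   // drop expired entries lazily on read
        value = default!;
        return false;
    }
}
```

The expiry check happens on read, so stale entries linger in memory until the next lookup; production caches also evict in the background, which is one reason to prefer a library over rolling your own.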

Part 3: Distributed caching

Distributed caching stores cached data outside the individual application process, often in a shared cache such as Redis. This is useful when multiple application instances need consistent cached values.
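A key consequence of moving the cache out of process is that values must cross a network boundary: distributed caches such as Redis store bytes, not live objects, so everything has to be serialized on the way in and deserialized on the way out. The sketch below mimics the byte-oriented `Get`/`Set` shape of ASP.NET Core's `IDistributedCache` with a hypothetical `ISimpleDistributedCache` interface and an in-process stand-in; in production the same extension methods would sit on top of a Redis-backed implementation.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Text.Json;

// Minimal byte-oriented cache interface, mirroring the shape of IDistributedCache.
public interface ISimpleDistributedCache
{
    byte[]? Get(string key);
    void Set(string key, byte[] value);
}

// In-process stand-in so the example runs without a real Redis server.
public sealed class InProcessCache : ISimpleDistributedCache
{
    private readonly ConcurrentDictionary<string, byte[]> _store = new();
    public byte[]? Get(string key) => _store.TryGetValue(key, out var v) ? v : null;
    public void Set(string key, byte[] value) => _store[key] = value;
}

public static class CacheExtensions
{
    // Serialize the value to JSON bytes before handing it to the cache.
    public static void SetObject<T>(this ISimpleDistributedCache cache, string key, T value) =>
        cache.Set(key, JsonSerializer.SerializeToUtf8Bytes(value));

    // Returns default (null for reference types) on a cache miss.
    public static T? GetObject<T>(this ISimpleDistributedCache cache, string key)
    {
        var bytes = cache.Get(key);
        return bytes is null ? default : JsonSerializer.Deserialize<T>(bytes);
    }
}
```

Serialization cost and network round-trips mean a distributed cache hit is slower than an in-memory hit; the trade is consistency across instances.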

Part 4: Expiration and invalidation

One of the hardest parts of caching is deciding when cached data should expire or be refreshed. Cached data that is too old can make the application incorrect, even if it is fast.

Question Why it matters
How long should it stay cached? Too short reduces value; too long risks stale data
What event should invalidate it? Updates may need to clear or refresh the cache
Should different users share the same cache entry? Some content is global, some is user-specific

Classic challenge: caching is often easy to add, but invalidation is where design quality really matters.
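The second question above, event-driven invalidation, can be sketched as follows: the code that performs an update is also responsible for clearing the cache entry it made stale. `ProductService` and its in-memory "database" are hypothetical stand-ins for this lesson.

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;

// Sketch of explicit invalidation: updates clear the cached entry so the
// next read reloads fresh data. The List here stands in for a real database.
public sealed class ProductService
{
    private readonly ConcurrentDictionary<string, List<string>> _cache = new();
    private readonly List<string> _database = new() { "Keyboard", "Mouse" };

    public List<string> GetProducts() =>
        _cache.GetOrAdd("products", _ => new List<string>(_database));   // miss: load from "database"

    public void AddProduct(string name)
    {
        _database.Add(name);
        _cache.TryRemove("products", out _);   // invalidate: the cached list is now stale
    }
}
```

Forgetting the `TryRemove` call would leave readers seeing the old product list until expiration, which is exactly the stale-data failure described above.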

Part 5: Common caching patterns

The right pattern depends on whether your main concern is speed, consistency, simplicity, or operational control. Common patterns include cache-aside (the application checks the cache and loads on a miss, as in the Part 2 example), read-through (the cache itself loads missing values), write-through (writes update the cache and the backing store together), and write-behind (the cache absorbs writes and flushes them to the store asynchronously).
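Cache-aside is the most common pattern, and the check-then-load-then-store dance can be wrapped in one generic helper. This sketch mirrors what `IMemoryCache.GetOrCreateAsync` provides; the dictionary-backed cache is a hypothetical stand-in with no expiration, kept minimal to show the pattern itself.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Cache-aside in one helper: check the cache, and on a miss invoke the
// loader exactly once and store the result for subsequent callers.
public static class CacheAside
{
    private static readonly ConcurrentDictionary<string, object> _cache = new();

    public static async Task<T> GetOrCreateAsync<T>(string key, Func<Task<T>> loader)
    {
        if (_cache.TryGetValue(key, out var hit))
            return (T)hit;              // hit: skip the expensive load

        var value = await loader();     // miss: do the work once
        _cache[key] = value!;
        return value;
    }
}
```

Note that under concurrency two callers can miss simultaneously and both invoke the loader; real libraries offer per-key locking if duplicate loads are too expensive.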

When to cache and when not to

Situation Guidance
Frequently requested stable data Good caching candidate
Highly volatile data Cache cautiously or keep duration short
User-specific sensitive data Use careful scoping or avoid shared caching
Cheap operations Caching may add more complexity than value

A practical caching workflow

Step 1: Identify expensive or repeated work
Step 2: Choose cache scope and storage type
Step 3: Define expiration and invalidation rules
Step 4: Measure whether caching actually improves performance
Step 5: Monitor memory and correctness impacts
Step 6: Refine based on real usage patterns
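Step 4 in the workflow above can be as simple as timing the same lookup cold and warm before deciding the cache earns its complexity. The 50 ms `Thread.Sleep` below is a stand-in for a real database query.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

// Measure the same lookup with and without the cache to confirm the win.
public static class CacheBenchmark
{
    private static readonly Dictionary<string, string> _cache = new();

    private static string SlowQuery(string key)
    {
        Thread.Sleep(50);                 // simulated expensive work
        return key.ToUpperInvariant();
    }

    public static (long ColdMs, long WarmMs) Measure(string key)
    {
        var sw = Stopwatch.StartNew();
        _cache[key] = SlowQuery(key);     // cold path: pays the full query cost
        long cold = sw.ElapsedMilliseconds;

        sw.Restart();
        _ = _cache[key];                  // warm path: served from the cache
        long warm = sw.ElapsedMilliseconds;
        return (cold, warm);
    }
}
```

If the cold and warm numbers are close, the operation was cheap to begin with and, per the table above, caching may add more complexity than value.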

Best practices

- Cache only work you have measured to be expensive or frequently repeated
- Always set an expiration; unbounded entries become stale data and memory leaks
- Clear or refresh cache entries when the underlying data changes
- Keep user-specific and sensitive data out of shared cache entries
- Monitor hit rates, memory use, and correctness after the cache ships

Summary

In this lesson, you learned how caching improves performance, how in-memory and distributed caching differ, and why expiration and invalidation are central to good cache design.

In the next lesson, you will move into configuration and the options pattern.