Caching Strategies and Configuration
In this lesson, we'll explore how to significantly improve your Nginx performance through intelligent caching strategies. Building on your knowledge of reverse proxy and static content serving, you'll learn to implement various caching mechanisms that reduce server load and accelerate content delivery.
Learning Goals:
- Configure proxy caching for backend applications
- Implement browser caching for static assets
- Set up FastCGI caching for PHP applications
- Understand cache invalidation strategies
- Monitor and troubleshoot cache performance
Understanding Nginx Caching Benefits
Caching stores frequently accessed content in memory or on disk, reducing the need to generate or fetch the same content repeatedly. This provides several key benefits:
- Reduced backend load: Cached responses don't hit your application servers
- Faster response times: Cached content serves directly from Nginx
- Better scalability: Handle more traffic with the same resources
- Improved user experience: Pages load faster for returning visitors
Proxy Cache Configuration
Proxy caching stores responses from backend servers. When multiple users request the same resource, Nginx serves the cached version instead of forwarding requests to upstream servers.
Basic Proxy Cache Setup
http {
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m
                     max_size=10g inactive=60m use_temp_path=off;
    # levels=1:2         two-level directory hierarchy under /var/cache/nginx
    # keys_zone=my_cache:10m  name and size of the shared memory zone for cache
    #                    keys (1 MB holds roughly 8,000 keys)
    # max_size=10g       disk usage limit; the cache manager evicts the least
    #                    recently used entries once it is exceeded
    # inactive=60m       entries not accessed for 60 minutes are removed,
    #                    regardless of whether they are still fresh
    # use_temp_path=off  write response files directly into the cache directory

    # Assumes an upstream group (or resolvable host) named "backend"
    server {
        listen 80;
        server_name example.com;

        location / {
            proxy_pass http://backend;
            proxy_cache my_cache;
            proxy_cache_valid 200 302 10m;
            proxy_cache_valid 404 1m;
            proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
            add_header X-Cache-Status $upstream_cache_status;
        }
    }
}
The X-Cache-Status header is invaluable for debugging. It exposes $upstream_cache_status, which shows whether the response was served from cache (HIT), fetched from the backend because it was not cached (MISS), deliberately skipped the cache (BYPASS), or landed in another state such as EXPIRED, STALE, UPDATING, or REVALIDATED.
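A quick way to inspect the header is with curl against the example host above; the first request warms the cache and the second should normally report HIT:
curl -sI http://example.com/ | grep -i x-cache-status   # typically MISS
curl -sI http://example.com/ | grep -i x-cache-status   # typically HIT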
Advanced Cache Controls
location /api/ {
    proxy_pass http://api_backend;
    proxy_cache my_cache;
    # Cache only successful responses for 5 minutes
    proxy_cache_valid 200 5m;
    # Skip the cache for POST requests; $bypass_cache is set by an
    # http-level map (see the sketch after this block)
    proxy_cache_bypass $bypass_cache;
    proxy_no_cache $bypass_cache;
    # Define cache key including request method and URI
    proxy_cache_key "$scheme$request_method$host$request_uri";
    # Serve stale content while updating
    proxy_cache_background_update on;
    proxy_cache_use_stale updating;
}

location /static/ {
    proxy_pass http://static_backend;
    proxy_cache my_cache;
    # Longer cache for static assets
    proxy_cache_valid 200 302 1h;
    proxy_cache_valid 404 5m;
    # Cache only after a URL has been requested at least 3 times
    proxy_cache_min_uses 3;
}
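The $bypass_cache variable above is not built in; it would be defined by a map at the http level, sketched here (the variable name is our own). Strictly speaking, POST responses are never cached anyway because proxy_cache_methods defaults to GET and HEAD, but an explicit flag keeps the intent visible and is easy to extend with more conditions.
# In the http block: flag request methods that should skip the cache
map $request_method $bypass_cache {
    default 0;
    POST    1;
}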
Browser Caching with Expires and Cache-Control
Browser caching tells client browsers how long to store assets locally before requesting them again from the server.
server {
    listen 80;
    server_name assets.example.com;

    # CSS, JS, images - cache for 1 year
    location ~* \.(css|js|png|jpg|jpeg|gif|ico|svg|woff|woff2)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
        add_header Vary "Accept-Encoding";
    }

    # HTML files - cache for 15 minutes
    location ~* \.html$ {
        expires 15m;
        add_header Cache-Control "public";
    }

    # API responses - no cache
    location /api/ {
        expires -1;
        add_header Cache-Control "no-cache, no-store, must-revalidate";
        add_header Pragma "no-cache";
    }
}
Be careful with long cache times for frequently updated assets. With immutable, year-long caching, browsers will not re-request a file at all, so use cache-busting techniques such as filename versioning (style.v2.css) whenever the file's contents change.
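One way to support filename versioning without renaming files on disk is a capture-based location that strips the version token; this is a sketch, and the /assets/ prefix and .vN. naming convention are assumptions rather than anything prescribed above:
# Serve /assets/style.v2.css from /assets/style.css on disk, so deploys only
# change the reference in HTML, never the stored file name
location ~* ^(/assets/.+)\.v\d+\.(css|js)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
    try_files $1.$2 =404;
}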
FastCGI Cache for PHP Applications
If you're running PHP applications with PHP-FPM, FastCGI caching can dramatically improve performance.
http {
    fastcgi_cache_path /var/cache/nginx/fastcgi levels=1:2
                       keys_zone=php_cache:10m max_size=10g inactive=60m;

    server {
        listen 80;
        server_name phpapp.example.com;

        set $skip_cache 0;

        # Don't cache POST requests
        if ($request_method = POST) {
            set $skip_cache 1;
        }

        # Don't cache logged-in users
        if ($http_cookie ~* "wordpress_logged_in") {
            set $skip_cache 1;
        }

        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php/php8.1-fpm.sock;
            fastcgi_index index.php;
            include fastcgi_params;

            fastcgi_cache php_cache;
            # Unlike proxy_cache_key, fastcgi_cache_key has no default and must be set
            fastcgi_cache_key "$scheme$request_method$host$request_uri";
            fastcgi_cache_valid 200 301 302 10m;
            fastcgi_cache_valid 404 1m;
            fastcgi_cache_bypass $skip_cache;
            fastcgi_no_cache $skip_cache;
            add_header X-FastCGI-Cache $upstream_cache_status;
        }
    }
}
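To confirm the cache is working, request the same PHP page twice and watch the X-FastCGI-Cache header flip from MISS to HIT (the hostname and path are the placeholders from the example above):
# First request populates the cache; the second should report HIT
for i in 1 2; do
    curl -s -o /dev/null -D - http://phpapp.example.com/index.php | grep -i x-fastcgi-cache
done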
Cache Invalidation Strategies
Proper cache invalidation ensures users see fresh content when needed while maintaining performance benefits.
Manual Cache Purging
The proxy_cache_purge directive used below is not included in stock open-source Nginx; it comes from the third-party ngx_cache_purge module (NGINX Plus ships its own purging support with slightly different syntax).
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://backend;
        proxy_cache my_cache;
        proxy_cache_key "$scheme$request_method$host$request_uri";
    }

    # Purge endpoint (secure this in production!)
    location ~ /purge(/.*) {
        allow 127.0.0.1;
        allow 192.168.1.0/24;
        deny all;
        proxy_cache_purge my_cache "$scheme$request_method$host$1";
    }
}
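If installing an extra module is not an option, a lighter-weight alternative is forcing a refresh with proxy_cache_bypass: a bypassed request is fetched from the backend and the fresh response then replaces the cached copy. This is a sketch; the X-Cache-Refresh header name is invented for illustration, so restrict or keep secret whatever trigger you choose.
location / {
    proxy_pass http://backend;
    proxy_cache my_cache;
    # Any non-empty value other than "0" in this request header skips the
    # cache lookup; the fresh backend response is then stored in the cache
    proxy_cache_bypass $http_x_cache_refresh;
}
Trigger it per URL with, for example, curl -H "X-Cache-Refresh: 1" http://example.com/some/page.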
Cache Segmentation by User Role
Directives such as proxy_cache_bypass cannot appear inside an if block in a location, so the per-role decision is made with a map instead:
map $http_cookie $cache_zone {
    ~*wordpress_logged_in "off";
    default "my_cache";
}

server {
    location / {
        proxy_pass http://backend;
        # proxy_cache accepts a variable (nginx 1.7.9+); the special value
        # "off" disables caching for that request, so logged-in users always
        # hit the backend. To segment rather than skip, map logged-in users
        # to a second zone defined with proxy_cache_path.
        proxy_cache $cache_zone;
    }
}
Monitoring Cache Performance
Monitor your cache effectiveness to optimize configurations.
# Check disk usage of the cache
du -sh /var/cache/nginx/
# Spot-check whether a single URL is served from cache
curl -sI http://example.com/ | grep -i x-cache-status
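For an ongoing view of the hit ratio, note that response headers are not written to the access log by default; record $upstream_cache_status in a log format and summarize it. The format name, layout, and log path below are assumptions to adapt to your setup.
# In the http block: log the cache status with every request
log_format cache_log '$remote_addr [$time_local] "$request" '
                     '$status $upstream_cache_status';
access_log /var/log/nginx/cache.log cache_log;
Then count the statuses (the cache status is the last field of each line):
awk '{print $NF}' /var/log/nginx/cache.log | sort | uniq -c | sort -rn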
# Basic status endpoint (connection metrics - restrict access in production)
server {
    listen 8080;
    server_name localhost;

    location /cache-status {
        stub_status on;
        access_log off;
        allow 127.0.0.1;
        deny all;
    }
}
Note that stub_status reports connection-level metrics (active connections, accepted and handled connections, total requests), not cache hit ratios; for cache statistics, use the access-log approach above or NGINX Plus's live activity monitoring.
Common Pitfalls
- Over-caching dynamic content: User-specific or frequently changing content shouldn't be cached for long periods
- Ignoring cache invalidation: Without proper invalidation, users may see stale content
- Insufficient cache storage: Monitor cache disk usage to prevent filling storage
- Caching sensitive data: Ensure authentication tokens and personal data are never cached
- Forgetting Vary headers: Missing Vary headers can cause compressed responses to be served to clients that never asked for compression (and vice versa); see the snippet after this list
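A minimal guard, assuming gzip compression is enabled for the assets in question: gzip_vary emits Vary: Accept-Encoding automatically, and adding $http_accept_encoding to the proxy cache key is a blunt but effective way to keep compressed and uncompressed copies apart (nginx 1.7.7+ also honors a backend's Vary response header on its own):
# For assets nginx compresses itself
gzip on;
gzip_vary on;   # adds "Vary: Accept-Encoding" to compressible responses

# For proxied, cached responses: separate variants explicitly in the key
proxy_cache_key "$scheme$request_method$host$request_uri$http_accept_encoding";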
Summary
Effective caching is one of the most impactful performance optimizations you can implement with Nginx. By combining proxy caching for backend responses, browser caching for static assets, and FastCGI caching for PHP applications, you can dramatically reduce server load and improve user experience. Remember to monitor cache hit ratios and implement proper invalidation strategies to ensure content freshness.
Nginx Caching and Proxy Cache Directives
What does the `proxy_cache_path` directive's `inactive` parameter control?