May 18, 2025

Serve Cached Laravel Blade Templates directly from Nginx.

Improve response times and reduce the load on your servers by serving cached Laravel Blade templates directly from your Nginx server.

In this blog post, I’ll do a quick write-up of an experiment I did on a competence day at work. My goal was to create a caching system for a Laravel application using Laravel Blade, without adding more layers to the tech stack such as Varnish or another reverse-proxy caching server.

The experiment consisted of caching a Laravel Blade template as a static HTML file once it was accessed by a client, allowing Nginx to serve this HTML file on any subsequent requests. The results were better than I had expected, so I decided to write about them and share them here on this blog.

Setup

The goal is to keep it as simple as possible, so we won’t do anything extraordinary with the tech stack: just an Nginx reverse proxy and a PHP-FPM backend. Each process runs in its own Docker container on the same server, but since both containers mount the same cache folder from the host, they can both read and write to the cache.

If you’re using this caching system in the cloud with multiple server instances, you’d have to move the cache directory to a shared file system like Google Cloud Filestore or Amazon Elastic File System so that all server instances can modify the shared cache.
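As a sketch of the single-server setup, the shared cache folder could be wired up with bind mounts in a compose file along these lines (the image names, paths, and service names are assumptions for illustration, not taken from the actual repository):

```yaml
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./docker/nginx.conf:/etc/nginx/conf.d/default.conf:ro
      # Nginx serves cached HTML straight from this folder
      - ./public:/var/www/public

  php-fpm:
    image: php:8.3-fpm-alpine
    volumes:
      # the Laravel app writes cached HTML into public/cache,
      # the same host folder that Nginx reads from
      - .:/var/www
```

Both containers see the same `public/cache` directory, which is what lets Nginx serve files that PHP-FPM has written.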

To get this cache system up and running, we only need to make a few adjustments: adding two new variables to the Nginx config and a cache handler class to the Laravel application. Let’s go through the details of the implementation.

Nginx

If you’ve ever been in contact with an Nginx config file, you know that Nginx uses the try_files directive to determine what file to return for incoming requests. This directive takes multiple values in priority order; Nginx looks through them one by one until it finds something it can serve to the client.

For a typical Laravel application, this looks something like try_files $uri /index.php?$query_string;. Here, Nginx first tries to find a file whose name matches the path of the incoming request (this path is automatically set in the $uri variable). If it finds a matching file, which could be an HTML file, an image, a style sheet, or whatever, it serves it; otherwise it falls back to index.php, the entry point of the Laravel application running on the PHP-FPM container in the background.

This means we simply need to add a new variable containing the cache path to make Nginx look in that directory for a cached HTML version of the Laravel Blade template. If no such file exists yet, one is generated while the Laravel application serves the request. The generated file is then returned for any following requests, which never reach the Laravel application, for as long as the file is available.

The second variable we’ll add checks whether the “Cache-Control” header is set to “no-cache”, and helps with invalidation of the cache. If the header has the “no-cache” value, Nginx skips the cache and passes the request on to the Laravel application, which regenerates the cached Blade template file, keeping it up to date. You can set the Cache-Control header explicitly in your request, but it’s also set automatically when you tick the “Disable Cache” checkbox in Chrome DevTools.
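For example, you could force a refresh of a cached page with curl like this (the URL comes from the example in the config comments; adjust host and path to your app):

```shell
# the no-cache header makes Nginx skip the cached file, so the request
# reaches Laravel, which regenerates the cached HTML
curl -H "Cache-Control: no-cache" "http://localhost/products/212?page=1"
```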

These are the changes we need to make to our Nginx config file.

# Set file cache path variable
# when visiting http://localhost/products/212?page=1
# this will be set to cache/localhost/products/212?page=1.html
map $request_uri $file_cache_path {
    / cache/${host}/index.html;
    default cache/${host}${request_uri}.html;
}

# When the Cache-Control: no-cache header is set
# (explicitly or through ticking the "Disable Cache" checkbox in Chrome DevTools)
# we need to change the try_files to a file that doesn't exist so that
# nginx won't find the cached file, triggering a revalidation of the cached file
map $http_cache_control $try_files_file_cache_path {
    no-cache non-existant-path;
    default $file_cache_path;
}

# try_files is changed to include the new $try_files_file_cache_path variable
try_files /$try_files_file_cache_path $uri /index.php?$query_string;

# pass the file cache path variable to our Laravel application running in
# PHP-FPM, so that we can access it as an environment variable, like so:
# env('FILE_CACHE_PATH')
fastcgi_param FILE_CACHE_PATH $file_cache_path;

The last line in the config changes above makes it easy to fetch the correct cache path by passing it to our Laravel application as an environment variable through fastcgi_param; this way we can simply access it with an env('FILE_CACHE_PATH') call in our Laravel app.
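To make the two map blocks above concrete, here is a small standalone PHP sketch (illustration only, not part of the app) that mirrors the path they compute:

```php
<?php
declare(strict_types=1);

// Mirrors the $file_cache_path map: "/" maps to index.html,
// everything else maps to cache/<host><request_uri>.html
// (the query string is part of $request_uri, and therefore
// part of the cached file name)
function fileCachePath(string $host, string $requestUri): string
{
    if ($requestUri === '/') {
        return "cache/{$host}/index.html";
    }

    return "cache/{$host}{$requestUri}.html";
}

// Mirrors the $try_files_file_cache_path map: with Cache-Control: no-cache,
// try_files gets a path that never exists, forcing a cache miss
function tryFilesCachePath(?string $cacheControl, string $host, string $requestUri): string
{
    if (strtolower((string) $cacheControl) === 'no-cache') {
        return 'non-existant-path';
    }

    return fileCachePath($host, $requestUri);
}

echo fileCachePath('localhost', '/products/212?page=1'), PHP_EOL;
// cache/localhost/products/212?page=1.html
```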

Laravel Blade

Now that we have the FILE_CACHE_PATH environment variable available in our PHP application, we can change our return view(...) calls to return (new ViewFileCache())->cacheView(...) for the routes that we want to cache. The ViewFileCache class contains the logic for creating and invalidating the file cache.
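In a controller, that change could look something like this (the controller and view names here are made up for illustration; only the return line differs from a plain Blade controller):

```php
<?php declare(strict_types=1);

namespace App\Http\Controllers;

use App\Cache\ViewFileCache;
use App\Models\Product;

class ProductController extends Controller
{
    public function show(Product $product): string
    {
        // before: return view('products.show', ['product' => $product]);
        return (new ViewFileCache())->cacheView('products.show', [
            'product' => $product,
        ]);
    }
}
```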

When purging the cache, I use Laravel’s Lottery class to check, with odds of 1 in 10, whether the cache has exceeded its maximum size limit. If it has, cached files are removed from oldest to newest until the total cache size is back under the threshold.

<?php declare(strict_types=1);

namespace App\Cache;

use App\Models\Category;
use App\Models\Product;
use Illuminate\Support\Facades\File;
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Lottery;

class ViewFileCache
{
    protected const CACHE_MAX_SIZE = 100000000; // 100 MB

    public function cacheView(string $view, array $data = []): string
    {
        $this->purgeOldCache();
        return $this->storeViewCache($view, $data);
    }

    protected function purgeOldCache(): void
    {
        // if cache size is getting too large, remove old files
        Lottery::odds(1, 10)->winner(function () {
            $allFiles    = collect(File::allFiles(public_path('cache')));
            $totalSize   = $allFiles->sum(fn ($file) => $file->getSize());
            $oldestFirst = $allFiles->sortBy(
                function ($file) { return $file->getCTime(); }
            );

            while ($totalSize > self::CACHE_MAX_SIZE
                && $oldestFirst->count() > 0) {
                $oldestFile = $oldestFirst->shift();
                $totalSize  = $totalSize - $oldestFile->getSize();
                File::delete($oldestFile->getPathname());
            }
        })->choose();
    }

    protected function storeViewCache(string $view, array $data): string
    {
        $html = view($view, $data)->render();
        // FILE_CACHE_PATH is set by Nginx through fastcgi_param;
        // this assumes the 'local' disk is rooted at public/ so
        // that Nginx can find the cached file
        Storage::disk('local')->put(env('FILE_CACHE_PATH'), $html);
        return $html;
    }
}

Cache Invalidation

Cache invalidation and naming things are famously said to be the two hard things in Computer Science; thanks to AI, naming things might not be on that list anymore. Let’s see how cache invalidation is handled in the cache system I set up for this experiment.

As we saw in the Nginx config, when we set the Cache-Control: no-cache header in our request, Nginx will not serve the cached file and the request will be passed on to the PHP-FPM backend which will generate a new version of the cached file.

However, we also need to invalidate and remove related cached files, or the cache will become stale. That involves removing all cached variants of the page that were accessed with query parameters, as well as all listing views that contain the entity that got updated.

In my example, I asked GitHub Copilot to create a simple product catalog app with 1,000 products assigned to 100 different categories. The generated app has a category listing view that shows the products assigned to that specific category. The cache for this category view also needs to be purged when a product is updated, or the category listing page will contain stale data.

To take care of purging the entity itself and all related cached files, I created a middleware and a model observer. The middleware checks whether the Cache-Control: no-cache header is set; if so, it purges the cached files for the requested view so that they get regenerated, and also purges all related category listing views. The model observer listens for created, updated, and deleted events on the Product model and performs the same invalidation of the file cache.

Middleware

<?php declare(strict_types=1);

namespace App\Http\Middleware;

use App\Cache\ViewFileCache;
use Closure;
use Illuminate\Http\Request;

class DeleteProductCache
{
    public function handle(Request $request, Closure $next)
    {
        // normalize the Cache-Control header (empty string when not set)
        $cacheHeader = strtolower(strval($request->header('cache-control')));
        $product     = $request->route()->parameter('product');

        if ($cacheHeader === 'no-cache' && ! is_null($product)) {
            (new ViewFileCache)->deleteProductCache($product);
        }

        return $next($request);
    }
}

Model Observer

<?php declare(strict_types=1);

namespace App\Observers;

use App\Cache\ViewFileCache;
use App\Models\Product;

class ProductObserver
{
    public function updated(Product $product): void
    {
        (new ViewFileCache)->deleteProductCache($product);
    }

    public function created(Product $product): void
    {
        (new ViewFileCache)->deleteCategoryCache($product->category);
    }

    public function deleted(Product $product): void
    {
        (new ViewFileCache)->deleteProductCache($product);
    }
}
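For completeness, the observer has to be registered somewhere; a minimal sketch (assuming a standard AppServiceProvider, which is not shown in this article) could look like this, with the middleware attached per route via ->middleware(DeleteProductCache::class):

```php
<?php declare(strict_types=1);

namespace App\Providers;

use App\Models\Product;
use App\Observers\ProductObserver;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        // invalidate cached Blade views whenever a product changes
        Product::observe(ProductObserver::class);
    }
}
```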

Cache invalidation methods in the ViewFileCache class

<?php declare(strict_types=1);

namespace App\Cache;

use App\Models\Category;
use App\Models\Product;
use Illuminate\Support\Facades\File;
use Illuminate\Support\Facades\Storage;

class ViewFileCache
{
    // [...]

    public function deleteProductCache(Product $product): void
    {
        $domainPath  = env('FILE_CACHE_HOST');
        $productPath = route('products.show', $product, false);
        $cachePath   = $domainPath . $productPath . '*';
        Storage::disk('local')->delete(File::glob($cachePath));
    }

    public function deleteCategoryCache(Category $category): void
    {
        $domainPath   = env('FILE_CACHE_HOST');
        $categoryPath = route('categories.show', $category, false);
        $cachePath    = $domainPath . $categoryPath . '*';
        Storage::disk('local')->delete(File::glob($cachePath));

        // delete product cache for this category
        $products = Product::where('category_id', $category->id)->get();
        $products->map(
            fn (Product $product) => $this->deleteProductCache($product)
        );
    }

    // [...]
}

Results

Now for the question we’ve all been waiting for: was it worth it? To find out, I used the command-line tool siege, set to 10 concurrent clients each making 100 requests, and I must say that the difference was quite remarkable.

without cache: 26.49 secs
with cache:     1.10 secs

Also, since the requests never reach the PHP backend, this saves a lot of resources on the servers while allowing them to handle more requests. The transaction rate for this test on my local machine was a whopping 909 transactions per second (1,000 requests in 1.10 seconds), compared to 38 transactions per second when serving the templates from PHP without the cache.

I enjoyed this experiment because I like thinking of hacks you can apply to the tools you use every day, altering the way you use them to improve user experience and performance while keeping things simple, without adding layers of complexity. This is a simple starting point, but the caching and invalidation logic can of course be improved and adapted to fit any application and implementation.

That’s it for this time. I hope this was a fun read and that it gives someone out there the inspiration to try this out and maybe even build a better, more efficient caching logic based on this concept.

As always, you can find the whole repository used in this article [here].

Until next time, have a good one!