July 20, 2025

Laravel Non-Blocking HTTP Requests in the Background

Performing HTTP requests in the background by moving them out of PHP, creating a truly asynchronous environment where PHP doesn't have to wait for HTTP responses.

In this blog post, I’ll write about an issue I encountered while developing an API at work. The API receives a high volume of traffic, and whenever a resource is created or updated, we send webhook callbacks to URLs that our integration partners have registered, notifying them about the changes.

This means we send HTTP requests to URLs and servers that we don’t control ourselves. The problem we faced was that some of these servers were sometimes slow to respond, forcing us to keep connections open while waiting for a response to come back.

To make sure that the application servers would stay responsive, we pushed these outgoing requests to the queue to be processed on separate server instances. We had several queue workers running, constantly processing queue jobs, but since the process of waiting for an HTTP response is blocking in PHP, this still caused the queue to overflow with jobs, leading to latency in our outgoing webhook callbacks.
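
For context, here is a rough sketch of what one of these queued webhook jobs might look like; the class name, payload, and dispatch call are illustrative, not our actual code.

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Http;

class SendWebhookCallback implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function __construct(
        public string $url,    // callback URL registered by the integration partner
        public array $payload, // data describing the created or updated resource
    ) {}

    public function handle(): void
    {
        // the queue worker blocks here until the partner's server responds
        Http::post($this->url, $this->payload);
    }
}

// dispatched whenever a resource is created or updated
SendWebhookCallback::dispatch($partnerUrl, $resourceData);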

I needed to come up with a way for us to send outgoing HTTP requests without blocking PHP. Let’s look at the things I tried and how I ended up solving the main issue.

HTTP Timeout

When sending HTTP requests in Laravel, you’d normally do something like this, using either Guzzle directly or Laravel’s elegant HTTP Client. This is how we first sent our outgoing HTTP calls to our integration partners.

// with Guzzle
$client   = new \GuzzleHttp\Client();
$request  = new \GuzzleHttp\Psr7\Request('GET', 'http://httpbin.org/get');
$response = $client->send($request);
return $response->getBody()->getContents();

// with Laravel HTTP Client
return Http::get('http://httpbin.org/get');

To prevent PHP from blocking the process for too long and in an attempt to speed up the processing of the webhook callbacks in the queue, one might be tempted to set a short timeout on the HTTP request.

cURL, and thus Guzzle and Laravel’s HTTP Client, have two options for setting the timeout of a request. One is connect_timeout, which sets how long the request is allowed to wait for a connection to be established.

The other one is simply called timeout, or --max-time in cURL, and defines how long the whole request is allowed to take from start to finish. If this timeout is reached, the connection is terminated even if it’s still transferring data.

// with Guzzle
$client   = new \GuzzleHttp\Client();
$request  = new \GuzzleHttp\Psr7\Request('GET', 'http://httpbin.org/get');
$response = $client->send($request, ['timeout' => 20, 'connect_timeout' => 10]);
return $response->getBody()->getContents();

// with Laravel HTTP Client
return Http::timeout(20)->connectTimeout(10)->get('http://httpbin.org/get');

Forcefully terminating the request causes issues because if the request body hasn’t been fully received by the server before the timeout is reached, the server will discard the whole request, meaning that it never reaches the application.

Once the timeout is reached, Guzzle or Laravel’s HTTP Client will throw an exception like cURL error 28: Resolving timed out after 20 seconds. If the receiving server is behind a reverse proxy, it will log a 499 Client Closed Request error; if you’re accessing a PHP server directly, it will log an error along the lines of Invalid request (Unexpected EOF).

When sending HTTP requests, you have to wait for the response to be sure that the request body was delivered successfully, and you have to check the response status code to know that the server processed it properly.
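
To illustrate, this is roughly how such a timeout surfaces when using Laravel’s HTTP Client; the URL and timeout values are only placeholders.

use Illuminate\Http\Client\ConnectionException;
use Illuminate\Support\Facades\Http;

try {
    $response = Http::timeout(20)->connectTimeout(10)->get('http://httpbin.org/get');

    if (! $response->successful()) {
        // the request reached the server, but it wasn't processed properly
    }
} catch (ConnectionException $e) {
    // timed out or failed to connect; we can't know whether the request
    // ever reached the application on the other end
}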

HTTP Promises

Guzzle, and thus also Laravel’s HTTP Client, offers asynchronous and concurrent HTTP requests through promises.

$client  = new \GuzzleHttp\Client();
$request = new \GuzzleHttp\Psr7\Request('GET', 'http://httpbin.org/get');
$promise = $client->sendAsync($request)->then(function ($response) {
    return $response->getBody()->getContents();
});
// ...
// request not sent yet, do some other stuff
// ...
// then send request
$response = $promise->wait();

// or with Laravel's HTTP Client
$promise = Http::async()->get('http://httpbin.org/get');
// ...
// request not sent yet, do some other stuff
// ...
// then send request
$response = $promise->wait()->getBody()->getContents();

Promises are similar to normal requests in that they block the process while waiting for an HTTP response once the ->wait() method is called. They’re different in that you can continue execution after creating the request, since it won’t be sent until you call ->wait(). This means you can create multiple requests, or promises, and later send them out in parallel.

Http::pool(fn (Pool $pool) => [
    $pool->get('http://httpbin.org/delay/1'),
    $pool->get('http://httpbin.org/delay/3'),
    $pool->get('http://httpbin.org/delay/5'),
    $pool->get('http://httpbin.org/delay/2')
]);

If you have scheduled requests that you can predict and batch together, this is a great way to get good concurrency: if you send 5 requests and 1 of them is slow, the other 4 don’t have to wait for the slow one before they get sent out.
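
The pool returns the responses in the same order the requests were added, so checking the results afterwards could look something like this (a small sketch assuming all requests complete).

use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

$responses = Http::pool(fn (Pool $pool) => [
    $pool->get('http://httpbin.org/delay/1'),
    $pool->get('http://httpbin.org/delay/3'),
]);

// total wall time is roughly that of the slowest request, not the sum of all of them
$responses[0]->ok();
$responses[1]->status();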

In our case, however, with a steady flow of requests that needed to go out in real time, we couldn’t buffer requests like this and wait until a certain number had accumulated before sending them, since that would add latency to our webhook callbacks.

HTTP in Forked Processes

Even with promises, PHP blocks execution while waiting for a response. But if we move the request out of PHP and execute it as a cURL command directly on the server in a forked process, PHP is no longer involved, and we should be able to send out requests much more efficiently.

The server should be able to handle many more requests simultaneously, since a cURL process only uses a minimal amount of memory and doesn’t strain the CPU while waiting for a response. A very basic implementation would look something like this.

// "-o /dev/null -s" discards the response body and silences progress output
$curl = 'curl http://httpbin.org/get -o /dev/null -s';
// "&" forks the command into the background and "echo $!" returns its PID
exec($curl . ' > /dev/null 2>&1 & echo $!');

There is one issue with this: you won’t get any feedback in PHP on what happened to the request. So I decided to write out some of the attributes from the cURL command and pipe those into a Laravel Artisan command to keep track of the request’s outcome.

$curl    = "curl -w '%{exitcode} %{response_code}' http://httpbin.org/get -o /dev/null -s";
$artisan = 'xargs php artisan name-of-command';
exec('(' . $curl . ' | ' . $artisan . ') > /dev/null 2>&1 & echo $!');
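
The Artisan command on the receiving end isn’t shown above. A minimal sketch of what it could look like follows; the class name and signature are hypothetical, the point is simply that xargs passes the cURL exit code and HTTP status code along as arguments.

use Illuminate\Console\Command;

class RecordBackgroundRequestResult extends Command
{
    // xargs appends the two values written out by cURL as arguments
    protected $signature = 'name-of-command {exitcode} {response_code}';

    protected $description = 'Record the outcome of a background cURL request';

    public function handle(): void
    {
        $exitCode   = (int) $this->argument('exitcode');
        $statusCode = (int) $this->argument('response_code');

        if ($exitCode !== 0) {
            // cURL itself failed, e.g. a timeout or connection error
            logger()->warning('Background request failed', ['exit_code' => $exitCode]);

            return;
        }

        // the request went out; the status code tells us how the server handled it
        logger()->info('Background request completed', ['status' => $statusCode]);
    }
}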

After adapting this logic to fit nicely into a Laravel app, I did some rigorous testing. I found that 10 concurrent requests, which is more than enough for our use case, peaked at about 40 MB of memory, roughly 4 MB per process. I also did some stress testing with 100 concurrent requests, which wasn’t a problem for the CPU at all, but peak memory usage reached about 400 MB while processing the requests.

Once I was done with my proof of concept and had done a lot of testing, I had a fair amount of code that I thought might be useful for other developers as well, so I decided to refine the logic and make it into a Laravel Package.

Meet Laravel HTTP Background

The package that I put together is available [here]. It implements the concept described above, running requests in forked cURL processes directly on the server, complete with various options that are all covered by multiple test cases and documented in the GitHub repository’s README.md file.

// use it directly
HttpBg::get('http://httpbin.org/get');

// or through a macro registered on the Laravel HTTP Client
Http::background()->get('http://httpbin.org/get');

It’s possible to track the progress of a request through these events, emitted during the request lifecycle.

Event::listen(function (HttpBgRequestSending $event) {});
Event::listen(function (HttpBgRequestSent $event) {});
Event::listen(function (HttpBgRequestSuccess $event) {});
Event::listen(function (HttpBgRequestFailed $event) {});
Event::listen(function (HttpBgRequestTimeout $event) {});
Event::listen(function (HttpBgRequestComplete $event) {});

These events can also be used to track failed attempts, manage retries, and send notifications about failed and timed-out requests. A simple implementation could look something like this.

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        Event::listen(function (HttpBgRequestSent $event) {
            $request = $event->request;
            Cache::put($request->id, $request->toArray());
        });

        Event::listen(function (HttpBgRequestFailed|HttpBgRequestTimeout $event) {
            $requestId = $event->requestId;
            $request   = HttpBgRequest::newFromArray(
                Cache::get($requestId, [])
            );

            if (! $request->validateRequest()) {
                return;
            }
            if ($request->maxAttempts > $request->attempts) {
                HttpBg::send($request);
                return;
            }
            Cache::forget($requestId);
            Mail::to('integration@partner.com')
                ->send(new FailedWebHookMail($request->tag));
        });
    }
}

That’s it for this time. Please check out the whole repository [here] and contribute if you’d like to see new features or if you find any bugs or issues.

I hope this package will be of use not only to me but also to other developers out there struggling to handle lots of outgoing HTTP requests efficiently in their applications.

Until next time, have a good one!