Tags: php, laravel, guzzle, guzzle6

How to send files with Guzzle without loading them into memory


I have a form where I can upload multiple files to my Laravel backend, and I want to send all those files with Guzzle to an external API.

I'm having an issue where my script runs out of memory if the uploaded files are larger than the available memory. The error message is:

Allowed memory size of ... bytes exhausted (tried to allocate ... bytes)

Unfortunately, I cannot change the memory limit dynamically.

Here is the code I use:

// in laravel controller method

/* @var \Illuminate\Http\Request $request */
$files = $request->allFiles();

$filesPayload = [];

foreach ($files as $key => $file) {
    $filesPayload[] = [
        'name'     => $key,
        'contents' => file_get_contents($file->path()),
        // 'contents' => fopen($file->path(), 'r'), // memory issue as well
        'filename' => $file->getClientOriginalName(),
    ];
}

$client = new \GuzzleHttp\Client([
    'base_uri' => '...',
]);

$response = $client->post('...', [
    'headers' => [
        'Accept'         => 'application/json',
        'Content-Length' => ''
    ],
    'multipart' =>  $filesPayload,
]);

I'm using Guzzle 6. In the docs I found an example using fopen, but that was also throwing a memory error.

Is there a way to send multiple files with Guzzle without loading them into memory?


Solution

  • I finally managed to make this work by changing

    'contents' => file_get_contents($file->path()),
    

    to

    'contents' => \GuzzleHttp\Psr7\stream_for(fopen($file->path(), 'r')),
    

    With this change, the files were not loaded into memory, and I was able to send bigger files.
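
    For reference, here is a minimal sketch of the full controller method with that change applied. It assumes Guzzle 6, where \GuzzleHttp\Psr7\stream_for() exists (in guzzle/psr7 1.7+ it was deprecated, and in 2.x it was replaced by \GuzzleHttp\Psr7\Utils::streamFor()). The empty Content-Length header from the question is dropped here, since Guzzle can compute the length of the multipart body from the part streams itself:

    // in laravel controller method

    /* @var \Illuminate\Http\Request $request */
    $files = $request->allFiles();

    $filesPayload = [];

    foreach ($files as $key => $file) {
        $filesPayload[] = [
            'name'     => $key,
            // Wrap the file handle in a PSR-7 stream so the body is
            // streamed from disk instead of being read into memory.
            'contents' => \GuzzleHttp\Psr7\stream_for(fopen($file->path(), 'r')),
            'filename' => $file->getClientOriginalName(),
        ];
    }

    $client = new \GuzzleHttp\Client([
        'base_uri' => '...',
    ]);

    $response = $client->post('...', [
        'headers'   => [
            'Accept' => 'application/json',
        ],
        'multipart' => $filesPayload,
    ]);

    This way each file's contents stay on disk until Guzzle reads them out in chunks while writing the request body, so peak memory usage no longer depends on the total size of the uploads.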