I have a Laravel app backed by a database containing long sequences. I would like to give users the option to download the database in JSON format. To do that, I first fetch the records, then convert them to JSON and write them to a file. However, when I moved the app to the production server, it gave me
You don't have permission to access /saveJson on this server.
From the logs, the error message is:
[2021-01-07 19:08:50] local.ERROR: Allowed memory size of 1887436800 bytes exhausted (tried to allocate 675803754 bytes) {"userId":1,"exception":"[object] (Symfony\\Component\\Debug\\Exception\\FatalErrorException(code: 1): Allowed memory size of 1887436800 bytes exhausted (tried to allocate 675803754 bytes) at /home/xxx/vendor/league/flysystem/src/Util/MimeType.php:205)
[stacktrace]
The memory_limit on the server is already set to -1, and the server was restarted after setting it.
The relevant parts of the code:
DB::connection()->disableQueryLog();
$large_db = Model::with('regions', 'genes.gene_models.aro_categories')
->where('curated', '<>', 4)
->get();
$db_JSON = $large_db->toJSON();
Storage::disk('public')->put("database_$date.json", $db_JSON);
DB::connection()->enableQueryLog();
According to MySQL, the whole database is around 170 MB, and according to meminfo, the server has 1962 MB of memory, so I do not entirely understand why it cannot load the records. Since the database will keep growing, I would also like to implement this in a way that won't crash for similar reasons in the future.
Is there any smarter way to export the database without exceeding the memory limit? Some kind of buffering?
Thank you for any suggestions!
toJSON() converts the model to an array, then uses PHP's native json_encode() to convert the data to JSON. Laravel Docs.
This answer gives an estimate of the memory cost of moving between PHP arrays and JSON with json_encode() and json_decode(). In the answerer's calculations, PHP arrays consumed approximately 9 times as much memory as the JSON-serialised data.
Taking a naive approach to the question, we can assume we'd need about 10x as much memory as the size of the resulting JSON file just to dump the data to a file (if we're going via a PHP array).
170 MiB x 10 is pretty close to your memory limit (the 1887436800 bytes in the log is 1800 MiB). Add the extra overhead of Laravel's model objects, and it makes sense that you're blowing through all your available memory.
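If you want to stay in PHP, you can at least bound peak memory by streaming the export in chunks instead of materialising the whole collection and the whole JSON string at once. A sketch, reusing the model and relations from the question (the chunk size of 500 is an arbitrary value to tune):

```php
// Sketch: write the JSON array incrementally so only one chunk of
// models is in memory at a time. Writes with fopen()/fwrite() rather
// than Storage::put(), because put() would buffer the entire string.
DB::connection()->disableQueryLog();

$path = storage_path("app/public/database_{$date}.json");
$handle = fopen($path, 'w');
fwrite($handle, '[');

$first = true;
Model::with('regions', 'genes.gene_models.aro_categories')
    ->where('curated', '<>', 4)
    // chunkById() pages by primary key, which is safer than chunk()
    // when the query has where clauses.
    ->chunkById(500, function ($models) use ($handle, &$first) {
        foreach ($models as $model) {
            if (!$first) {
                fwrite($handle, ',');
            }
            fwrite($handle, $model->toJson());
            $first = false;
        }
    });

fwrite($handle, ']');
fclose($handle);
```

Note that the stack trace points at Flysystem's MimeType helper, which runs when the full file contents are handed to Storage::put() — another reason to write the file incrementally yourself.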
The best option is to dump the database in a way that skips loading it into PHP objects entirely. I'd suggest writing a MySQL query that writes JSON directly to a file, then serving the resulting JSON file.
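As a sketch of that idea: MySQL (5.7+) can serialise rows to JSON itself with JSON_OBJECT() and write the result server-side with SELECT ... INTO OUTFILE. The table and column names below are illustrative placeholders, not taken from the question:

```php
// Sketch: let MySQL build the JSON and write the file itself, so the
// rows never pass through PHP at all.
// Caveats: INTO OUTFILE writes on the *database server's* filesystem,
// requires the FILE privilege, is restricted by the secure_file_priv
// setting, and refuses to overwrite an existing file.
DB::statement("
    SELECT JSON_OBJECT('id', id, 'name', name, 'curated', curated)
    FROM models
    WHERE curated <> 4
    INTO OUTFILE '/var/lib/mysql-files/database_export.json'
");
```

One difference from the original code: INTO OUTFILE emits one row per line, so this produces newline-delimited JSON objects rather than a single JSON array. If the download must be one array, the JSON objects can be wrapped/joined afterwards, or aggregated with JSON_ARRAYAGG() at the cost of building the whole result in one value.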