laravel, laravel-7, maatwebsite-excel

Laravel large CSV import using Maatwebsite Excel


I have a CSV file with 30k rows, and I am using Maatwebsite Excel to import it into my PostgreSQL database.

The problem is that every time, after inserting around 10k-12k rows into the database, the page fails with HTTP ERROR 500:

Error

This page isn’t working

localhost is currently unable to handle this request.
HTTP ERROR 500

I have already changed the following settings in php.ini:

max_execution_time=0
max_input_time=3000
post_max_size=128M

I have tried the following code in my import class (UserReportImport):

namespace App\Imports;

use App\UserReport; // the Eloquent model (App\ is the Laravel 7 default namespace)
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithCustomCsvSettings;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UserReportImport implements ToModel, WithCustomCsvSettings, WithChunkReading, WithHeadingRow
{
    public function model(array $row)
    {
        return new UserReport([
            'user'   => $row['username'],
            'amount' => $row['amount'],
        ]);
    }

    public function getCsvSettings(): array
    {
        return [
            'input_encoding' => 'UTF-8',
        ];
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}

How can I resolve this HTTP 500 error?


Solution

  • The problem is that the application is trying to hold too much data in memory. You are already using chunk reading, but it appears that is not enough.

    To decrease memory consumption while reading the file, try lowering your chunk size.

    To decrease the memory consumption of the models, add batch inserts on top of chunk reading, as in the sketch below.
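
    A minimal sketch of the import class with both suggestions applied. The WithBatchInserts concern and its batchSize() method are part of the Maatwebsite Excel package; the class and model names follow the question, the sizes of 500 are illustrative starting points rather than tuned values, and the getCsvSettings() method from the question is omitted for brevity.

        namespace App\Imports;

        use App\UserReport;
        use Maatwebsite\Excel\Concerns\ToModel;
        use Maatwebsite\Excel\Concerns\WithBatchInserts;
        use Maatwebsite\Excel\Concerns\WithChunkReading;
        use Maatwebsite\Excel\Concerns\WithHeadingRow;

        class UserReportImport implements ToModel, WithBatchInserts, WithChunkReading, WithHeadingRow
        {
            public function model(array $row)
            {
                return new UserReport([
                    'user'   => $row['username'],
                    'amount' => $row['amount'],
                ]);
            }

            // Read 500 rows of the file into memory at a time.
            public function chunkSize(): int
            {
                return 500;
            }

            // Flush accumulated models to the database in a single
            // insert query per 500 rows, instead of one query per row.
            public function batchSize(): int
            {
                return 500;
            }
        }

    Usage is unchanged, e.g. Excel::import(new UserReportImport, $path); with the Maatwebsite\Excel\Facades\Excel facade. If memory is still exhausted, lower both numbers until the import fits within PHP's memory_limit.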