php postgresql pdo

PHP PDO select query failing on a large number of records


I have a PostgreSQL database and I need to query a table that contains 20 million rows. I then write the result set to a CSV file and finally push that file to S3. I wrote a PHP script to do this. When I limit the query to 6 million rows it succeeds, but anything above that simply fails: no error messages are logged or displayed even with all error checking and reporting enabled. I also increased the memory on my CentOS instance to 3 GB, with no luck. My code follows:

//connect to database
$myPDO = new PDO('pgsql:host=127.0.0.1;port=5433;dbname=test', 'test', 'test');
$myPDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_WARNING);

//time and memory usage variables
error_reporting(-1);
$time = microtime(TRUE);
$mem = memory_get_usage();

$stmt = $myPDO->prepare("SELECT * FROM table"); 
$stmt->execute(); 

$u = "export.csv";

    $h = fopen($u, "w+");

     $first = true;

     //counter
     $counter = 0;

//loop through all rows
     while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
     $counter++; 

//export column header
     if ($first) {
         fputcsv($h, array_keys($row));
         $first = false;
     }
         //export content
         fputcsv($h, $row);
     }

     fclose($h);

require('AwsS3Upload.php');

//pdo error output
if (!$stmt) {
    echo "\nPDO::errorInfo():\n";
    print_r($myPDO->errorInfo());
}
print_r(array('memory' => (memory_get_usage() - $mem) / (1024 * 1024), 'seconds' => microtime(TRUE) - $time));
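
Note: for a plain SELECT the pgsql PDO driver buffers the entire result set in client memory, which is a likely reason the process dies silently once the table is large enough - an out-of-memory kill leaves nothing in the PHP error log. A minimal sketch of an alternative that streams the rows through a server-side cursor in fixed-size batches follows; the cursor name, batch size, and quoted table name are placeholders, not part of the original script.

//sketch only: server-side cursor, so the driver never holds more than one batch in memory
$myPDO = new PDO('pgsql:host=127.0.0.1;port=5433;dbname=test', 'test', 'test');
$myPDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$myPDO->beginTransaction();   //cursors only live inside a transaction
$myPDO->exec('DECLARE export_cur CURSOR FOR SELECT * FROM "table"');

$h = fopen('export.csv', 'w+');
$first = true;

do {
    //pull the next batch of rows from the cursor
    $stmt = $myPDO->query('FETCH FORWARD 10000 FROM export_cur');
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        if ($first) {
            fputcsv($h, array_keys($row));
            $first = false;
        }
        fputcsv($h, $row);
    }
} while (count($rows) > 0);

$myPDO->exec('CLOSE export_cur');
$myPDO->commit();
fclose($h);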

Update - working code:

$myPDO = new PDO('pgsql:host=127.0.0.1;port=5433;dbname=test', 'test', 'test');
$myPDO->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_WARNING);
$myPDO->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
$myPDO->setAttribute(PDO::ATTR_STRINGIFY_FETCHES, false);

//time and memory usage variables
error_reporting(-1);
$time = microtime(TRUE);
$mem = memory_get_usage();

for($i=0; $i<6; $i++){
    $limit = 5000000;
    $offset = $i * $limit;

$sql = 'SELECT * FROM table ORDER BY :order LIMIT :limit OFFSET :offset';
    $stmt = $myPDO->prepare($sql);
    $stmt->bindParam(':order', $order, PDO::PARAM_INT);
    $stmt->bindParam(':limit', $limit, PDO::PARAM_INT);
    $stmt->bindParam(':offset', $offset, PDO::PARAM_INT);

    $stmt->execute(); 

$u = "$i-export.csv";
    $h = fopen($u, "w+");
    $first = true;
    //counter
    $counter = 0;

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $counter++; 

        if ($first) {
            fputcsv($h, array_keys($row));
            $first = false;
        }

            fputcsv($h, $row);
        }

        fclose($h);

   require('AwsS3Upload.php');
}
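
AwsS3Upload.php is not shown above; a hypothetical stand-in using the AWS SDK for PHP might look like the sketch below. The bucket name, region, credential handling, and the assumption that $u still holds the path of the CSV just written are illustrative, not taken from the post.

//hypothetical AwsS3Upload.php - bucket, region and credential setup are placeholders
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

//upload the CSV chunk that was just written ($u comes from the calling script)
$s3->putObject([
    'Bucket'     => 'my-export-bucket',
    'Key'        => basename($u),
    'SourceFile' => $u,
]);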

Solution

  • You will have to chunk it into smaller pieces. Something like this:

    for($i=0; $i<6; $i++){
        $limit = 5000000;
        $offset = $i * $limit;
        $stmt = $myPDO->prepare("SELECT * FROM table ORDER BY `id` LIMIT $offset,$limit"); 
        $stmt->execute(); 
    
        // retrieve and write the records into file
    }
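
    A further refinement, not part of the original answer: if the table has an indexed integer id column, keyset pagination avoids re-scanning past ever larger OFFSETs, because each chunk starts right after the last id of the previous one. A minimal sketch under that assumption:

    $lastId = 0;
    $limit  = 5000000;

    $stmt = $myPDO->prepare('SELECT * FROM table WHERE id > :lastId ORDER BY id LIMIT :limit');
    $stmt->bindParam(':lastId', $lastId, PDO::PARAM_INT);   //bound by reference, re-read on every execute
    $stmt->bindParam(':limit', $limit, PDO::PARAM_INT);

    do {
        $stmt->execute();
        $count = 0;

        while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
            $count++;
            $lastId = $row['id'];                            //remember where this chunk ended

            // retrieve and write the record into the current file, as above
        }
    } while ($count > 0);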