Tags: php, json, curl-multi

Pulling data from API, memory growth


I'm working on a project where I pull data (JSON) from an API. The problem I'm having is that memory usage slowly grows until I get the dreaded fatal error:

Fatal error: Allowed memory size of * bytes exhausted (tried to allocate * bytes) in C:... on line *

I don't think there should be any memory growth. I tried unsetting everything at the end of the loop, but it made no difference. So my questions are: am I doing something wrong? Is this normal? What can I do to fix it?

<?php

$start = microtime(true);

$time = microtime(true) - $start;
echo "Start: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br/>";

include ('start.php');
include ('connect.php');

set_time_limit(0);

$api_key = 'API-KEY';
$tier = 'Platinum';
$threads = 10; //number of urls called simultaneously

function multiRequest($urls, $start) {

    $time = microtime(true) - $start;
    echo "&nbsp;&nbsp;&nbsp;start function: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";

    $nbrURLS = count($urls); // number of urls in array $urls
    $ch = array(); // array of curl handles
    $result = array(); // data to be returned

    $mh = curl_multi_init(); // create a multi handle 

    $time = microtime(true) - $start;
    echo "&nbsp;&nbsp;&nbsp;Creation multi handle: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";

    // set URL and other appropriate options
    for($i = 0; $i < $nbrURLS; $i++) {
        $ch[$i]=curl_init();

        curl_setopt($ch[$i], CURLOPT_URL, $urls[$i]);
        curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, 1); // return data as string
        curl_setopt($ch[$i], CURLOPT_SSL_VERIFYPEER, 0); // don't verify the peer's SSL certificate

        curl_multi_add_handle ($mh, $ch[$i]); // Add a normal cURL handle to a cURL multi handle
    }

    $time = microtime(true) - $start;
    echo "&nbsp;&nbsp;&nbsp;For loop options: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";

    // execute the handles
    do {
        $mrc = curl_multi_exec($mh, $active);          
        curl_multi_select($mh, 0.1); // without this, we will busy-loop here and use 100% CPU
    } while ($active);

    $time = microtime(true) - $start;
    echo "&nbsp;&nbsp;&nbsp;Execution: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";

    echo '&nbsp;&nbsp;&nbsp;For loop2<br>';

    // get content and remove handles
    for($i = 0; $i < $nbrURLS; $i++) {

        $error = curl_getinfo($ch[$i], CURLINFO_HTTP_CODE); // Last received HTTP code 

        echo "&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;error: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";

        //error handling if not 200 ok code
        if($error != 200){

            if($error == 429 || $error == 500 || $error == 503 || $error == 504){
                echo "Again error: $error<br>";
                $result['again'][] = $urls[$i];

            } else {
                echo "Error error: $error<br>";
                $result['errors'][] = array("Url" => $urls[$i], "errornbr" => $error);
            }

        } else {
            $result['json'][] = curl_multi_getcontent($ch[$i]);

            echo "&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Content: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";
        }

        curl_multi_remove_handle($mh, $ch[$i]);
        curl_close($ch[$i]);
    }

    $time = microtime(true) - $start;
    echo "&nbsp;&nbsp;&nbsp; after loop2: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br>";

    curl_multi_close($mh);

    return $result;
}


$gamesId = mysqli_query($connect, "SELECT gameId FROM `games` WHERE `region` = 'EUW1' AND `tier` = '$tier' LIMIT 20");
$urls = array();

while($result = mysqli_fetch_array($gamesId))
{
    $urls[] = 'https://euw.api.pvp.net/api/lol/euw/v2.2/match/' . $result['gameId'] . '?includeTimeline=true&api_key=' . $api_key;
}

$time = microtime(true) - $start;
echo "After URL array: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br/>";

$x = 1; //number of loops

while($urls){ 

    $chunk = array_splice($urls, 0, $threads); // take the first chunk ($threads) of all urls

    $time = microtime(true) - $start;
    echo "<br>After chunk: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br/>";

    $result = multiRequest($chunk, $start); // Get json

    unset($chunk);

    $nbrComplete = count($result['json']); //number of returned json strings

    echo 'For loop: <br/>';

    for($y = 0; $y < $nbrComplete; $y++){
        // parse the json
        $decoded = json_decode($result['json'][$y], true);

        $time = microtime(true) - $start;
        echo "&nbsp;&nbsp;&nbsp;Decode: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "<br/>";


    }

    unset($nbrComplete);
    unset($decoded);

    $time = microtime(true) - $start;
    echo $x . ": ". memory_get_peak_usage(true) . " | " . $time . "<br>";

    // reuse urls
    if(isset($result['again'])){
        $urls = array_merge($urls, $result['again']);
        unset($result['again']);
    }

    unset($result);
    unset($time);

    sleep(15); // limit the request rate

    $x++;
}

include ('end.php');

?>

PHP Version 5.3.9 - 100 loops:

loop: memory | time (sec)
1: 5505024 | 0.98330211639404
3: 6291456 | 33.190237045288
65: 6553600 | 1032.1401019096
73: 6815744 | 1160.4345710278
75: 7077888 | 1192.6274609566
100: 7077888 | 1595.2397520542

EDIT:
After trying it with PHP 5.6.14 (XAMPP on Windows):

loop: memory | time (sec)
1: 5505024 | 1.0365679264069
3: 6291456 | 33.604479074478
60: 6553600 | 945.90159296989
62: 6815744 | 977.82566595078
93: 7077888 | 1474.5941500664
94: 7340032 | 1490.6698410511
100: 7340032 | 1587.2434458733

EDIT2: I only see the memory increase after json_decode

Start: 262144 | 135448
After URL array: 262144 | 151984
After chunk: 262144 | 152272
   start function: 262144 | 152464
   Creation multi handle: 262144 | 152816
   For loop options: 262144 | 161424
   Execution: 3145728 | 1943472
   For loop2
      error: 3145728 | 1943520
      Content: 3145728 | 2095056
      error: 3145728 | 1938952
      Content: 3145728 | 2131992
      error: 3145728 | 1938072
      Content: 3145728 | 2135424
      error: 3145728 | 1933288
      Content: 3145728 | 2062312
      error: 3145728 | 1928504
      Content: 3145728 | 2124360
      error: 3145728 | 1923720
      Content: 3145728 | 2089768
      error: 3145728 | 1918936
      Content: 3145728 | 2100768
      error: 3145728 | 1914152
      Content: 3145728 | 2089272
      error: 3145728 | 1909368
      Content: 3145728 | 2067184
      error: 3145728 | 1904616
      Content: 3145728 | 2102976
    after loop2: 3145728 | 1899824
For loop: 
   Decode: 3670016 | 2962208
   Decode: 4980736 | 3241232
   Decode: 5242880 | 3273808
   Decode: 5242880 | 2802024
   Decode: 5242880 | 3258152
   Decode: 5242880 | 3057816
   Decode: 5242880 | 3169160
   Decode: 5242880 | 3122360
   Decode: 5242880 | 3004216
   Decode: 5242880 | 3277304

Solution

  • I tested your script on 10 URLs. I removed all of your debug output except one echo at the end of the script and one in the problem loop where json_decode is used. I also opened one of the pages you fetch from the API: it decodes to a very large array, so I think you're right that the issue is with json_decode.

    Results and fixes.

    Result without changes:

    Code:

    for($y = 0; $y < $nbrComplete; $y++){
       $decoded = json_decode($result['json'][$y], true);
       $time = microtime(true) - $start;
       echo "Decode: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "\n";
    }
    

    Result:

    Decode: 3407872 | 2947584
    Decode: 3932160 | 2183872
    Decode: 3932160 | 2491440
    Decode: 4980736 | 3291288
    Decode: 6291456 | 3835848
    Decode: 6291456 | 2676760
    Decode: 6291456 | 4249376
    Decode: 6291456 | 2832080
    Decode: 6291456 | 4081888
    Decode: 6291456 | 3214112
    Decode: 6291456 | 244400
    

    Result with unset($decoded):

    Code:

    for($y = 0; $y < $nbrComplete; $y++){
       $decoded = json_decode($result['json'][$y], true);
       unset($decoded);
       $time = microtime(true) - $start;
       echo "Decode: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "\n";
    }
    

    Result:

    Decode: 3407872 | 1573296
    Decode: 3407872 | 1573296
    Decode: 3407872 | 1573296
    Decode: 3932160 | 1573296
    Decode: 4456448 | 1573296
    Decode: 4456448 | 1573296
    Decode: 4980736 | 1573296
    Decode: 4980736 | 1573296
    Decode: 4980736 | 1573296
    Decode: 4980736 | 1573296
    Decode: 4980736 | 244448
    

    Also you can add gc_collect_cycles:

    Code:

    for($y = 0; $y < $nbrComplete; $y++){
       $decoded = json_decode($result['json'][$y], true);
       unset($decoded);
       gc_collect_cycles();
       $time = microtime(true) - $start;
       echo "Decode: ". memory_get_peak_usage(true) . " | " . memory_get_usage() . "\n";
    }
    

    It can help in some cases, but as the results show, it can also degrade performance.

    Try rerunning your script with unset, and then with unset + gc_collect_cycles, and report back if you still see the same issue after these changes.

    Also, I don't see where you actually use the $decoded variable. If that's a mistake in the code, you can simply remove the json_decode call. :)
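
    Putting the fixes together, here is a minimal sketch of the processing loop. It assumes you do something with each decoded match (the saveMatch() callback here is hypothetical, standing in for your real database insert); the point is to decode, use, and free one item at a time so only one decoded array is ever held in memory:

    ```php
    <?php
    // Hypothetical sketch: process each JSON string as soon as it is decoded,
    // then release it, instead of keeping all decoded arrays around.
    function processResults(array $jsonStrings, $saveMatch = null) {
        $processed = 0;
        foreach ($jsonStrings as $json) {
            $decoded = json_decode($json, true);
            if ($decoded === null) {
                continue; // skip invalid JSON
            }
            if ($saveMatch !== null) {
                $saveMatch($decoded); // e.g. insert into the database
            }
            $processed++;
            unset($decoded);   // free the decoded array immediately
        }
        gc_collect_cycles();   // optional: reclaim cyclic garbage once per chunk
        return $processed;
    }

    echo processResults(array('{"a":1}', 'not json', '{"b":2}')) . "\n";
    ```

    With this structure, each chunk's memory is released before the next chunk is fetched, so usage should stay flat across loops instead of creeping up.
    
    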