I have an API that I call at least 10 times concurrently, each call with different data.
This is the code I am currently using:
$mh = curl_multi_init();
$arr = array();
$rows = array();

// Create one curl handle per row and register it with the multi handle.
while ($row = mysqli_fetch_array($query)) {
    array_push($arr, initiate_curl($row, $mh));
    array_push($rows, $row);
}

// Drive all handles until no transfer is still running.
$running = null;
for (;;) {
    curl_multi_exec($mh, $running);
    if (!$running) {
        break;
    }
    curl_multi_select($mh);
    usleep(1);
}
sleep(1);

foreach ($arr as $curl) {
    curl_multi_remove_handle($mh, $curl);
}
curl_multi_close($mh);

// Read back each response and process it.
foreach ($arr as $key => $curl) {
    $result = curl_multi_getcontent($curl);
    $dat = simplexml_load_string($result);
    check_time($dat, $rows[$key], $fp);
}
It works fine when the number of requests is small, but as the number grows, some of the curl handles do not bring back the expected data, i.e. they return null. My guess is that the script moves on before those requests have finished.
What can I do to make this work? I am inexperienced with PHP and server administration, and I am having a hard time working through the documentation.
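Is there a way to at least see why those handles come back null? From the docs I am assuming I could inspect each handle once the loop finishes, along these lines (initiate_curl is my own helper; it sets CURLOPT_RETURNTRANSFER, which is why curl_multi_getcontent works):

// Sketch: after the multi loop has finished, check each easy handle
// to see which requests failed and why (timeout, connection error, ...).
foreach ($arr as $key => $curl) {
    $err  = curl_error($curl);                        // non-empty string on failure
    $code = curl_getinfo($curl, CURLINFO_HTTP_CODE);  // 0 if no response was received
    if ($err !== '' || $code === 0 || $code >= 400) {
        echo "request $key failed (HTTP $code): $err\n";
    }
}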
If I create another PHP file that curls the API and processes the data, and then multi-curl that PHP file instead, would that work better? (In that case it would matter less if some of the calls did not return data.) Or would that constantly overload my server?
This is not a curl issue; it is a server utilisation issue, so try upgrading your server.
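If upgrading is not an option right away, you can also lower peak utilisation by running the requests in smaller batches instead of firing them all at once. A rough sketch reusing your initiate_curl, check_time, $query and $fp (a batch size of 5 is purely illustrative, and run_multi is a hypothetical helper wrapping the loop sketched after the note below):

$batchSize = 5; // illustrative value; tune to what your server and the API tolerate
$allRows = array();
while ($row = mysqli_fetch_array($query)) {
    $allRows[] = $row;
}

foreach (array_chunk($allRows, $batchSize, true) as $batch) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($batch as $key => $row) {
        $handles[$key] = initiate_curl($row, $mh);
    }

    run_multi($mh); // hypothetical helper; see the loop sketch below the note

    foreach ($handles as $key => $curl) {
        $dat = simplexml_load_string(curl_multi_getcontent($curl));
        check_time($dat, $batch[$key], $fp);
        curl_multi_remove_handle($mh, $curl);
    }
    curl_multi_close($mh);
}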
Note: be careful when using an infinite loop [for(;;)], and monitor CPU utilisation regularly.
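For example, letting curl_multi_select() block until there is socket activity keeps the CPU idle while waiting, unlike spinning in for(;;) with usleep(1). A minimal sketch of the run_multi helper used in the batch example above:

function run_multi($mh) {
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        // Block until a transfer has activity; -1 means select failed,
        // so back off briefly instead of turning this into a busy loop.
        if ($running && curl_multi_select($mh) === -1) {
            usleep(100000);
        }
    } while ($running);
}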