Tags: php, wordpress, advanced-custom-fields, repeater

WordPress + ACF + programmatically adding a large amount of data


I have WordPress, a company post type, an ACF repeater field for each company, and a data array in $data. The code below compares the number of rows in the repeater field with the number of rows in the array: if the repeater has fewer or more rows than the array, rows are added or removed accordingly. Then all of these fields are updated with the data from $data.

This code works correctly for me, with one exception: there is a lot of data, and the script does not have enough time to process all companies (the update runs once a day). It gets through the first 5-6 companies in the list and then times out. I'm thinking of working around this by writing the data directly to MySQL, but I haven't been able to make that work. My current code is below:

// Getting the current number of fields in the ACF repeater
$com_array_data = get_field('com_array_data', $company_id);
if (is_array($com_array_data) || $com_array_data instanceof Countable) {
    $count = count($com_array_data);
} else {
    $count = 0;
}
// Getting the number of data rows in array
$data_count = count($data['candles']['data']);

// If the number of rows in the repeater is less than the number of rows in array, add the missing ones to the repeater
if ($count < $data_count) {
    for ($i = $count; $i < $data_count; $i++) {
        add_row('com_array_data', array(
            'open' => '',
            'close' => '',
            'high' => '',
            'low' => '',
            'value' => '',
            'volume' => '',
            'begin' => '',
            'end' => ''
        ), $company_id);
    }
}

// If the number of rows of fields in the repeater is greater than the number of rows in array, we delete the extra ones in the repeater
if ($count > $data_count) {
    for ($i = $count; $i > $data_count; $i--) {
        delete_row('com_array_data', $i, $company_id);
    }
}

// Updating the field values in the ACF repeater
if (have_rows('com_array_data', $company_id)) { // have_rows() expects the field name, not the array returned by get_field()
    while (have_rows('com_array_data', $company_id)) {
        the_row();
        $row_index = get_row_index();
        update_sub_field('open', $data['candles']['data'][$row_index - 1][0], $company_id);
        update_sub_field('close', $data['candles']['data'][$row_index - 1][1], $company_id);
        update_sub_field('high', $data['candles']['data'][$row_index - 1][2], $company_id);
        update_sub_field('low', $data['candles']['data'][$row_index - 1][3], $company_id);
        update_sub_field('value', $data['candles']['data'][$row_index - 1][4], $company_id);
        update_sub_field('volume', $data['candles']['data'][$row_index - 1][5], $company_id);
        update_sub_field('begin', $data['candles']['data'][$row_index - 1][6], $company_id);
        update_sub_field('end', $data['candles']['data'][$row_index - 1][7], $company_id);
    }
}

I managed to improve it a little by rewriting the part that adds the data, and it seems there is no timeout now, but it is still not the best option and it is not very fast; I think it can be improved further.

$values = array();
foreach ($data['candles']['data'] as $row) {
    $values[] = array(
        'open' => $row[0],
        'close' => $row[1],
        'high' => $row[2],
        'low' => $row[3],
        'value' => $row[4],
        'volume' => $row[5],
        'begin' => $row[6],
        'end' => $row[7]
    );
}

// update_field() also creates the rows when the repeater is empty, so no
// separate add_row() branch is needed (add_row() expects the values of a
// single row, not an array of rows).
update_field('com_array_data', $values, $company_id);

I would be grateful for a review of the code and options for improving it, primarily to improve performance.

upd: The second option also ends up timing out: «PHP Fatal error: Maximum execution time of 320 seconds exceeded ... wp-includes/class-wpdb.php»


Solution

  • The best optimization you could make is to run several processes in parallel: divide the companies into groups and run the code for each group concurrently.
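
    One way to sketch that batching, assuming the update runs via WP-Cron (the hook name `my_update_company_batch`, the handler name, and the batch size of 20 below are illustrative, not part of your code):

    ```php
    <?php
    // Hypothetical sketch: fan the daily update out into one scheduled event
    // per batch of companies, instead of one long-running loop. Each cron
    // event then runs as its own request with its own execution-time budget.
    add_action('my_update_company_batch', 'my_update_company_batch_handler');

    function my_schedule_company_batches(array $company_ids) {
        $batches = array_chunk($company_ids, 20); // ~20 companies per event
        foreach ($batches as $i => $batch) {
            // Stagger the events a little so they are picked up as separate runs.
            wp_schedule_single_event(time() + $i * 10, 'my_update_company_batch', array($batch));
        }
    }

    function my_update_company_batch_handler(array $batch) {
        foreach ($batch as $company_id) {
            // ... run your existing repeater-update code for one company ...
        }
    }
    ```

    WP-Cron is only triggered by page visits, so for a reliable once-a-day schedule you would typically also point a real system cron at wp-cron.php.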

    Your code looks pretty clean, but one potential improvement I see is to limit the number of repeated subscripting operations into the nested array. Instead of this:

    update_sub_field('open', $data['candles']['data'][$row_index - 1][0], $company_id);
    ...
    

    do this:

    $row_data = $data['candles']['data'];
    ...
    update_sub_field('open', $row_data[$row_index - 1][0], $company_id);
    ...
    

    There are a lot of automatic optimizations in recent versions of PHP, so this might not make any difference, but it is worth a shot.

    Speaking of which, are you running the latest version of PHP? If not, upgrading might make everything run a bit faster.

    If all else fails, you could increase the script timeout on the server (max_execution_time)...
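
    Since the question mentions writing directly to MySQL: ACF stores each repeater row as ordinary postmeta entries (`com_array_data_{i}_{subfield}` for the value, `_com_array_data_{i}_{subfield}` for the sub field's key, and `com_array_data` for the row count), so the per-row API calls can be replaced by a bulk $wpdb write. This is only a sketch under that assumption; the `field_…` keys below are placeholders that must be replaced with the real keys from your field group export.

    ```php
    <?php
    global $wpdb;

    // Placeholder sub-field keys: replace with the real field_… keys
    // exported from your ACF field group.
    $sub_fields = array(
        'open'   => 'field_open_key',
        'close'  => 'field_close_key',
        'high'   => 'field_high_key',
        'low'    => 'field_low_key',
        'value'  => 'field_value_key',
        'volume' => 'field_volume_key',
        'begin'  => 'field_begin_key',
        'end'    => 'field_end_key',
    );

    $rows   = $data['candles']['data'];
    $values = array();

    foreach ($rows as $i => $row) {
        foreach (array_keys($sub_fields) as $j => $name) {
            $meta_key = "com_array_data_{$i}_{$name}";
            $values[] = $wpdb->prepare('(%d, %s, %s)', $company_id, $meta_key, $row[$j]);
            // ACF's key reference row, needed so get_field() can format the value.
            $values[] = $wpdb->prepare('(%d, %s, %s)', $company_id, "_{$meta_key}", $sub_fields[$name]);
        }
    }

    // Drop all existing repeater row meta for this post, then bulk-insert
    // the new rows in a single statement.
    $wpdb->query($wpdb->prepare(
        "DELETE FROM {$wpdb->postmeta}
         WHERE post_id = %d AND (meta_key LIKE %s OR meta_key LIKE %s)",
        $company_id,
        'com\_array\_data\_%',
        '\_com\_array\_data\_%'
    ));
    if ($values) {
        $wpdb->query(
            "INSERT INTO {$wpdb->postmeta} (post_id, meta_key, meta_value) VALUES "
            . implode(',', $values)
        );
    }

    // The repeater field's own meta value is the row count.
    update_post_meta($company_id, 'com_array_data', count($rows));

    // Direct SQL bypasses WordPress's meta cache, so invalidate it.
    wp_cache_delete($company_id, 'post_meta');
    ```

    This trades ACF's per-row UPDATE queries for one DELETE and one multi-row INSERT per company, which is where most of the time is going, but test it carefully against how your ACF version actually lays out the meta keys before relying on it.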