Tags: php, arrays, multidimensional-array, spl, arrayiterator

PHP optimization: big nested array vs an SPL class


I have a project in CodeIgniter, but that isn't relevant here.

I have to search for cars on several external servers using search criteria like sell zone, price, model, and so on.

I send a query in XML to each external server, which responds with XML. I have created a model for each provider (external server).

In my controller I pass the search criteria to each model, and each model returns me a nested array like this:

$cars = array(
   'car'      => array('id' => '', 'name' => '', 'year' => '', 'color' => '' /* ... */),
   'optional' => array('car_id' => '', 'key' => '' /* ... */),
   'variant'  => array('car_id' => '', 'key' => '' /* ... */),
);

The main array contains three arrays, each with many keys and values; each optional refers to a car through car_id, and so does each variant.
When each model returns such an array, I merge it into the standard array I have created, like this:

$car_array = array();
$car_provider1 = $this->provider1->getCar();
$car_provider2 = $this->provider2->getCar();
// array_merge_recursive() returns the merged array; it does not modify
// its arguments in place, so the result must be assigned back.
$car_array = array_merge_recursive($car_array, $car_provider1, $car_provider2);

Now I have one big standard array, with the three arrays inside, holding all the cars from my providers. It can become very large, depending on the search criteria.

Well, my question is this: with a big nested final array like mine, can I run into memory or speed problems?

Is it better to use an SPL class, like ArrayIterator or something similar? I have to pass the final array to the controller and then on to my view, where I transform each array into JSON before rendering the page, because I use Backbone to display everything.

But my question is: is it better to have one big nested array, or is there some SPL class to manage all this data? (I only need a container to store the data, transform it into JSON afterwards, and run some checks, like: delete a duplicate car, drop an optional I don't want, or search inside the data.)
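For the checks mentioned above, plain array functions are usually enough. Here is a minimal sketch (the sample data and the 'navi' filter are made up for illustration) that deduplicates cars by id, filters out unwanted optionals, and encodes the result to JSON:

```php
<?php

// Hypothetical merged result, in the structure described above.
$cars = array(
    'car' => array(
        array('id' => 1, 'name' => 'Punto', 'year' => 2010),
        array('id' => 2, 'name' => 'Golf',  'year' => 2012),
        array('id' => 1, 'name' => 'Punto', 'year' => 2010), // duplicate
    ),
    'optional' => array(
        array('car_id' => 1, 'key' => 'airco'),
        array('car_id' => 2, 'key' => 'navi'),
    ),
);

// Deduplicate by re-indexing on 'id': later entries with the same id
// simply overwrite earlier ones.
$unique = array();
foreach ($cars['car'] as $car) {
    $unique[$car['id']] = $car;
}
$cars['car'] = array_values($unique);

// Drop optionals you don't want with array_filter().
$cars['optional'] = array_values(array_filter(
    $cars['optional'],
    function ($opt) { return $opt['key'] !== 'navi'; }
));

// Plain arrays encode straight to JSON for Backbone.
$json = json_encode($cars);
```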

Is a big array a good choice for my goal?


Solution

  • Maybe I don't understand your situation fully, but I don't see the need for array_merge_recursive().

    Suppose you have data like this:

    <?php
    
    $carsResponse1 = array(
        array(
            'make' => 'Ford',
            'price' => 3500
        ),
        array(
            'make' => 'GM',
            'price' => 4900
        )
    );
    
    $carsResponse2 = array(
        array(
            'make' => 'Acura',
            'price' => 2300
        ),
        array(
            'make' => 'Toyota',
            'price' => 3900
        )
    );
    
    $combined = array_merge($carsResponse1, $carsResponse2);
    
    print_r($combined);
    

    Now, running that yields:

    Array
    (
        [0] => Array
            (
                [make] => Ford
                [price] => 3500
            )
    
        [1] => Array
            (
                [make] => GM
                [price] => 4900
            )
    
        [2] => Array
            (
                [make] => Acura
                [price] => 2300
            )
    
        [3] => Array
            (
                [make] => Toyota
                [price] => 3900
            )
    
    )
    

    So the non-recursive version of array merge, array_merge(), should fit your needs.

    As for the answer itself:

    I don't see a problem with using arrays of multiple dimensions. Wrapping your data in an SPL class such as ArrayIterator only means that objects of that class can be used with PHP's array syntax and iteration features.

    Whatever such a class does internally would almost certainly be implemented on top of PHP's arrays anyway.
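    To illustrate the point, ArrayIterator is little more than a thin object wrapper around a plain array (sample data made up here):

```php
<?php

$cars = array(
    array('make' => 'Ford', 'price' => 3500),
    array('make' => 'GM',   'price' => 4900),
);

// ArrayIterator just wraps the array; iteration works the same way
// as with the plain array itself.
$iterator = new ArrayIterator($cars);

foreach ($iterator as $car) {
    echo $car['make'], "\n";
}

// The underlying plain array is always recoverable.
$plain = $iterator->getArrayCopy();
```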

    Using "plain" arrays wisely most likely performs CPU- and RAM-wise better than any fancier over-engineered custom-built solution.

    If the data set you are processing is not massive (thousands to tens of thousands of records), I don't see a problem with storing it all in an array, if you truly have to process it all at once.

    If it becomes a problem, you could look into pagination, so that you only process a small amount of data at a time.
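    Even if the providers return everything at once, paginating on your side is a one-liner with array_slice(). A minimal sketch (record count and page size are made up):

```php
<?php

// Stand-in for a large merged result set of 100 car records.
$cars = range(1, 100);

$perPage = 20;
$page    = 2; // 1-based page number requested by the client

// Take only the slice for the requested page.
$offset   = ($page - 1) * $perPage;
$pageData = array_slice($cars, $offset, $perPage);
```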

    Actually, designing your service for pagination from the start, instead of processing all the data at once, would be the best thing to do; but since you didn't mention it, I'm assuming your external services can't handle pagination.