javascript, time-complexity, complexity-theory, space-complexity

Does overwriting an existing array with a new array cost extra time or memory in the context of complexity analysis?


Suppose we have a function that takes as a parameter a sorted array of integers and returns a new array containing the numbers that appear most often in the input array; in other words, the function returns the mode of the array:

// Example input: [1, 2, 2, 3, 3, 3, 4]
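// Expected output: [3] (the value 3 appears most often)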
function getArrayMode(array) {
    let modes = [];
    let currentStreak = 0;
    let bestStreak = 0;
    let previousNumber = null;
    for (const number of array) {
        // Extend the current streak on a repeat, otherwise start a new one
        if (number === previousNumber) currentStreak++;
        else currentStreak = 1;
        if (currentStreak === bestStreak) {
            modes.push(number); // ties the best streak: record another mode
        } else if (currentStreak > bestStreak) {
            modes = [number]; // What impact does this have?
            bestStreak = currentStreak;
        }
        previousNumber = number;
    }
    return modes;
}

I believe the time complexity of this function is O(n), and the space complexity O(1) when not considering the output array. But does reassigning a new array modes = [number] actually affect the complexity analysis in JavaScript? Would things change if we did count the output array? What is happening behind the scenes when we do this to an array?


Solution

  • does reassigning a new array modes = [number] actually affect the complexity analysis in JavaScript? What is happening behind the scenes when we do this to an array?

    The array that modes referenced before this assignment is orphaned by the assignment and thus becomes subject to garbage collection. When analysing space complexity we may ignore such orphaned memory. We'd arrive at the same space complexity analysis if we had done this instead:

    modes.length = 0; // Clear the array
    modes.push(number);
    

    We should thus consider the maximum size of any array that modes references during the execution of this code.
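
    To make this concrete, here is a sketch of the same function, instrumented with a peakSize counter (a helper variable introduced here purely for illustration) that records the largest length modes ever reaches:

    function getArrayModeInstrumented(array) {
        let modes = [];
        let currentStreak = 0;
        let bestStreak = 0;
        let previousNumber = null;
        let peakSize = 0; // largest modes.length observed so far
        for (const number of array) {
            if (number === previousNumber) currentStreak++;
            else currentStreak = 1;
            if (currentStreak === bestStreak) {
                modes.push(number);
            } else if (currentStreak > bestStreak) {
                modes = [number]; // the previous array becomes garbage-collectable
                bestStreak = currentStreak;
            }
            peakSize = Math.max(peakSize, modes.length);
            previousNumber = number;
        }
        console.log(`peak auxiliary size: ${peakSize}, output size: ${modes.length}`);
        return modes;
    }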

  • Would things change if we did count the output array?

    Yes, but it is unusual not to count the output array in the analysis: even though it is output, it is still space that the algorithm must allocate, and thus it has a memory impact that you want to be aware of. Not taking that memory into account makes little sense.

    Also realise that the memory used by modes may in some cases be more than the memory needed for the output. Take this example input:

    [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 8]
    

    The algorithm will populate modes until it contains [1, 2, 3, 4, 5, 6, 7] (and 8 is briefly pushed as well when its streak ties the best), but then modes is reassigned a new array, [8], which will be the final result.
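
    Running the instrumented sketch from above on this input shows the gap between the peak auxiliary allocation and the output:

    getArrayModeInstrumented([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 8]);
    // peak auxiliary size: 8, output size: 1 -- returns [8]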

    The algorithm allocates memory for storing the intermediate array, but needs (much) less for the output. In general there are "best" cases (in terms of output size) where the space complexity of the output is O(1), but the auxiliary space is O(n). For these cases we cannot conclude that the space complexity -- when not considering the output array -- is O(1).

    The standard way to analyse space complexity is to treat the memory allocated for representing the output no differently from any other memory the algorithm allocates.

    With that in mind, the worst-case space complexity is O(n), while the best-case space complexity is O(1) -- for example, when the frequencies of all input values are different and the input is sorted.
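
    To illustrate such a best case with the instrumented sketch above: with an input whose values all have different frequencies, modes stays small throughout:

    getArrayModeInstrumented([1, 2, 2, 3, 3, 3]);
    // peak auxiliary size: 2, output size: 1 -- returns [3]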