c++ time-complexity complexity-theory

How to figure out time complexity of complex programs


For example:

#include <iostream>
using namespace std;

int main() {
    int n, s = 0, i, j, k;

    cin >> n;

    for (i = 1; i <= n * n; i++) {
        for (j = 1; j <= i / 2; j++) {
            s += i + j;
        }

        k = 1;
        while (k < j) {
            s += k;
            k += 2;
        }
    }

    cout << s << endl;
    return 0;
}

How do I approach this?

So, since the outer loop alone runs n^2 times, the time complexity should be larger than O(n^2), but this is where I get stuck. Any help would be appreciated.


Solution

  • Let's compare the two inner loops. Each loop body runs in O(1) time (assuming arithmetic operations take constant time). After the first inner loop has completed, j is equal to i/2 + 1. Since the second loop increases k in steps of 2, it performs about half as many iterations as the first loop did. In terms of asymptotic behaviour they therefore represent the same time complexity (a constant coefficient is not significant), so we can ignore the second inner loop.

    The first inner loop has this number of iterations:

      0 + 1 + 1 + 2 + 2 + 3 + 3 + ... + (n^2 - 1)/2 + n^2/2

    If n is odd (note that / here denotes integer division, as in the code), this is the same as:

      2(0 + 1 + 2 + ... + n^2/2)

    If n is even, the final term n^2/2 is not equal to its predecessor (n^2 - 1)/2, so its value occurs only once, not twice like the other values:

      2(0 + 1 + 2 + ... + n^2/2) - n^2/2

    The sum in parentheses, i.e. (0 + 1 + 2 + ... + n^2/2), is a triangular number:

      (n^2/2)(n^2/2 + 1)/2

    When multiplied by 2, the division by 2 can be left out. For odd n (and thus odd n^2) we get:

      (n^2/2)(n^2/2 + 1) = n^4/4 + n^2/2

    For even n we get:

      n^4/4

    In either case the complexity is thus O(n^4).