Basics of Time Complexity

In this tutorial, we’ll learn how to calculate the time complexity of a function, with plenty of worked examples.

Time Complexity

Time complexity is generally expressed in big-oh notation 𝘖.
If the time complexity of a function is 𝘖(n), that means the function's running time grows in proportion to n — roughly n units of time to execute.

These are the common time complexity classes you will come across:

Type          Name
𝘖(1)          Constant
𝘖(log₂n)      Logarithmic
𝘖(√n)         Root
𝘖(n)          Linear
𝘖(n²)         Quadratic
𝘖(n³)         Cubic
𝘖(2ⁿ)         Exponential
𝘖(nⁿ)         Exponential

Time complexities in increasing order of growth:

1 < log₂n < √n < n < nlog₂n < n² < n³ ... < 2ⁿ < 3ⁿ ... < nⁿ

Time Complexity Calculation

We are going to understand time complexity through lots of examples:

for loop

Let’s look at the time complexity of for loops with several examples, which are easy to calculate:

Example 1
for(int i=0; i<n; i++){}
The loop runs `n` times,
hence Time Complexity = 𝘖(n)

Example 2
for(int i=0; i<n; i=i+2){}
The loop runs `n/2` times, which is still linear,
hence Time Complexity = 𝘖(n)

Example 3
for(int i=n; i>1; i--){}
The loop runs `n-1` times, which is still linear,
hence Time Complexity = 𝘖(n)

Example 4
for(int i=1; i<n; i++){}

for(int j=1; j<n; j++){}
The two separate loops run `n + n = 2n` times, which is still linear,
hence Time Complexity = 𝘖(n)

Example 5
for(int i=0; i<n; i++){
    for(int j=0; j<n; j++){}  
}
The nested loops run `n × n = n²` times, which is non-linear,
hence Time Complexity = 𝘖(n²)
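These iteration counts are easy to verify by instrumenting the loops with a counter. A minimal C sketch (the function and counter names are just for illustration):

```c
/* Count iterations of the single loop from Example 1. */
long count_single(long n) {
    long count = 0;
    for (long i = 0; i < n; i++) count++;
    return count;               /* exactly n iterations -> O(n) */
}

/* Count iterations of the nested loops from Example 5. */
long count_nested(long n) {
    long count = 0;
    for (long i = 0; i < n; i++)
        for (long j = 0; j < n; j++) count++;
    return count;               /* exactly n*n iterations -> O(n²) */
}
```

`count_single(n)` returns exactly n and `count_nested(n)` returns n², matching 𝘖(n) and 𝘖(n²).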

Example 6
for(int i=1; i<n; i=i*2){}
Calculation:
values of `i` in the `for` loop:
1, 2, 2², 2³, ..., 2ᵏ

The loop terminates when:
i ≥ n
2ᵏ ≥ n
k ≥ log₂(n)

so the loop runs log₂(n) times,
hence Time complexity = 𝘖(log₂(n))
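The log₂(n) count can be confirmed empirically with a counter. A small C sketch (illustrative names):

```c
/* Iteration count of: for(int i=1; i<n; i=i*2){} */
int count_doubling(int n) {
    int count = 0;
    for (int i = 1; i < n; i = i * 2) count++;
    return count;   /* ceil(log2(n)) iterations for n > 1 */
}
```

For example, `count_doubling(8)` is 3 and `count_doubling(1024)` is 10, i.e. log₂ of the input.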

Example 7
for(int i=1; i<n; i=i*3){}
Calculation:
Similar to the last example, the value of `i` in the `for` loop is i = 3ᵏ.

The loop terminates when:
i ≥ n
3ᵏ ≥ n
k ≥ log₃(n)

so the loop runs log₃(n) times,
hence Time complexity = 𝘖(log₃(n))

Example 8
for(int i=n; i>1; i=i/2){}
Calculation:
values of `i` in the `for` loop:
n, n/2, n/2², n/2³, ..., n/2ᵏ

The loop terminates when:
i ≤ 1
n/2ᵏ ≤ 1
2ᵏ ≥ n
k ≥ log₂(n)

so the loop runs log₂(n) times,
hence Time complexity = 𝘖(log₂(n))

Example 9
for(int i=0; i<n; i++){
    for(int j=1; j<n; j=j*2){}
}
Calculation:
outer loop complexity = 𝘖(n)
inner loop complexity = 𝘖(log₂(n))
hence Time complexity = 𝘖(nlog₂(n))

Example 10
int p = 0;
for(int i=1; p<=n; i++){
    p = p + i;
}
Calculation:
values of `i` and `p` in the `for` loop:

i    p
1    1
2    1+2 = 3
3    1+2+3 = 6
4    1+2+3+4 = 10
.    .
k    1+2+3+...+k = k(k+1)/2

The loop terminates when:
p > n
k(k+1)/2 > n
k² ⋍ n   (dropping lower-order terms)
k ⋍ √n

so the loop runs about √n times,
hence Time complexity = 𝘖(√n)
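The √n behaviour can be checked by counting iterations directly. A small C sketch (illustrative names):

```c
/* Iteration count of Example 10: p grows by 1, then 2, then 3, ...
   so p = k(k+1)/2 after k iterations. */
int count_triangular(int n) {
    int count = 0, p = 0;
    for (int i = 1; p <= n; i++) {
        p = p + i;
        count++;
    }
    return count;   /* ~ sqrt(2n) iterations -> O(sqrt(n)) */
}
```

For n = 100 the loop runs 14 times, close to √(2·100) ≈ 14.1.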

Example 11
int p = 0;
for(int i=1; i<n; i=i*2){
    p++;
}
for(int j=1; j<p; j=j*2){
    // statement
}
Calculation:
1. The first `for` loop takes log₂(n) units of time to execute. At the end of it, p = log₂(n).
2. The second `for` loop takes log₂(p) units of time to execute.

hence Time Complexity = 𝘖(log₂(p)) = 𝘖(log₂log₂(n))
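A counter on the second loop makes the double logarithm visible. A C sketch (illustrative names):

```c
/* Example 11: p ends up ≈ log2(n) after the first loop;
   the second loop then runs ≈ log2(p) times. */
int count_loglog(int n) {
    int p = 0;
    for (int i = 1; i < n; i = i * 2) p++;     /* p ≈ log2(n) */
    int count = 0;
    for (int j = 1; j < p; j = j * 2) count++; /* ≈ log2(log2(n)) */
    return count;
}
```

For n = 2¹⁶ the first loop sets p = 16, and the second loop runs only 4 times.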

while loop

If you understand how to calculate the time complexity of a for loop, then the while loop is a piece of cake.

Example 1
int i=1;
while(i<n){
    //statement
    i=i*2;
}
Time Complexity = 𝘖(log₂(n))

Example 2
int i=n;
while(i>1){
    //statement
    i=i/2;
}
Time Complexity = 𝘖(log₂(n))

Example 3
int i=1;
int k=1;
while(k<n){
    //statement
    k=k+i;
    i++;
}
Time Complexity = 𝘖(√n)
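This while loop is the same triangular-growth pattern as Example 10 of the for-loop section, and its √n behaviour can be verified the same way. A C sketch (illustrative names):

```c
/* Example 3: k grows by 1, then 2, then 3, ... so after m
   iterations k = 1 + m(m+1)/2; the loop stops after ~sqrt(2n) steps. */
int count_while_root(int n) {
    int i = 1, k = 1, count = 0;
    while (k < n) {
        k = k + i;
        i++;
        count++;
    }
    return count;
}
```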

Variable Time Complexity

A function does not always take a fixed amount of time to execute; sometimes the running time depends on the input values. Here are some examples where the time complexity is not fixed:

Example 1
method(n, m){
    while(n!=m){
        if(n>m){
            n = n-m;
        }else{
            m = m-n;
        }
    }
}
best case time = 𝘖(1)
(when n = m)

worst case time = 𝘖(n)
(when n is much larger than m, e.g. n=16, m=1)
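Both cases can be checked by counting loop iterations. A small C sketch (illustrative names; this loop is, in fact, the subtraction form of Euclid's GCD algorithm):

```c
/* Count iterations of Example 1 for given inputs n and m. */
int count_iterations(int n, int m) {
    int count = 0;
    while (n != m) {
        if (n > m) n = n - m;   /* subtract the smaller value */
        else       m = m - n;
        count++;
    }
    return count;
}
```

With n = m the loop body never runs (best case), while with m = 1 it runs n − 1 times (worst case).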

Example 2
method(n){
    if(n<5){
        //statement
    }else{
        for(i=0;i<n;i++){
            //statement
        }
    }
}
best case time = 𝘖(1)
(when n < 5)

worst case time = 𝘖(n)
(when n ≥ 5)

Recursive Functions

Let’s see the time complexity of recursive functions:

Example 1
test(int n){
    if(n>0){
        //statement
        test(n-1);
    }
}
// T(n) = T(n-1) + 1 = 𝘖(n)
Calculation:
Base case: T(0) = 1

The time taken for input n is the time taken for input n-1, plus 1:
T(n) = T(n-1) + 1      --(1)
Similarly:
T(n-1) = T(n-2) + 1    --(2)
T(n-2) = T(n-3) + 1    --(3)

Substituting (2) and (3) into (1):
T(n) = T(n-3) + 3
...
T(n) = T(n-k) + k

Assume n-k = 0, i.e. k = n:
T(n) = T(0) + n = 1 + n ⋍ n

hence Time Complexity = 𝘖(n)
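The recurrence can be checked by counting how many times the statement executes. A C sketch (illustrative names):

```c
/* Example 1: one unit of work per call, then recurse on n-1. */
int count_calls(int n) {
    if (n > 0) {
        return 1 + count_calls(n - 1);   /* one statement per level */
    }
    return 0;   /* base case does no work */
}
```

`count_calls(n)` returns exactly n, matching T(n) ⋍ n.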

Example 2
test(int n){
    if(n>0){
        for(int i=0; i<n; i++){
            //statement
        }
        test(n-1);
    }
}
// T(n) = T(n-1) + n = 𝘖(n²)
Calculation:
Base case: T(0) = 1

T(n) = T(n-1) + n
T(n-1) = T(n-2) + (n-1)
T(n-2) = T(n-3) + (n-2)
...
T(n) = T(n-3) + (n-2) + (n-1) + n
T(n) = T(n-k) + (n-(k-1)) + (n-(k-2)) + ... + (n-1) + n

Assume n-k = 0, i.e. k = n:
T(n) = T(0) + 1 + 2 + 3 + ... + n
T(n) = 1 + n(n+1)/2 ⋍ n²

hence Time Complexity = 𝘖(n²)
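Counting statement executions confirms the n(n+1)/2 total. A C sketch (illustrative names):

```c
/* Example 2: n units of work at this level, plus T(n-1) below it. */
long count_work(int n) {
    if (n <= 0) return 0;
    long work = 0;
    for (int i = 0; i < n; i++) work++;   /* n executions here */
    return work + count_work(n - 1);      /* total = n(n+1)/2 */
}
```

For n = 10 the statement runs 55 = 10·11/2 times, matching 𝘖(n²).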

Example 3
test(int n){
    if(n>0){
        for(int i=0; i<n; i=i*2){
            //statement
        }
        test(n-1);
    }
}
// T(n) = T(n-1) + log₂(n) = 𝘖(nlog₂(n))
Calculation:
Base case: T(0) = 1

T(n) = T(n-1) + log₂(n)
T(n-1) = T(n-2) + log₂(n-1)
T(n-2) = T(n-3) + log₂(n-2)
...
T(n) = T(n-3) + log₂(n-2) + log₂(n-1) + log₂(n)
T(n) = T(n-k) + log₂1 + log₂2 + ... + log₂(n-1) + log₂(n)

Assume n-k = 0, i.e. k = n:
T(n) = T(0) + log₂(n!)
Since log₂(n!) ⋍ nlog₂(n),
T(n) = 1 + nlog₂(n) ⋍ nlog₂(n)

hence Time Complexity = 𝘖(nlog₂(n))

Example 4
test(int n){
    if(n>0){
        //statement
        test(n-1);
        test(n-1);
    }
}
// T(n) = 2T(n-1) + 1 = 𝘖(2ⁿ)
Calculation:
Base case: T(0) = 1

T(n) = 2T(n-1) + 1
T(n-1) = 2T(n-2) + 1
T(n-2) = 2T(n-3) + 1
...
T(n) = 2[2[2T(n-3) + 1] + 1] + 1
T(n) = 2³T(n-3) + 2² + 2 + 1
T(n) = 2ᵏT(n-k) + 2ᵏ⁻¹ + 2ᵏ⁻² + ... + 2² + 2 + 1

Assume n-k = 0, i.e. k = n:
T(n) = 2ⁿT(0) + (2ⁿ - 1)
T(n) = 2ⁿ + 2ⁿ - 1 ⋍ 2ⁿ

hence Time Complexity = 𝘖(2ⁿ)
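The exponential blow-up is easy to observe by counting statement executions. A C sketch (illustrative names; keep n small, since the count doubles per level):

```c
/* Example 4: one statement per call, then two recursive calls.
   Total executions = 2^n - 1. */
long count_exec(int n) {
    if (n > 0) {
        return 1 + count_exec(n - 1) + count_exec(n - 1);
    }
    return 0;
}
```

For n = 10 the statement runs 1023 = 2¹⁰ − 1 times.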

Example 5
test(int n){
    if(n>0){
        //statement
        test(n-1);
        test(n-1);
        test(n-1);
    }
}
T(n) = 3T(n-1) + 1
Time Complexity = 𝘖(3ⁿ)

Example 6
test(int n){
    if(n>0){
        for(int i=0; i<n; i++){
            //statement
        }
        test(n-1);
        test(n-1);
    }
}
T(n) = 2T(n-1) + n
Time Complexity = 𝘖(n2ⁿ)

Example 7
test(int n){
    if(n>0){
        //statement
        test(n/2);
    }
}
Base case 
T(1) = 1

T(n) = T(n/2) + 1
T(n/2) = T(n/2²) + 1
T(n) = [T(n/2²) + 1] + 1
T(n) = T(n/2ᵏ) + k

Assume the recursion bottoms out at the base case:
n/2ᵏ = 1
2ᵏ = n
k = log₂(n)
T(n) = T(1) + log₂(n) = 1 + log₂(n) ⋍ log₂(n)

hence Time Complexity = 𝘖(log₂(n))
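Counting one unit of work per call confirms the logarithmic depth. A C sketch (illustrative names):

```c
/* Example 7: one unit of work per call, n halves each time. */
int count_halving(int n) {
    if (n > 0) {
        return 1 + count_halving(n / 2);   /* ≈ log2(n) levels */
    }
    return 0;
}
```

For n = 8 there are 4 executions (n = 8, 4, 2, 1), i.e. ⌊log₂(n)⌋ + 1.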

Example 8
test(int n){
    if(n>0){
        for(int i=0; i<n; i++){
            //statement
        }
        test(n/2);
    }
}
Base case 
T(1) = 1

T(n) = T(n/2) + n
T(n/2) = T(n/2²) + n/2
T(n) = [T(n/2²) + n/2] + n
T(n) = T(n/2ᵏ) + n/2ᵏ⁻¹ + n/2ᵏ⁻² + ... + n/2² + n/2 + n

Assume the recursion bottoms out at the base case:
n/2ᵏ = 1
2ᵏ = n
k = log₂(n)
T(n) = T(1) + n[1/2ᵏ⁻¹ + 1/2ᵏ⁻² + ... + 1/2 + 1]
The bracketed geometric series is less than 2, so
T(n) ≤ 1 + 2n ⋍ n

hence Time Complexity = 𝘖(n)
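The geometric series n + n/2 + n/4 + ... < 2n can be verified by counting statement executions. A C sketch (illustrative names; with integer division the total comes out slightly below 2n):

```c
/* Example 8: n units of work per call, n halves each time.
   Total ≈ n + n/2 + n/4 + ... < 2n. */
long count_geometric(int n) {
    if (n <= 0) return 0;
    long work = 0;
    for (int i = 0; i < n; i++) work++;
    return work + count_geometric(n / 2);
}
```

For n = 16 the statement runs 16 + 8 + 4 + 2 + 1 = 31 times, just under 2n.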

Example 9
test(int n){
    if(n>0){
        //statement
        test(n/2);
        test(n/2);
    }
}
T(n) = 2T(n/2) + 1
T(n) = 2[2T(n/2²) + 1] + 1 = 2²T(n/2²) + 2 + 1
T(n) = 2ᵏT(n/2ᵏ) + 2ᵏ⁻¹ + ... + 2 + 1 = 2ᵏT(n/2ᵏ) + (2ᵏ - 1)

Assume n/2ᵏ = 1, i.e. 2ᵏ = n and k = log₂(n):
T(n) = nT(1) + (n - 1) = 2n - 1 ⋍ n

hence Time Complexity = 𝘖(n)
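Counting the statement executions over the whole recursion tree confirms the linear total. A C sketch (illustrative names; the exact count 2n − 1 holds when n is a power of two):

```c
/* Example 9: one unit of work per call, two recursive calls on n/2.
   For n a power of two, total executions = 2n - 1. */
long count_tree(int n) {
    if (n > 0) {
        return 1 + count_tree(n / 2) + count_tree(n / 2);
    }
    return 0;
}
```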

Example 10
test(int n){
    if(n>0){
        for(int i=0; i<n; i++){
            //statement
        }
        test(n/2);
        test(n/2);
    }
}
T(n) = 2T(n/2) + n
Time Complexity = 𝘖(nlog₂(n))
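This recurrence (n work per level, two half-size subproblems — the same shape as merge sort) can be verified with a work counter. A C sketch (illustrative names; the exact formula holds when n is a power of two):

```c
/* Example 10: n units of work per call, two recursive calls on n/2.
   For n a power of two, total work = n * (log2(n) + 1). */
long count_nlogn(int n) {
    if (n <= 0) return 0;
    long work = 0;
    for (int i = 0; i < n; i++) work++;
    return work + count_nlogn(n / 2) + count_nlogn(n / 2);
}
```

For n = 16 the total is 80 = 16 × (4 + 1), matching 𝘖(nlog₂(n)).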

Asymptotic Notations

We can represent a function's complexity in the following ways:

Symbol   Name        Bound
𝘖        big-oh      upper bound
Ω        big-omega   lower bound
⍬        big-theta   tight bound

Example 1
For e.g.  f(n) = 2n + 3
can be represented as:
𝘖(n), or any notation with a higher growth rate, such as 𝘖(nlog₂n), 𝘖(n²), 𝘖(n³) ...
Ω(n), or any notation with a lower growth rate, such as Ω(√n), Ω(log₂n), Ω(1) ...
⍬(n), and only ⍬(n), since this is the tight bound

Ideally you represent the complexity with the nearest class, so in the above case 𝘖(n), Ω(n), and ⍬(n) are the best representations.
Example 2
f(n) = 2n² + 3n + 4
n² ≤ 2n² + 3n + 4 ≤ 9n²   (for n ≥ 1)
can be represented as 𝘖(n²), Ω(n²), or ⍬(n²)
Example 3
f(n) = n²log₂n + n
n²log₂n ≤ n²log₂n + n ≤ 2n²log₂n   (for n ≥ 2)
can be represented as 𝘖(n²log₂n), Ω(n²log₂n), or ⍬(n²log₂n)
Example 4
f(n) = n! = n×(n-1)×(n-2)×...×2×1
n ≤ n! ≤ nⁿ
can be represented as 𝘖(nⁿ) upper bound, Ω(n) lower bound
cannot be represented as ⍬, since the upper and lower bounds do not match.
Example 5
f(n) = log₂(n!) = log₂(n×(n-1)×(n-2)×...×2×1) ≤ log₂(nⁿ) = nlog₂(n)
1 ≤ log₂(n!) ≤ nlog₂(n)
can be represented as 𝘖(nlog₂(n)) upper bound, Ω(1) lower bound
cannot be represented as ⍬, since the upper and lower bounds do not match.
  • Whenever possible, prefer big-theta ⍬, since it is the tight and most precise bound.
  • Big-oh 𝘖 is the most common notation you will come across for representing function complexity.

Note: Do not confuse these notations with best case, worst case, or average case time complexity. Each of those cases can itself be represented with 𝘖, Ω, and ⍬ notations.