
Time Complexity

Table

info

A conservative estimate for the number of operations the grading server can handle per second is 10^8, but it could be closer to 5 * 10^8 given good constant factors.

| n | Possible complexities (1 second) |
| --- | --- |
| n ≤ 10 | O(n!), O(n^7), O(n^6) |
| n ≤ 20 | O(2^n * n), O(n^5) |
| n ≤ 80 | O(n^4) |
| n ≤ 400 | O(n^3) |
| n ≤ 7500 | O(n^2) |
| n ≤ 7 * 10^4 | O(n √n) |
| n ≤ 7 * 10^5 | O(n log n) |
| n ≤ 7 * 10^6 | O(n) |
| n ≤ 7 * 10^18 | O(log^2 n), O(log n), O(1) |

Finding Time Complexity from Constraints

If the constraint allows n up to 10^6 and that work sits inside another loop of 10^3 iterations, the total becomes 10^6 * 10^3 = 10^9 operations. Since the server handles roughly 10^8 operations per second, 10^9 operations means 10 * 10^8, i.e. about 10 seconds of running time, so the test case exceeds a 1-second limit and we get a TLE (Time Limit Exceeded) error.

Thus, even when the problem grants more than 1 second, it is safest to design the solution around the 1-second budget of ~10^8 operations: each extra factor of 10 in the operation count multiplies the running time by 10, so a solution that misses the 1-second budget typically jumps straight to ~10 seconds rather than landing somewhere slightly above 1 second.


caution

Big O notation means "grows no faster than ..."; it is an upper bound and does NOT provide a tight bound. For example, an algorithm that runs in O(n) time is, formally, also O(n^2).


Different Time Complexities

note

Below are, from best to worst, some common values of the big O function for time complexity.

  • O(1) (constant time). The time does not depend on the input size. Examples: calculating the sum of an arithmetic progression using the formula, printing a single value.
  • O(log n) (logarithmic time). The time is proportional to the logarithm of the input size (the base of the logarithm does not matter). An example: binary search in a sorted array.
  • O(√n) (square root time). The time is proportional to the square root of the input size.
  • O(n) (linear time). The time is proportional to the input size, i.e., time grows linearly as the input size increases. Often, such algorithms iterate over the input only once. Examples: sequential search, finding the maximum/minimum in an array.
  • O(n log n) (log-linear time). The time is proportional to n log n. Examples: efficient sorting algorithms like heapsort.
  • O(n^2) (quadratic time). The time is proportional to the squared input size. Examples: simple sorting algorithms, such as bubble sort, selection sort, or insertion sort.
  • O(2^n) (exponential time). The number of required operations depends exponentially on the input size.
  • O(n!) (factorial time). The time is proportional to n!. An example: a brute-force algorithm like checking all permutations (as in Bogosort).