What is time complexity?

When creating a computer program, it is important to consider how much time your algorithms take to run, in order to save computing time and power and write efficient programs. The time an algorithm requires, as a function of the size of its input, is its time complexity. Time complexity is described using Big O notation, where n is the input size and O(...) describes the worst-case growth rate of the running time. Though there are many types of time complexities, in this post I will go through the most commonly seen types:

  • Constant —O(1)
  • Linear — O(n)
  • Logarithmic — O(log n)
  • Linearithmic — O(n log n)
  • Quadratic — O(n²)
  • Polynomial — O(n^c)
  • Exponential — O(2^n)
  • Factorial — O(n!)

Constant Time — O(1)

Red line denotes Constant O(1) time complexity

Constant time is denoted by O(1): the algorithm takes the same amount of time to run regardless of the size of the input n. This means that whether n is 5 or 7,000, the time to process the algorithm will be the same.

Examples: finding whether a number is even or odd, printing the first item from a list, checking whether an element of an array equals a certain value

Function that prints the first element of n to the console
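The original code screenshot is not shown here, but a minimal sketch of such a constant-time function in JavaScript (the function name is my own) might look like this:

```javascript
// O(1): reading the first element is a single step,
// regardless of whether the array holds 5 or 7,000 items.
function printFirst(arr) {
  console.log(arr[0]);
  return arr[0]; // returned as well, so the result is easy to check
}

printFirst([5, 10, 15]); // logs 5
```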

Linear Time — O(n)

Linear time complexity occurs when the time an algorithm takes to run grows in direct proportion to the size of the input n. In our example below, we will find the smallest number in an unsorted array. (In a sorted array, the smallest number is simply the first element, which takes constant time; only an unsorted array forces us to scan every element.)

Example:

Finding the smallest element in an unsorted array
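A linear-scan sketch in JavaScript (a standard implementation, not taken from the original screenshot) — every element must be inspected once, so the work grows in step with n:

```javascript
// O(n): every element is visited exactly once, so doubling
// the array length doubles the work.
function findSmallest(arr) {
  let smallest = arr[0];
  for (let i = 1; i < arr.length; i++) {
    if (arr[i] < smallest) smallest = arr[i];
  }
  return smallest;
}

console.log(findSmallest([8, 3, 9, 1, 4])); // logs 1
```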

Logarithmic — O(log n)

Logarithmic time complexity results when the input n is reduced in size at each step of the algorithm — for example, if it is halved at each iteration of a loop. The running time climbs steeply while n is small, but the curve flattens and becomes less and less steep as n increases.

Examples:

finding the log of n, finding the index of an element in a sorted array with a binary search
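A binary search sketch in JavaScript illustrating the halving behavior (a textbook implementation, assumed rather than taken from the post):

```javascript
// O(log n): each comparison discards half of the portion of the
// sorted array still under consideration.
function binarySearch(sortedArr, target) {
  let low = 0;
  let high = sortedArr.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArr[mid] === target) return mid;
    if (sortedArr[mid] < target) low = mid + 1; // search right half
    else high = mid - 1;                        // search left half
  }
  return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // logs 3
```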

Linearithmic Time — O(n log n)

Linearithmic time complexity denoted by the purple line

Linearithmic time complexity, denoted by the purple line in the graph below, is almost linear, though it grows slightly faster than linear as n increases. Algorithms that create a linearithmic time complexity pattern have a growth rate of O(n log n).

Examples:

sorting elements in an array using a merge sort
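A merge sort sketch in JavaScript (a standard implementation, assumed here since the original code is not reproduced) — the array is halved log n times, and each level of recursion merges a total of n elements:

```javascript
// O(n log n): recursion depth is log n (halving), and each
// depth level does O(n) work merging the sorted halves.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  // Merge the two sorted halves back together.
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i)).concat(right.slice(j));
}

console.log(mergeSort([5, 2, 8, 1, 9])); // logs [1, 2, 5, 8, 9]
```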

Quadratic Time — O(n²)

A quadratic time complexity pattern is created when the running time grows as n². This effect is often created when there are nested for loops. In the example below, the for loop contains an if statement that checks the indexOf of items in an array. Since the indexOf method iterates over the array internally, the example below is essentially a nested for loop. When determining time complexity, therefore, remember that built-in methods like indexOf can hide loops of their own — don't just check whether two explicit for loops are present.

Example:

finding duplicate elements in an array using a for loop and indexOf

Function finding whether there are duplicate elements in an array
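A sketch of the idea in JavaScript (the function name is my own) — the explicit loop runs n times, and indexOf scans the array on each iteration, giving n × n steps in the worst case:

```javascript
// O(n²): the for loop runs n times, and indexOf itself
// scans the array on every iteration — a hidden inner loop.
function hasDuplicates(arr) {
  for (let i = 0; i < arr.length; i++) {
    // indexOf returns the FIRST index of the value; if that
    // differs from i, the value also appeared earlier.
    if (arr.indexOf(arr[i]) !== i) return true;
  }
  return false;
}

console.log(hasDuplicates([1, 2, 3, 2])); // logs true
console.log(hasDuplicates([1, 2, 3]));    // logs false
```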

Polynomial — O(n^c)

While quadratic time falls under the umbrella of polynomial time (its c value is 2), polynomial time complexity refers to any algorithm whose running time grows as n^c for some constant c. In the example below, we will consider cubic time complexity — O(n³) — as it is more common than higher powers of n. The example below contains a triply nested loop.

Example:

3 variable equation solver — triple nested for loops
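A cubic-time sketch in JavaScript along the same lines (a brute-force search over three variables; the function and its parameters are my own illustration, not the original code):

```javascript
// O(n³): three nested loops each run n times, so the body
// executes n * n * n times.
function countTriples(n, target) {
  let count = 0;
  for (let a = 0; a < n; a++) {
    for (let b = 0; b < n; b++) {
      for (let c = 0; c < n; c++) {
        // Brute-force check of every (a, b, c) combination.
        if (a + b + c === target) count++;
      }
    }
  }
  return count;
}

console.log(countTriples(10, 3)); // logs 10 (e.g., 0+0+3, 0+1+2, ...)
```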

Exponential Time — O(2^n)

Algorithms that create an exponential time complexity pattern have running times that grow at a rate of 2^n. Many examples I found involve recursive functions, so keep an eye out for recursion when you are determining time complexity patterns.

Example:

using recursion to generate the nth number in the Fibonacci sequence, finding all subsets of a set
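The classic naive recursive Fibonacci is a compact illustration — each call spawns two more calls, so the call tree roughly doubles with every increase in n:

```javascript
// O(2^n): fib(n) calls fib(n-1) and fib(n-2), each of which
// branches again, doubling the work at each level.
function fib(n) {
  if (n <= 1) return n; // base cases: fib(0) = 0, fib(1) = 1
  return fib(n - 1) + fib(n - 2);
}

console.log(fib(10)); // logs 55
```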

Factorial Time — O(n!)

Algorithms that create a factorial time complexity pattern have running times that grow at a rate of n!. A factorial is the product of all positive integers less than or equal to that number (e.g., 5! would be 5*4*3*2*1 = 120).

Example:

finding the factorial of n, finding all permutations of a given set/string
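A permutation-generating sketch in JavaScript (a standard recursive approach, assumed rather than taken from the post) — an array of n elements has n! orderings, and this routine produces every one:

```javascript
// O(n!): there are n! orderings of n distinct elements,
// and this routine generates all of them.
function permutations(arr) {
  if (arr.length <= 1) return [arr];
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    // Fix arr[i] as the first element, permute the rest.
    const rest = arr.slice(0, i).concat(arr.slice(i + 1));
    for (const perm of permutations(rest)) {
      result.push([arr[i]].concat(perm));
    }
  }
  return result;
}

console.log(permutations([1, 2, 3]).length); // logs 6 (3! = 6)
```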

Conclusion

Time complexity is important to consider when working as a software engineer. How you build your algorithms heavily impacts the processing time needed for your program. In the graph below, each time complexity we discussed is laid out from Horrible to Excellent in terms of processing time.

References

8 time complexities that every programmer should know

https://www.bigocheatsheet.com/

Understanding Big-O Notation With JavaScript

https://en.wikipedia.org/wiki/Time_complexity