Everything You Need To Know About Time Complexity

--

What are Algorithms?

We have all come across this short and crisp word since high school, and everyone seems to think very highly of the term. What does it actually mean?

An algorithm refers to a set of rules or instructions that define, step by step, how a task is to be carried out in order to produce the expected result. In simpler terms, it gives us a breakdown of the processes needed to achieve a given objective.

The figure below illustrates what an algorithm is and how it can be represented, using a simple example: the sum of two numbers.
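As a minimal sketch, such an algorithm might be written in C as follows (the prompts and variable names here are only illustrative):

#include <stdio.h>

int main(void) {
    double a, b;

    /* Step 1: take the two numbers as input */
    printf("Enter two numbers: ");
    scanf("%lf %lf", &a, &b);

    /* Step 2: add them */
    double sum = a + b;

    /* Step 3: display the result */
    printf("Sum = %.2lf\n", sum);
    return 0;
}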

How to Design an Algorithm?

To learn how to write algorithms we need to pay attention to a few crucial things:

  1. The problem that is to be solved by this algorithm.
  2. The constraints of the problem that must be considered while solving the problem.
  3. The input to be taken to solve the problem.
  4. The output to be expected when the problem is solved.
  5. The solution to this problem.

What is Time complexity?

By definition, time complexity is the amount of time taken by an algorithm to run, expressed as a function of the length of the input. Here, the length of the input denotes the number of operations to be performed by the algorithm. Time complexity does not give the total execution time of an algorithm; rather, it tells us how the execution time varies (increases or decreases) as the number of operations in the algorithm increases or decreases.

Now as the definition itself states, time complexity is a function of the size of the input.

Types of Time complexity notation

There exists a relation between the input data size (n) and the number of operations performed (N) with respect to time. This relation is known as the order of growth of time complexity and is written using the notation O(n), where:

O denotes the order of growth and n is the length of the input. This is also called ‘Big O notation’.

For example, a process requiring n² steps, a process requiring 1000n² steps, and a process requiring 5n² + 7n + 17 steps all have O(n²) order of growth. Order of growth provides a useful indication of how we may expect the behaviour of the process to change as we change the size of the problem; how quickly it changes depends on the algorithm.

Now, a valid question here would be: what is O(n²)? Before we delve into that, we must understand the types of notation used in calculating time complexity.

  1. Constant Time — O(1)
  2. Linear Time — O(n)
  3. Logarithmic Time — O(log n)
  4. Quadratic Time — O(n²)
  5. Cubic Time — O(n³)
  6. Linearithmic Time — O(n log n)
  7. Exponential Time — O(2^n)
  8. Factorial Time — O(n!)

We shall discuss a few of these types in further detail.

1. O(1): Constant Time

The execution time of these algorithms is independent of the size of the input. A good example of O(1) time is accessing a value with an array index.

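A minimal C sketch of constant-time access by index might look like this (the array contents and the index are only illustrative):

#include <stdio.h>

int main(void) {
    int arr[] = {10, 20, 30, 40, 50};

    /* Accessing any element by its index takes the same amount of
       time regardless of how large the array is: O(1). */
    int value = arr[3];

    printf("Element at index 3 = %d\n", value);
    return 0;
}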

Another simple example would be to find whether a number is even or odd:

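A C sketch of this even/odd check, matching the output shown below, might look like this:

#include <stdio.h>

int main(void) {
    int num;

    printf("Enter an integer: ");
    scanf("%d", &num);

    /* A single modulo operation decides the answer, so the running
       time does not depend on the value entered: O(1). */
    if (num % 2 == 0)
        printf("%d is even.\n", num);
    else
        printf("%d is odd.\n", num);

    return 0;
}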

OUTPUT

Enter an integer: 10

10 is even.

2. O(n): Linear Time

An algorithm is said to have linear time complexity when the running time increases linearly with the length of the input. When a function has to go through every value in the input data, it is said to run in linear time.

An example of an O(n) algorithm:

Getting the max/min value in an array

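A C sketch of finding the largest element in an array, matching the output shown below, might look like this:

#include <stdio.h>

int main(void) {
    double arr[100];   /* assumes at most 100 elements, as prompted */
    int n;

    printf("Enter the number of elements (1 to 100): ");
    scanf("%d", &n);

    for (int i = 0; i < n; ++i) {
        printf("Enter number%d: ", i + 1);
        scanf("%lf", &arr[i]);
    }

    /* One pass over all n elements: the number of comparisons grows
       linearly with the input size, so this is O(n). */
    double largest = arr[0];
    for (int i = 1; i < n; ++i) {
        if (arr[i] > largest)
            largest = arr[i];
    }

    printf("Largest element = %.2lf\n", largest);
    return 0;
}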

OUTPUT

Enter the number of elements (1 to 100): 5

Enter number1: 34.5

Enter number2: 2.4

Enter number3: -35.5

Enter number4: 38.7

Enter number5: 24.5

Largest element = 38.70

The number of operations is proportional to the number of elements, so the program runs in linear time: O(n).

3. O(log n): Logarithmic Time

An algorithm is said to have logarithmic time complexity when it reduces the size of the input data in each step. This means the number of operations does not grow in step with the input size; because the portion of the input left to process shrinks at every step, the number of operations grows far more slowly than the input. Algorithms with logarithmic time complexity are typically found in operations on binary trees and in binary search.

Examples of Logarithmic time complexity:

Finding a word in a dictionary or phonebook.

There are two ways in which we can search for a phone number in a directory. We can either start from the beginning, which gives a time complexity of O(n), or we can start at the middle: if the name we are looking for comes after the middle entry alphabetically, we continue in the second half of the names; otherwise, we move to the first half. This is the principle followed by the binary search algorithm.

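A minimal C sketch of binary search over a sorted array might look like this (the function name binary_search, the sample array, and the key are only illustrative):

#include <stdio.h>

/* Returns the index of key in the sorted array, or -1 if not found.
   Each comparison halves the portion of the array still to be
   searched, so the running time is O(log n). */
int binary_search(const int arr[], int n, int key) {
    int low = 0, high = n - 1;

    while (low <= high) {
        int mid = low + (high - low) / 2;

        if (arr[mid] == key)
            return mid;
        else if (arr[mid] < key)
            low = mid + 1;   /* continue in the second half */
        else
            high = mid - 1;  /* continue in the first half */
    }
    return -1;
}

int main(void) {
    int arr[] = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
    int n = sizeof(arr) / sizeof(arr[0]);
    int key = 23;

    int index = binary_search(arr, n, key);
    if (index != -1)
        printf("%d found at index %d\n", key, index);
    else
        printf("%d not found\n", key);

    return 0;
}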

4. O(n²): Quadratic Time

An algorithm is said to have quadratic time complexity when its running time increases non-linearly, in proportion to n², with the length of the input. Nested loops typically fall into this order of complexity: a single loop over the input takes O(n), and if the function contains a loop within that loop, the cost becomes O(n) * O(n) = O(n²).

Here are some examples of O(n²) time complexity: sorting the elements of an array using the bubble sort, insertion sort, or selection sort algorithms.

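A C sketch of bubble sort, a typical example of two nested loops, might look like this (the sample array is illustrative):

#include <stdio.h>

/* Bubble sort: the outer loop runs about n times and, for each pass,
   the inner loop also runs up to n times, so the total number of
   comparisons grows as n * n = O(n^2). */
void bubble_sort(int arr[], int n) {
    for (int i = 0; i < n - 1; ++i) {
        for (int j = 0; j < n - i - 1; ++j) {
            if (arr[j] > arr[j + 1]) {
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}

int main(void) {
    int arr[] = {64, 34, 25, 12, 22, 11, 90};
    int n = sizeof(arr) / sizeof(arr[0]);

    bubble_sort(arr, n);

    printf("Sorted array: ");
    for (int i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    printf("\n");

    return 0;
}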

When a function has a single loop, it usually has a running time complexity of O(n). This function, however, has two nested loops, giving a quadratic running time: O(n²).

5. O(n log n): Linearithmic Time

An algorithm with linearithmic time complexity is slower than a linear algorithm but much faster than a quadratic one. A classic example is merge sort.

The time complexity of merge sort is O(n log n), as merge sort always divides the array into two halves and takes linear time to merge the two halves. But dividing the original array into two smaller subarrays does not, by itself, sort the array. So we keep breaking these subarrays into even smaller subarrays until we are left with subarrays containing a single element each. The idea here is that an array with a single element is already sorted, so once we have broken the original array down to single-element subarrays, we have reduced our problem to its base cases.

We then merge all these sorted subarrays, step by step, to form one single sorted array.

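A C sketch of merge sort along these lines might look like this (the helper names merge and merge_sort, and the fixed-size temporary buffer, are simplifying assumptions for illustration):

#include <stdio.h>

/* Merge the two sorted halves arr[low..mid] and arr[mid+1..high].
   Merging takes time linear in the number of elements merged. */
void merge(int arr[], int low, int mid, int high) {
    int temp[100];          /* assumes at most 100 elements, for simplicity */
    int i = low, j = mid + 1, k = 0;

    while (i <= mid && j <= high)
        temp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
    while (i <= mid)
        temp[k++] = arr[i++];
    while (j <= high)
        temp[k++] = arr[j++];

    for (i = low, k = 0; i <= high; ++i, ++k)
        arr[i] = temp[k];
}

/* Recursively split the array into halves until single-element
   subarrays remain, then merge them back together. The array is
   halved about log n times and each level does O(n) merging work,
   giving O(n log n) overall. */
void merge_sort(int arr[], int low, int high) {
    if (low < high) {
        int mid = low + (high - low) / 2;
        merge_sort(arr, low, mid);
        merge_sort(arr, mid + 1, high);
        merge(arr, low, mid, high);
    }
}

int main(void) {
    int arr[] = {38, 27, 43, 3, 9, 82, 10};
    int n = sizeof(arr) / sizeof(arr[0]);

    merge_sort(arr, 0, n - 1);

    printf("Sorted array: ");
    for (int i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    printf("\n");

    return 0;
}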

6. O(2^n): Exponential Time

Exponential time complexity refers to an algorithm whose running time doubles with each addition to the input data set. Computing Fibonacci numbers with naive recursion and generating the power set of a set are a couple of examples that take exponential time.

Also, algorithms with running time O(2^n) are often recursive algorithms that solve a problem of size n by recursively solving two smaller problems of size n-1.

The program below prints out all the moves necessary to solve the famous “Towers of Hanoi” problem for N disks.
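A minimal C sketch of this recursive solution might look like this (the function name hanoi and the rod labels are illustrative):

#include <stdio.h>

/* Move n disks from the 'from' rod to the 'to' rod, using 'aux' as a
   spare rod. Each call spawns two calls on n - 1 disks, so the number
   of moves grows exponentially with n. */
void hanoi(int n, char from, char to, char aux) {
    if (n == 1) {
        printf("Move disk 1 from %c to %c\n", from, to);
        return;
    }
    hanoi(n - 1, from, aux, to);
    printf("Move disk %d from %c to %c\n", n, from, to);
    hanoi(n - 1, aux, to, from);
}

int main(void) {
    int n = 3;  /* number of disks */
    hanoi(n, 'A', 'C', 'B');
    return 0;
}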

Time complexity analysis for the above code: to move n disks, we move n-1 disks onto the spare rod, move the largest disk, and then move the n-1 disks again, so the number of moves satisfies T(n) = 2T(n-1) + 1, with T(1) = 1.

On the repeated expansion of the last term we get T(n) = 2^2 T(n-2) + 2 + 1 = 2^3 T(n-3) + 4 + 2 + 1 = … = 2^(n-1) T(1) + 2^(n-2) + … + 2 + 1 = 2^n - 1, which is O(2^n).

Finally, we discuss factorial time complexity.

7. O(n!): Factorial Time

Factorial time complexity is the worst among all the types discussed here: with each additional element of input, the running time grows at a larger and quicker rate than in any of the previous cases. It arises when generating all permutations of a word, or when solving the travelling salesman problem by brute force.

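A C sketch that prints all permutations of a word by recursive swapping might look like this (the helper names swap and permute, and the sample word, are illustrative):

#include <stdio.h>
#include <string.h>

void swap(char *a, char *b) {
    char t = *a;
    *a = *b;
    *b = t;
}

/* Print every permutation of str[l..r]. There are n! orderings of n
   characters, so the running time is O(n!). */
void permute(char *str, int l, int r) {
    if (l == r) {
        printf("%s\n", str);
        return;
    }
    for (int i = l; i <= r; ++i) {
        swap(&str[l], &str[i]);
        permute(str, l + 1, r);
        swap(&str[l], &str[i]);  /* backtrack */
    }
}

int main(void) {
    char word[] = "ABC";
    permute(word, 0, (int)strlen(word) - 1);
    return 0;
}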

CONCLUSION

The chart below summarizes the various time complexities we may come across, along with examples of each.

Summary of all the time complexity notations

--


Data Analytics Club, VIT Chennai

Curious about developments in Data Science? You are at the right place! Follow us on our LinkedIn page: https://www.linkedin.com/company/dac-vit-chennai/