Time complexity is a key concept in computer science, particularly in data structures and algorithms (DSA). It measures how the running time of an algorithm grows as a function of the input size.
What Is Space Complexity?
Space complexity is a measure of the amount of memory an algorithm requires to solve a problem, expressed as a function of the input size.
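To make the distinction concrete, here is a minimal sketch (the function names are my own illustration): the first function uses a constant amount of extra memory no matter how large the input is, while the second allocates memory proportional to the input size.

```cpp
#include <vector>

// O(1) extra space: only a few scalar variables are used,
// regardless of how many elements the input holds.
int sumIterative(const std::vector<int>& v) {
    int total = 0;
    for (int x : v) total += x;
    return total;
}

// O(n) extra space: the result vector grows with the input.
std::vector<int> copyDoubled(const std::vector<int>& v) {
    std::vector<int> out;
    out.reserve(v.size());
    for (int x : v) out.push_back(2 * x);
    return out;
}
```

Both functions run in O(n) time, which shows that time complexity and space complexity are measured independently.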
Several factors can affect the time complexity of an algorithm, including the use of recursion, the efficiency of the implementation, and the presence of nested loops or nested recursive calls.
The performance of algorithms is examined using a variety of time complexity measures.
| Type | Description | Example |
| --- | --- | --- |
| Constant | The algorithm needs the same amount of time regardless of the input size. | O(1) |
| Linear | The algorithm's running time grows linearly with the input size. | O(n) |
| Logarithmic | The algorithm's running time grows logarithmically with the input size. | O(log n) |
| Polynomial | The algorithm's running time grows as the input size raised to a fixed constant k. | O(n ^ k) |
| Exponential | The algorithm's running time grows exponentially with the input size. | O(2 ^ n) |
| Factorial | The algorithm's running time grows as the factorial of the input size. | O(n!) |
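The first three rows of the table can be illustrated with short C++ functions (the function names here are my own sketch, not part of the original article):

```cpp
#include <vector>

// O(1): indexing into a vector takes the same time
// no matter how large the vector is.
int firstElement(const std::vector<int>& v) {
    return v[0];
}

// O(n): linear search may have to inspect every element once.
// Returns the index of target, or -1 if it is absent.
int linearSearch(const std::vector<int>& v, int target) {
    for (int i = 0; i < (int)v.size(); i++)
        if (v[i] == target) return i;
    return -1;
}

// O(log n): binary search halves the remaining range on each
// step (the vector must be sorted). Returns index or -1.
int binarySearch(const std::vector<int>& v, int target) {
    int lo = 0, hi = (int)v.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (v[mid] == target) return mid;
        if (v[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}
```

Doubling the input size leaves `firstElement` unchanged, doubles the worst-case work of `linearSearch`, and adds only one extra step to `binarySearch`.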
Asymptotic notations are a class of mathematical notations used in computer science and mathematics to describe the behavior of functions as their inputs grow arbitrarily large. This behavior is also referred to as an algorithm's growth rate.
Note: We compare space and time complexity using asymptotic analysis.
Asymptotic notations are classified into three types:

- Big-O notation (upper bound): O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c * g(n) for all n ≥ n0 }
- Big-Omega notation (lower bound): Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ c * g(n) ≤ f(n) for all n ≥ n0 }
- Big-Theta notation (tight bound): Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0 }
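As a worked example (the function f(n) = 3n + 2 is my own illustration), the Big-O definition can be applied directly by exhibiting the constants c and n0:

```latex
f(n) = 3n + 2
\text{Claim: } f(n) = O(n).
\text{Choose } c = 4 \text{ and } n_0 = 2.
\text{For all } n \ge 2:\quad 3n + 2 \le 3n + n = 4n = c \cdot n.
\text{Hence } 0 \le f(n) \le c \cdot n \text{ for all } n \ge n_0, \text{ so } f(n) = O(n).
```

Since 3n ≤ 3n + 2 also holds for all n ≥ 1 (with c = 3), f(n) = Ω(n) as well, and therefore f(n) = Θ(n).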
Let us take an example to see how one can determine the time complexity of a program. Consider computing the sum of all elements of a matrix:
```cpp
#include <iostream>
using namespace std;

int main()
{
    int n = 3; // number of rows
    int m = 3; // number of columns
    int arr[][3]
        = { { 1, 2, 4 }, { 5, 6, 7 }, { 3, 0, 8 } };
    int sum = 0;

    // Iterating over all rows of the 2-D array
    for (int i = 0; i < n; i++) {
        // Iterating over all elements of the ith row
        for (int j = 0; j < m; j++) {
            // Adding the jth element of the ith row
            sum += arr[i][j];
        }
    }
    cout << sum << endl;
    return 0;
}
```
Using two nested loops, the program iterates through all the elements of the 2D array. The outer loop runs n times, and for each of its iterations the inner loop runs m times. As a result, the program's time complexity is O(n*m).