Introduction to Algorithm Complexity
Algorithm complexity is a measure of the efficiency of an algorithm in terms of time and space
consumption. It helps in understanding the scalability and performance of algorithms as input size
increases.
Types of Complexity
1. Time Complexity: Measures the time required for an algorithm to execute.
2. Space Complexity: Measures the memory required for an algorithm to execute.
Time Complexity
What is Time Complexity?
Time complexity expresses how the runtime of an algorithm changes with respect to the
input size: it measures how the number of operations grows as the input grows. It is
usually expressed using Big-O notation.
Comparing Time Complexities:
• Best Case: The most optimal scenario where the algorithm performs the least work (e.g., an
already sorted array for sorting algorithms).
• Worst Case: The scenario where the algorithm performs the most work (e.g., finding an
element at the last position in a list).
• Average Case: The expected runtime for a typical input.
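The best-case/worst-case distinction can be made concrete by counting operations. The sketch below uses a hypothetical helper, linearSearchCount, which reports how many comparisons a linear search performs: a key at the front of the array gives the best case (one comparison), a key at the back gives the worst case (n comparisons).

```cpp
#include <vector>

// Hypothetical helper: linear search that also counts comparisons,
// to make best-case vs worst-case behavior observable.
int linearSearchCount(const std::vector<int>& arr, int key, int& comparisons) {
    comparisons = 0;
    for (int i = 0; i < (int)arr.size(); i++) {
        comparisons++;                 // one comparison per element visited
        if (arr[i] == key) return i;   // found: stop early
    }
    return -1;                         // not found: every element was checked
}
```

For the array {7, 3, 9, 1, 5}, searching for 7 takes one comparison (best case), while searching for 5 takes five (worst case).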
Common Time Complexities:
• O(1) – Constant Time: The algorithm runs in constant time, regardless of the input size.
• O(log n) – Logarithmic Time: The algorithm runs in time proportional to the logarithm of the
input size.
• O(n) – Linear Time: The algorithm runs in time proportional to the input size.
• O(n log n) – Log-Linear Time: This is common in efficient sorting algorithms like Merge Sort
and Quick Sort (on average; Quick Sort's worst case is O(n^2)).
• O(n^2) – Quadratic Time: The algorithm's running time is proportional to the square of the
input size. This is common in algorithms like Bubble Sort and Selection Sort.
• O(2^n) – Exponential Time: The algorithm takes exponential time, often seen in algorithms
that solve combinatorial problems.
• O(n!) – Factorial Time: The algorithm takes factorial time, usually seen in solving problems
like the Traveling Salesman Problem (TSP).
Asymptotic Notations
Asymptotic notations describe the limiting behavior of an algorithm as the input size approaches
infinity. The most common notations are:
1. Big O Notation (O) - Upper bound: the algorithm grows no faster than this rate (commonly used to describe the worst case)
2. Theta Notation (Θ) - Tight bound: the algorithm grows at exactly this rate
3. Omega Notation (Ω) - Lower bound: the algorithm grows at least this fast (commonly used to describe the best case)
Common Time Complexities
O(1) - Constant Time Complexity
Example: Accessing an element in an array.
int getElement(int arr[], int index) {
    return arr[index];
}
O(log n) - Logarithmic Time Complexity
Example: Binary Search
int binarySearch(int arr[], int left, int right, int key) {
    while (left <= right) {
        int mid = left + (right - left) / 2;
        if (arr[mid] == key) return mid;
        if (arr[mid] < key) left = mid + 1;
        else right = mid - 1;
    }
    return -1;
}
O(n) - Linear Time Complexity
Example: Linear Search
int linearSearch(int arr[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == key) return i;
    }
    return -1;
}
O(n log n) - Linearithmic Time Complexity
Example: Merge Sort
void merge(int arr[], int l, int m, int r) {
    int i, j, k;
    int n1 = m - l + 1;
    int n2 = r - m;
    int L[n1], R[n2];
    for (i = 0; i < n1; i++) L[i] = arr[l + i];
    for (j = 0; j < n2; j++) R[j] = arr[m + 1 + j];
    i = j = 0; k = l;
    while (i < n1 && j < n2) {
        arr[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
    }
    while (i < n1) arr[k++] = L[i++];
    while (j < n2) arr[k++] = R[j++];
}

void mergeSort(int arr[], int l, int r) {
    if (l < r) {
        int m = l + (r - l) / 2;
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);
        merge(arr, l, m, r);
    }
}
O(n^2) - Quadratic Time Complexity
Example: Bubble Sort
void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}
O(2^n) - Exponential Time Complexity
Example: Fibonacci using Recursion
int fibonacci(int n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}
O(n!) - Factorial Time Complexity
Example: Traveling Salesman Problem using Brute Force.
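A brute-force TSP can be sketched as follows: fix one city as the start and try every permutation of the remaining cities, which yields (n-1)! tours and hence factorial time. This is a minimal sketch assuming a symmetric distance matrix with dist[i][i] == 0; the function name is illustrative.

```cpp
#include <algorithm>
#include <climits>
#include <vector>

// Brute-force TSP: try every ordering of the non-start cities.
// With n cities this examines (n-1)! tours, i.e. O(n!) time.
int tspBruteForce(const std::vector<std::vector<int>>& dist) {
    int n = (int)dist.size();
    std::vector<int> cities;
    for (int i = 1; i < n; i++) cities.push_back(i);  // city 0 is the fixed start
    int best = INT_MAX;
    do {
        int cost = 0, prev = 0;
        for (int c : cities) {        // walk the tour in this order
            cost += dist[prev][c];
            prev = c;
        }
        cost += dist[prev][0];        // return to the start city
        best = std::min(best, cost);
    } while (std::next_permutation(cities.begin(), cities.end()));
    return best;
}
```

Even for modest n this is infeasible: 15 cities already give about 87 billion tours, which is why TSP is usually attacked with dynamic programming or heuristics instead.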
Space Complexity
Space complexity determines the amount of memory required by an algorithm, including input
storage, auxiliary storage, and recursion stack space.
Examples
O(1) - Constant Space Complexity
Example: Swapping two numbers
void swap(int &a, int &b) {
    int temp = a;
    a = b;
    b = temp;
}
O(n) - Linear Space Complexity
Example: Storing Fibonacci series
void fibonacci(int n) {
    if (n < 2) return;   // need at least two slots for the seed values
    int fib[n];          // O(n) extra space for the series
    fib[0] = 0; fib[1] = 1;
    for (int i = 2; i < n; i++) {
        fib[i] = fib[i - 1] + fib[i - 2];
    }
}
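If only the n-th value is needed rather than the whole series, the same computation can be done in O(1) auxiliary space by keeping just the last two values. This variant is added here as a contrast to the O(n) array version above:

```cpp
// n-th Fibonacci number in O(n) time and O(1) auxiliary space:
// only the two most recent values are kept.
int fibonacciConstantSpace(int n) {
    if (n <= 1) return n;
    int prev = 0, curr = 1;
    for (int i = 2; i <= n; i++) {
        int next = prev + curr;  // slide the two-value window forward
        prev = curr;
        curr = next;
    }
    return curr;
}
```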