Time complexity analysis
Other optimizations
O(1)
O(log n)
O(n)
O(n^2)
O(2^n)
Constant time algorithm.
Execution time does not change when n increases.
"All" basic operations.
Array length, assigning to index and accessing by index.
int GetFirstAndLastSum(int[] array)
{
    // Index access and Length are O(1), so the whole method is O(1)
    int first = array[0];
    int last = array[array.Length - 1];
    return first + last;
}

void CalculateFixedValues(int[] array)
{
    // A fixed number of O(1) operations is still O(1), regardless of array size
    int first = array[0];
    int last = array[array.Length - 1];
    int middle = array[array.Length / 2];
    int length = array.Length;
    int constant1 = 42;
    int constant2 = 7;
    int sum = first + last + middle;
    int product = first * last * middle;
    int result = sum + product + length + constant1 + constant2;
    array[1] = result;
}
Not very common; binary search is a typical example (we will learn it later, and a sketch is shown below).
You have 9 balls and a balance scale. Using at most 3 weighings, can you find the ball with the bigger weight?
O(n) approach => up to ~9 weighings (test one ball at a time)
With each weighing you can cut the number of candidates roughly in half (if done correctly) -> O(log n)
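A minimal sketch of binary search (not in the original material), assuming the array is sorted in ascending order: each comparison discards half of the remaining range, so the loop runs about log2(n) times.

int BinarySearch(int[] sortedArray, int target)
{
    int low = 0;
    int high = sortedArray.Length - 1;
    while (low <= high)
    {
        // Compare against the middle element and discard half of the range
        int mid = low + (high - low) / 2;
        if (sortedArray[mid] == target) return mid;
        if (sortedArray[mid] < target) low = mid + 1;
        else high = mid - 1;
    }
    return -1; // target not found
}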
Very common with simple data structures (array)
int LinearSearch(int[] array, int target)
{
    // Worst case: every element is checked once -> O(n)
    for (int i = 0; i < array.Length; i++)
    {
        if (array[i] == target) return i;
    }
    return -1;
}

int FindMax(int[] array)
{
    // One pass over the array -> O(n)
    int max = array[0];
    for (int i = 1; i < array.Length; i++)
    {
        if (array[i] > max) max = array[i];
    }
    return max;
}

int SumArray(int[] array)
{
    // One pass over the array -> O(n)
    int sum = 0;
    for (int i = 0; i < array.Length; i++)
    {
        sum += array[i];
    }
    return sum;
}

void ReverseArray(int[] array)
{
    // About n/2 swaps from both ends toward the middle -> O(n)
    int start = 0;
    int end = array.Length - 1;
    while (start < end)
    {
        int temp = array[start];
        array[start] = array[end];
        array[end] = temp;
        start++;
        end--;
    }
}
(or other exponents...)
Two (or more) nested loops
Common with simple sorting algorithms
void InsertionSort(int[] array)
{
    // Two nested loops -> O(n^2) in the worst case
    int n = array.Length;
    for (int i = 1; i < n; i++)
    {
        int key = array[i];
        int j = i - 1;
        // Shift larger elements one position to the right
        while (j >= 0 && array[j] > key)
        {
            array[j + 1] = array[j];
            j--;
        }
        array[j + 1] = key;
    }
}

int[,] MultiplyMatrices(int[,] A, int[,] B)
{
    // Three nested loops -> O(n^3), an example of a higher exponent
    int n = A.GetLength(0);
    int[,] C = new int[n, n];
    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            C[i, j] = 0;
            for (int k = 0; k < n; k++)
            {
                C[i, j] += A[i, k] * B[k, j];
            }
        }
    }
    return C;
}

bool HasDuplicates(int[] array)
{
    // Compares every pair of elements -> O(n^2)
    int n = array.Length;
    for (int i = 0; i < n; i++)
    {
        for (int j = i + 1; j < n; j++)
        {
            if (array[i] == array[j]) return true;
        }
    }
    return false;
}
int Fibonacci(int n)
{
    // Each call spawns two more recursive calls -> roughly O(2^n)
    if (n <= 1) return n;
    return Fibonacci(n - 1) + Fibonacci(n - 2);
}
Traveling Salesman Problem (find the shortest route that visits every city once and returns to the start)
int TSP(int[,] graph, bool[] visited, int currPos, int n, int count, int cost, int ans)
{
    // All cities visited and there is an edge back to the start city
    if (count == n && graph[currPos, 0] > 0)
    {
        ans = Math.Min(ans, cost + graph[currPos, 0]);
        return ans;
    }
    // Try every unvisited city reachable from the current city
    for (int i = 0; i < n; i++)
    {
        if (!visited[i] && graph[currPos, i] > 0)
        {
            visited[i] = true;
            ans = TSP(graph, visited, i, n, count + 1, cost + graph[currPos, i], ans);
            visited[i] = false;
        }
    }
    return ans;
}
If there is a constant multiplier, you can leave it out: O(2n) => O(n)
Only the most time-complex part of the algorithm counts: O(n^2 + n + 4) => O(n^2) (see the sketch below)
All of this is only estimation, so sometimes you cannot simplify.
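As an illustration (a hypothetical example, not from the original material), the method below does a nested pass and then a single pass over the array. The total work is about n^2 + n operations, which simplifies to O(n^2).

void PrintPairSumsAndValues(int[] array)
{
    // Nested loops: about n * n operations
    for (int i = 0; i < array.Length; i++)
    {
        for (int j = 0; j < array.Length; j++)
        {
            Console.WriteLine(array[i] + array[j]);
        }
    }
    // One more pass: about n operations
    for (int i = 0; i < array.Length; i++)
    {
        Console.WriteLine(array[i]);
    }
    // Total: O(n^2 + n) => O(n^2)
}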
"Premature optimization is the root of all evil"
Measure your code if it feels slow (see the Stopwatch sketch below)
The compiler can also do some optimizations for you
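A minimal sketch (not from the original material) of timing a piece of code with System.Diagnostics.Stopwatch; the array size is a placeholder, and the call assumes a SumArray method like the one in the O(n) examples above is available:

using System;
using System.Diagnostics;

int[] data = new int[1_000_000];

Stopwatch stopwatch = Stopwatch.StartNew();
int sum = SumArray(data);   // the code you want to measure
stopwatch.Stop();

Console.WriteLine($"Sum = {sum}, elapsed: {stopwatch.ElapsedMilliseconds} ms");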