Asymptotic Analysis and notation

Index

  1. Time Complexity
  2. Common DS Operations Time Complexity and Space Complexity

Time-complexity

Big-O notation: O

It describes the asymptotic upper bound of an algorithm's running time, i.e. the worst case: the maximum number of times a statement executes.

Measurement of time complexity

  1. O(N) – Linear TC: the number of operations grows in proportion to the input size, i.e. roughly one operation per input element.
  2. O(N^2) – Quadratic TC: typically nested iteration over the input.
  3. O(LogN) – the algorithm shrinks the input (e.g. halves it) at each step.
    • Example: Binary search
  4. O(2^n) – the number of operations doubles with each increment in the input size.
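These growth classes are easiest to see in code. A minimal sketch (the method names are illustrative, not from any particular library):

```java
public class ComplexityExamples {
    // O(n): one pass over the input
    static long sum(int[] a) {
        long total = 0;
        for (int x : a) total += x;   // runs n times
        return total;
    }

    // O(n^2): nested passes over the input
    static int countEqualPairs(int[] a) {
        int pairs = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] == a[j]) pairs++;   // runs ~n^2/2 times
        return pairs;
    }

    // O(log n): the problem size halves at every step
    static int halvingSteps(int n) {
        int steps = 0;
        while (n > 1) { n /= 2; steps++; }
        return steps;
    }
}
```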

Growth Order

O(n!) > O(2^n) > O(n^3) > O(n^2) > O(n log n) > O(n) > O(sqrt(n)) > O(log n) > O(1)


Graph

Index

  1. Introduction
  2. Dijkstra Algorithm
  3. Bellman-ford Algorithm
  4. Floyd-Warshall Algorithm
  5. Prim’s Algorithm

Introduction

Common Terminology –

  1. Minimum Spanning Tree (MST) – used in Prim's and Kruskal's algorithms
    • Goal: To connect all vertices in a graph with the minimum possible total edge weight, without forming any cycles.
    • Output: A tree that spans all vertices of the graph with the minimum sum of edge weights.
  2. Shortest path
    • Goal: To find the shortest path between a specific pair of vertices in a graph, minimizing the sum of the edge weights along that path.
    • Output: A path (which may include some or all vertices) with the minimum possible total weight between two specific vertices.

Dijkstra Algorithm

  1. Purpose: Finds the shortest path from a single source to all other vertices in a graph.
  2. Applicability: Works only with graphs that have non-negative edge weights.
  3. Working:
    • Uses a greedy approach.
    • Iteratively selects the vertex with the smallest known distance, explores its neighbors, and updates their distances.
    • Once a vertex is processed, its shortest path is finalised.

Dijkstra’s algorithm is like breadth-first search (BFS), except we use a priority queue instead of a normal first-in-first-out queue. Each item’s priority is the cost of reaching it.

Note – the implementation is similar to Prim’s algorithm for minimum spanning trees.
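The description above can be sketched with Java's PriorityQueue. This is a minimal illustration assuming an adjacency-list graph where adj[u] holds {vertex, weight} pairs; the class and method names are placeholders:

```java
import java.util.*;

public class Dijkstra {
    // adj[u] holds {vertex, weight} pairs; returns shortest distances from src
    static int[] shortestPaths(List<int[]>[] adj, int src) {
        int n = adj.length;
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[src] = 0;
        // each queue entry is {vertex, cost of reaching it}; priority = cost
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
        pq.add(new int[]{src, 0});
        while (!pq.isEmpty()) {
            int[] cur = pq.poll();
            int u = cur[0], d = cur[1];
            if (d > dist[u]) continue;       // stale entry: u was already finalised
            for (int[] e : adj[u]) {
                int v = e[0], w = e[1];
                if (dist[u] + w < dist[v]) { // relax edge u -> v
                    dist[v] = dist[u] + w;
                    pq.add(new int[]{v, dist[v]});
                }
            }
        }
        return dist;
    }
}
```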


Ref –

  1. https://www.interviewcake.com/concept/java/dijkstras-algorithm (description of the working in a chart)
  2. https://www.tutorialcup.com/interview/algorithm/dijkstra-algorithm.htm (applications + short implementation)
  3. https://www.interviewbit.com/tutorial/dijkstra-algorithm/ (pseudo code + graph algo)
    1. https://www.interviewbit.com/tutorial/breadth-first-search/#breadth-first-search

Bellman-Ford Algorithm


  1. Purpose: Also finds the shortest path from a single source to all other vertices, but can handle graphs with negative edge weights.
  2. Applicability: Works with graphs that may have negative edge weights, and can also detect negative weight cycles (where a cycle’s total weight is negative).
  3. Working:
    • Uses a dynamic programming approach.
    • Iteratively relaxes all edges, meaning it updates the shortest path estimate for each edge based on its current known distance.
    • After V−1 iterations, all shortest paths are found. If an additional relaxation still improves any distance, the graph contains a negative weight cycle.
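A minimal sketch of the relaxation loop described above, assuming an edge-list representation ({u, v, w} triples); the class and method names are illustrative:

```java
public class BellmanFord {
    // edges: {u, v, w} triples; returns dist[] from src,
    // or null if a negative weight cycle is reachable from src
    static int[] shortestPaths(int n, int[][] edges, int src) {
        int[] dist = new int[n];
        java.util.Arrays.fill(dist, Integer.MAX_VALUE);
        dist[src] = 0;
        // relax every edge V-1 times
        for (int pass = 0; pass < n - 1; pass++) {
            for (int[] e : edges) {
                if (dist[e[0]] != Integer.MAX_VALUE && dist[e[0]] + e[2] < dist[e[1]]) {
                    dist[e[1]] = dist[e[0]] + e[2];
                }
            }
        }
        // one extra pass: any further improvement means a negative weight cycle
        for (int[] e : edges) {
            if (dist[e[0]] != Integer.MAX_VALUE && dist[e[0]] + e[2] < dist[e[1]]) {
                return null;
            }
        }
        return dist;
    }
}
```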



Floyd-Warshall Algorithm

  1. Purpose: Finds the shortest paths between all pairs of vertices in a graph.
  2. Applicability: Works with both directed and undirected graphs, including those with negative edge weights (but no negative weight cycles).
  3. Working:
    • Uses a dynamic programming approach.
    • Iteratively updates a distance matrix where each entry dist[i][j] represents the shortest path from vertex i to vertex j.
    • Considers each vertex as an intermediate point in the path and updates the matrix accordingly.
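The triple loop described above can be sketched as follows (a minimal illustration; the INF sentinel and method name are assumptions):

```java
public class FloydWarshall {
    static final int INF = 1_000_000_000; // "no edge" sentinel; large, but INF + INF still fits in int

    // dist: n x n matrix, dist[i][j] = edge weight or INF; updated in place
    static void allPairs(int[][] dist) {
        int n = dist.length;
        for (int k = 0; k < n; k++)           // allow vertex k as an intermediate point
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    if (dist[i][k] + dist[k][j] < dist[i][j])
                        dist[i][j] = dist[i][k] + dist[k][j];
    }
}
```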



Prim’s Algorithm

  1. Purpose: Finds the Minimum Spanning Tree (MST) of a connected, undirected graph. The MST is a subset of the edges that connects all vertices together, without any cycles, and with the minimum possible total edge weight.
  2. Applicability: Works with weighted, undirected graphs. It requires that the graph be connected, meaning there must be a path between every pair of vertices.
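Prim's greedy growth of the tree can be sketched much like Dijkstra, except the priority is the weight of the single edge that would attach a vertex to the tree, not the path cost. A minimal illustration assuming an adjacency-list graph of {vertex, weight} pairs (names are placeholders):

```java
import java.util.*;

public class Prim {
    // adj[u] holds {vertex, weight} pairs of an undirected graph; returns total MST weight
    static int mstWeight(List<int[]>[] adj) {
        int n = adj.length;
        boolean[] inMst = new boolean[n];
        // priority = weight of the edge that would attach the vertex to the tree
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
        pq.add(new int[]{0, 0});      // start from vertex 0 with cost 0
        int total = 0;
        while (!pq.isEmpty()) {
            int[] cur = pq.poll();
            int u = cur[0], w = cur[1];
            if (inMst[u]) continue;   // already attached by a cheaper edge
            inMst[u] = true;
            total += w;
            for (int[] e : adj[u])
                if (!inMst[e[0]]) pq.add(new int[]{e[0], e[1]});
        }
        return total;
    }
}
```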


Arrays Algorithms

  1. Boyer-Moore Majority Voting Algorithm
  2. Kadane’s Algo

Boyer-Moore Majority Voting Algorithm

A majority element is one that appears more than nums.length / 2 times.

Algorithm Proceeding-

  1. First, choose a candidate from the given set of elements.
  2. For each subsequent element, if it is the same as the candidate, increase the vote count; otherwise decrease it.
  3. When the vote count drops to 0, select the current element as the new candidate.
public static int findMajorityElement(int[] nums) {
    int majorityElement = 0;
    int count = 0;

    for (int num : nums) {
        if (count == 0) {
            majorityElement = num;
            count = 1;
        } else if (num == majorityElement) {
            count++;
        } else {
            count--;
        }
    }
    return majorityElement; // candidate; assumes a majority element exists
}
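Note that the single pass only produces a candidate. When the input is not guaranteed to contain a majority element, a second counting pass is needed to confirm it; a minimal sketch (class and method names are illustrative):

```java
public class MajorityVote {
    // Boyer-Moore: first pass finds a candidate, second pass verifies it
    static Integer findMajority(int[] nums) {
        int candidate = 0, count = 0;
        for (int num : nums) {
            if (count == 0) { candidate = num; count = 1; }
            else if (num == candidate) count++;
            else count--;
        }
        // verification pass: a majority must appear more than nums.length / 2 times
        int occurrences = 0;
        for (int num : nums) if (num == candidate) occurrences++;
        return occurrences > nums.length / 2 ? candidate : null;
    }
}
```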
        

Kadane’s Algo

Kadane’s algorithm is a popular dynamic programming algorithm used for finding the maximum sum of a contiguous subarray in an array. It efficiently handles both positive and negative numbers.

Conditions/Cases to remember

  • Does it handle an empty array? [NA]
  • Does it handle all-negative elements, i.e. what is the answer in that case, given that adding two negatives gives a larger negative number? [YES – it returns the largest single element]
  • Should we use 0 or arr[i] as the reset value in the code? Either works, but the corresponding initialisation and iteration must match the choice.

It works by maintaining two variables: maxSoFar and maxEndingHere

  1. maxSoFar stores the maximum sum of any contiguous subarray seen so far, and
  2. maxEndingHere stores the maximum sum of any contiguous subarray ending at the current index.
public static int maxSubArraySum(int[] nums) {
    // Initialise to MIN_VALUE, not 0: with all-negative input a 0 start
    // would incorrectly report 0 as the maximum subarray sum.
    int maxSoFar = Integer.MIN_VALUE;
    int maxEndingHere = 0;
    for (int i = 0; i < nums.length; i++) {
        maxEndingHere = Math.max(nums[i] + maxEndingHere, nums[i]);
        maxSoFar = Math.max(maxSoFar, maxEndingHere);
    }
    return maxSoFar;
}

Sorting Algorithms

Index

  1. Time Complexities
  2. Selection sort
  3. Bubble sort
  4. Insertion sort
  5. Quick Sort
  6. Merge Sort
  7. Heap sort

Time-Complexity of Sorting-Algorithms

ALGORITHM         WORST case (Big O)   AVERAGE case (Theta)   BEST case (Omega)
Selection sort    O(n^2)               θ(n^2)                 Ω(n^2)
Bubble sort       O(n^2)               θ(n^2)                 Ω(n)
Insertion sort    O(n^2)               θ(n^2)                 Ω(n)
Heap sort         O(n log n)           θ(n log n)             Ω(n log n)
Quick sort        O(n^2)               θ(n log n)             Ω(n log n)
Merge sort        O(n log n)           θ(n log n)             Ω(n log n)
Bucket sort       O(n^2)               θ(n + k)               Ω(n + k)
Radix sort        O(nk)                θ(nk)                  Ω(nk)
Tree sort         O(n log n) (with a self-balancing BST)

Working of Sorting Algorithms


Usage : Selection vs Insertion vs Bubble sort

  1. Selection Sort is best when memory write operations are costly, since it makes the minimum possible number of swaps compared to the other two.
  2. Insertion Sort is preferable for small or nearly sorted data.
  3. Bubble Sort is typically used for educational purposes to demonstrate sorting techniques but is rarely used in practical applications due to its inefficiency.

Selection Sorting


Algorithm :

  1. Iterate through the array to find the smallest (or largest) element.
  2. Swap the found minimum (or maximum) element with the first element.
  3. Move the boundary of the unsorted subarray by one element.
  4. Repeat the process for the remaining unsorted portion of the array.

  private static void selectionSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            int minIndex = i;
            // inner loop only tracks the index of the minimum
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIndex]) {
                    minIndex = j;
                }
            }
            // single swap per outer iteration keeps memory writes to a minimum
            int temp = arr[minIndex];
            arr[minIndex] = arr[i];
            arr[i] = temp;
        }
    }
  1. Time Complexity: O(n²)
  2. Space Complexity: O(1) (In-place sorting)

Bubble Sort

Algorithm :

  1. Compare adjacent elements in the array.
  2. Swap them if they are in the wrong order.
  3. Continue doing this for each pair of adjacent elements until the end of the array.
  4. After each pass, the next largest element is in its correct position.
  5. Repeat the process until no swaps are needed (the array is sorted).

  private static void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            // last i elements are already in their correct positions
            for (int j = 0; j < n - 1 - i; j++) {
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j + 1];
                    arr[j + 1] = arr[j];
                    arr[j] = temp;
                    swapped = true;
                }
            }
            if (!swapped) break; // no swaps in this pass: array is sorted
        }
    }
  1. Time Complexity: O(n²) worst/average; O(n) best case with the early-exit check
  2. Space Complexity: O(1) (In-place sorting)

Insertion Sort

Algorithm :

  1. Start from the second element, assume that the first element is already sorted.
  2. Pick the next element and compare it with the elements in the sorted part.
  3. Shift all elements in the sorted part that are greater than the current element to the right.
  4. Insert the current element into its correct position.
  5. Repeat the process until the entire array is sorted.

 public static void insertionSort(int[] arr) {
        int n = arr.length;
        for (int i = 1; i < n; i++) {
            int key = arr[i];
            int j = i - 1;
            // Move elements of arr[0..i-1] that are greater than key to one position ahead
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j = j - 1;
            }
            arr[j + 1] = key;
        }
    }
  1. Time Complexity: O(n²)
  2. Space Complexity: O(1) (In-place sorting)

Quick Sort

Algorithm :

  1. Choose a “pivot” element from the array.
  2. Partition the array into two sub-arrays: elements less than the pivot and elements greater than the pivot.
  3. Recursively apply the same steps to the sub-arrays.
  4. Combine the sub-arrays to get the sorted array.
  public static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int pi = partition(arr, low, high);
            quickSort(arr, low, pi - 1);  // Recursively sort elements before partition
            quickSort(arr, pi + 1, high); // Recursively sort elements after partition
        }
    }

    private static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = (low - 1); // Index of smaller element
        for (int j = low; j < high; j++) {
            if (arr[j] < pivot) {
                i++;
                // Swap arr[i] and arr[j]
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }
        // Swap arr[i+1] and arr[high] (or pivot)
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;

        return i + 1;
    }
  1. Time Complexity:
    • Best Case: O(n log n)
    • Worst Case: O(n²) (occurs when the smallest or largest element is always chosen as the pivot)
  2. Space Complexity: O(log n) (in-place sorting, though the recursion stack takes O(log n) space on average and O(n) in the worst case)

Merge Sort

Algorithm :

  1. Divide the array into two halves.
  2. Recursively sort each half.
  3. Merge the two halves to create a sorted array.
public class MergeSort {
    public static void mergeSort(int[] arr, int left, int right) {
        if (left < right) {
            int middle = (left + right) / 2;
            // Sort first and second halves
            mergeSort(arr, left, middle);
            mergeSort(arr, middle + 1, right);
            // Merge the sorted halves
            merge(arr, left, middle, right);
        }
    }

    private static void merge(int[] arr, int left, int middle, int right) {
        int n1 = middle - left + 1;
        int n2 = right - middle;

        // Temporary arrays
        int[] L = new int[n1];
        int[] R = new int[n2];

        // Copy data to temp arrays
        for (int i = 0; i < n1; i++)
            L[i] = arr[left + i];
        for (int j = 0; j < n2; j++)
            R[j] = arr[middle + 1 + j];

        // Merge the temp arrays
        int i = 0, j = 0;
        int k = left;
        while (i < n1 && j < n2) {
            if (L[i] <= R[j]) {
                arr[k] = L[i];
                i++;
            } else {
                arr[k] = R[j];
                j++;
            }
            k++;
        }

        // Copy remaining elements of L[]
        while (i < n1) {
            arr[k] = L[i];
            i++;
            k++;
        }

        // Copy remaining elements of R[]
        while (j < n2) {
            arr[k] = R[j];
            j++;
            k++;
        }
    }

    public static void main(String[] args) {
        int[] arr = {12, 11, 13, 5, 6, 7};
        
        // Call mergeSort with the full range of the array
        mergeSort(arr, 0, arr.length - 1);
        
        // Print the sorted array
        System.out.println("Sorted array:");
        for (int num : arr) {
            System.out.print(num + " ");
        }
    }
}


Heap sort


Algorithm :

  1. Build a max heap from the input array.
  2. Swap the root (the largest element) with the last element of the heap.
  3. Reduce the heap size by one and heapify the root to restore the max-heap property.
  4. Repeat steps 2–3 until the heap size is one.

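A minimal sketch of heap sort following the standard build-heap-then-extract steps (class and method names are illustrative):

```java
public class HeapSort {
    public static void heapSort(int[] arr) {
        int n = arr.length;
        // Build max heap: heapify from the last non-leaf node up to the root
        for (int i = n / 2 - 1; i >= 0; i--) heapify(arr, n, i);
        // Repeatedly move the current max (root) to the end of the array
        for (int i = n - 1; i > 0; i--) {
            int temp = arr[0]; arr[0] = arr[i]; arr[i] = temp;
            heapify(arr, i, 0);   // restore the heap property on the reduced heap
        }
    }

    private static void heapify(int[] arr, int heapSize, int root) {
        int largest = root;
        int left = 2 * root + 1, right = 2 * root + 2;
        if (left < heapSize && arr[left] > arr[largest]) largest = left;
        if (right < heapSize && arr[right] > arr[largest]) largest = right;
        if (largest != root) {
            int temp = arr[root]; arr[root] = arr[largest]; arr[largest] = temp;
            heapify(arr, heapSize, largest);  // sift the swapped value further down
        }
    }
}
```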
Introduction

  1. Asymptotic Analysis and notation
  2. Sorting Algorithms- nlogn
    1. Insertion sort
    2. Selection sort
    3. Bubble sort
    4. Merge Sort
    5. Quick Sort
    6. Heap sort
  3. Searching Algorithms (log(n)-tree)
    1. Linear search
    2. Binary search
    3. DFS
    4. BFS
  4. Arrays
    1. Boyer-Moore Majority Voting Algorithm
    2. Kadane’s Algo
    3. Floyd’s cycle detection
    4. KMP
    5. Quick select
  5. Graph
    1. Dijkstra Algorithm
    2. Kruskal's Algo
    3. Bellman-Ford Algo
    4. Floyd-Warshall Algo
    5. Topological sort Algo
    6. Flood fill Algo
    7. Lee Algo
  6. Basic Algos
    1. Euclid's Algo
    2. Union Find Algo
    3. Huffman coding compression Algo
S.No.  Algorithm                                                   Algorithm design paradigm
1      Dijkstra shortest path                                      BFS
2      Floyd-Warshall algorithm (all-pairs shortest paths)         Greedy design
3      Binary search on sorted array                               Dynamic programming
4      Backtracking search on graph                                Divide and conquer
5      DFS

Match the correct ones.

Searching Algorithms

  1. Linear search
  2. Binary search

Binary Search

Search a sorted array by repeatedly dividing the search interval in half (divide and conquer). Begin with an interval covering the whole array. If the value of the search key is less than the item in the middle of the interval, narrow the interval to the lower half. Otherwise, narrow it to the upper half. Repeat until the value is found or the interval is empty.

Algorithm – We basically ignore half of the elements just after one comparison.

  1. Compare x (the element to be searched for) with the middle element.
  2. If x matches with the middle element, we return the mid index.
  3. Else If x is greater than the mid element, then x can only lie in the right half subarray after the mid element. So we recur for the right half.
  4. Else (x is smaller) recur for the left half.

Two approaches-

  • Iterative – tip: for simplicity of understanding, go with this
  • Recursive (skipped here as slightly more complex to understand and implement)

Iterative approach –

int binarySearch(int arr[], int x)
    {
        int l = 0, r = arr.length - 1;
        while (l <= r) {
            int m = l + (r - l) / 2;

            // Check if x is present at mid
            if (arr[m] == x)
                return m;

            // If x greater, ignore left half
            if (arr[m] < x)
                l = m + 1;

            // If x is smaller, ignore right half
            else
                r = m - 1;
        }

        // if we reach here, then element was
        // not present
        return -1;
    }