
Selection sort

It is an in-place sorting technique. In this algorithm, we find the minimum element of the array and swap it into the first position. Next we find the minimum of array[1:] and place it at index 1. Similarly, we find the minimum of array[2:] and place it at index 2. We continue until we find the minimum of array[len(array) - 2:] and place it at index len(array) - 2.

  void selection_sort(int array[], int n){
    for( int i = 0; i < n - 1; i++ ) {
      /* Get the index of the minimum element in the sub-array [i:] */
      int min_index = i;
      for( int j = i + 1; j < n; j++ )
        if (array[j] < array[min_index]) { min_index = j; }

      /* Swap the minimum into position i at the start of the sub-array */
      int temp = array[i];
      array[i] = array[min_index];
      array[min_index] = temp;
    }
  }

Time complexity

The total number of comparisons is, \[ \text{Total number of comparisons} = (n-1) + (n-2) + (n-3) + \dots + 1 \] \[ \text{Total number of comparisons} = \frac{n(n-1)}{2} \] For this algorithm, the number of comparisons is the same in the best, average and worst case. Therefore the time complexity in all cases is, \[ \text{Time complexity} = \theta (n^2) \]

  • Recurrence time complexity : $T(n) = T(n-1) + n - 1$
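Unrolling this recurrence (taking $T(1) = 0$) recovers the comparison count above:

```latex
\[
T(n) = T(n-1) + (n-1)
     = T(n-2) + (n-2) + (n-1)
     = \dots
     = T(1) + 1 + 2 + \dots + (n-1)
     = \frac{n(n-1)}{2}
     = \theta(n^2)
\]
```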

Insertion sort

It is an in-place sorting algorithm.

  • In this algorithm, we first divide the array into two sections. Initially, the left section has a single element and the right section has all the other elements. Therefore, the left part is sorted and the right part is unsorted.
  • We call the leftmost element of the right section the key.
  • Now, we insert the key into its correct position in the left section.
  • As is commonly known, an insertion operation requires shifting elements. So we shift elements in the left section.
  void insertion_sort( int array[], int n ) {
    for( int i = 1; i < n; i++ ) {
      /* The key is the first element of the right section of the array */
      int key = array[i];
      int j = i - 1;

      /* Shift till we find the correct position of the key in the left section */
      while ( j >= 0 && array[j] > key ) {
        array[j + 1] = array[j];
        j -= 1;
      }
      /* Insert the key in its correct position */
      array[j + 1] = key;
    }
  }

Time complexity

Best Case : The best case is when the input array is already sorted. In this case, we do (n-1) comparisons and no shifts. The time complexity will be $\theta (n)$ \\ Worst Case : The worst case is when the input array is in descending order when we need to sort in ascending order, and vice versa (basically the reverse of sorted). The number of comparisons is \\ \[ 1 + 2 + 3 + \dots + (n-1) = \frac{n(n-1)}{2} \] \\ The number of element shift operations is \\ \[ 1 + 2 + 3 + \dots + (n-1) = \frac{n(n-1)}{2} \] \\ The total time complexity becomes $\theta \left( 2 \frac{n(n-1)}{2} \right)$, which simplifies to $\theta (n^2)$.

  • NOTE : Rather than using linear search to find the position of the key in the left (sorted) section, we can use binary search to reduce the number of comparisons.

Inversion in array

The inversion count of an array is a measure of how far the array is from being sorted. \\ For an ascending sort, it is the number of element pairs such that array[i] > array[j] and i < j, or in other words, array[i] < array[j] and i > j.

  • For an ascending sort, we can simply count, for each element, the number of elements to its right that are smaller.

  Array:      10  6  12  8  3  1
  Inversions:  4  2   3  2  1  0

Total number of inversions = (4 + 2 + 3 + 2 + 1 + 0) = 12

  • For a descending sort, we can simply count, for each element, the number of elements to its right that are larger.

  Array:      10  6  12  8  3  1
  Inversions:  1  2   0  0  0  0

Total number of inversions = (1 + 2) = 3

  • For an array of size n

\[ \text{Maximum possible number of inversions} = \frac{n(n-1)}{2} \] \[ \text{Minimum possible number of inversions} = 0 \]

Relation between time complexity of insertion sort and inversion

If the inversion count of an array is f(n), then the time complexity of insertion sort will be $\theta (n + f(n))$, since each inversion corresponds to exactly one element shift.

Quick sort

It is a divide and conquer technique. It uses a partition algorithm that chooses an element of the array (the pivot), then places all smaller elements to its left and all larger elements to its right. We then take these two parts of the array and recursively place all their elements in the correct position. For simplicity, the element chosen by the partition algorithm is either the leftmost or the rightmost element.

  void quick_sort(int array[], int low, int high){
    if(low < high){
      int x = partition(array, low, high);
      quick_sort(array, low, x-1);
      quick_sort(array, x+1, high);
    }
  }

As we can see, the main component of this algorithm is the partition algorithm.

Lomuto partition

The partition algorithm will work as follows:

  /* Will return the index where the array is partitioned */
  int partition(int array[], int low, int high){
    int pivot = array[high];
    /* i marks the end of the region of elements less than or equal to the pivot */
    int i = low - 1;

    for(int j = low; j < high; j++){
      if(array[j] <= pivot){
        i += 1;
        /* Swap array[i] and array[j] */
        int temp = array[i];
        array[i] = array[j];
        array[j] = temp;
      }
    }

    /* Place the pivot just after the last element that is <= it */
    int temp = array[i + 1];
    array[i + 1] = array[high];
    array[high] = temp;
    return (i + 1);
  }

Time complexity

For an array of size n, the number of comparisons done by this algorithm is always n - 1. Therefore, the time complexity of this partition algorithm is, \[ T(n) = \theta (n) \]

Time complexity of quicksort

  • Best Case : The partition algorithm always divides the array into two equal parts. In this case, the recurrence relation becomes \[ T(n) = 2T(n/2) + \theta (n) \] where $\theta (n)$ is the time complexity of creating the partition. \\ Using the Master theorem, \[ T(n) = \theta( n \log n ) \]
  • Worst Case : The partition algorithm always creates the partition at one of the extreme positions of the array. This leaves a single partition with n-1 elements, so quicksort has to be called on the remaining n-1 elements of the array. \[ T(n) = T(n-1) + \theta (n) \] Again, $\theta (n)$ is the time complexity of creating the partition. \\ Solving this recurrence, \[ T(n) = \theta (n^2) \]
  • Average Case : On average, the partitions are balanced enough that the recursion depth stays logarithmic, and the time complexity is the same as in the best case, \[ T(n) = \theta ( n \log n ) \]