Second commit

main
lomna 1 year ago
parent 68c45a2b53
commit 35a77a52ed

@@ -20,7 +20,7 @@ A set of instructions in a loop. Iterative instructions can have different compl
\\
+ If the *inner loop iterator doesn't depend on the outer loop*, the complexity of the inner loop is multiplied by the number of times the outer loop runs to get the time complexity. For example, suppose we have a loop such as
#+BEGIN_SRC C
for(int i = 0; i < n; i++){
  ...
  for(int j = 1; j < n; j *= 2){
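As a quick check of this rule, here is a minimal runnable sketch (not part of the notes; =n = 1024= is an arbitrary choice) that counts how many times the inner body executes:
#+BEGIN_SRC C
#include <stdio.h>

int main(void){
    int n = 1024;              /* arbitrary illustrative size */
    long count = 0;
    for(int i = 0; i < n; i++){
        /* inner loop doubles j, so it runs about log2(n) times per outer pass */
        for(int j = 1; j < n; j *= 2){
            count++;
        }
    }
    printf("%ld\n", count);    /* prints 10240 = n * log2(n) */
    return 0;
}
#+END_SRC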
@@ -35,7 +35,7 @@ Thus the time complexity is *O(n.log(n))*.
+ If the *inner loop and outer loop are related*, then the complexities have to be computed using sums. For example, suppose we have the loop
#+BEGIN_SRC C
for(int i = 0; i <= n; i++){
  ...
  for(int j = 0; j <= i; j++){
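As a sketch of the sum being computed here (counting the inner loop as running on the order of $i$ times for each value of $i$):
\[ \sum_{i=1}^{n} i = \frac{n(n+1)}{2} = \frac{n^2}{2} + \frac{n}{2} \]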
@@ -70,7 +70,7 @@ total number of times inner loop runs = $\frac{n^2}{2} + \frac{n}{2}$
*Another example,*
\\
Suppose we have the loop
#+BEGIN_SRC C
for(int i = 1; i <= n; i++){
  ...
  for(int j = 1; j <= i; j *= 2){
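Here the inner loop doubles $j$ until it exceeds $i$, so it runs about $\log_2(i)$ times, and the total across the outer loop is
\[ \sum_{i=1}^{n} \log_2(i) = \log_2(n!) = \theta (n.log(n)) \]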
@@ -108,7 +108,7 @@ Time complexity = $O(n.log(n))$
** An example for time complexities of nested loops
Suppose a loop,
#+BEGIN_SRC C
for(int i = 1; i <= n; i *= 2){
  ...
  for(int j = 1; j <= i; j *= 2){
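A sketch of the count for this pair of doubling loops: the outer variable takes the values $1, 2, 4, \dots$, roughly $\log_2(n)$ of them, and when $i = 2^k$ the inner loop runs $k+1$ times, giving
\[ \sum_{k=0}^{\log_2(n)} (k+1) = \theta (\log^2(n)) \]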

@@ -27,15 +27,16 @@ Divide and conquer is a problem solving strategy. In divide and conquer algorith
Recursive approach
#+BEGIN_SRC C
// call this function with index = 0
int linear_search(int array[], int size, int item, int index){
    if( index >= size )
        return -1;
    else if (array[index] == item)
        return index;
    else
        return linear_search(array, size, item, index + 1);
}
#+END_SRC
*Recursive time complexity* : $T(n) = T(n-1) + 1$
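A minimal usage sketch for the function above (the array contents and the =main= wrapper are illustrative):
#+BEGIN_SRC C
#include <stdio.h>

int main(void){
    int array[] = {4, 8, 15, 16, 23, 42};
    int size = sizeof(array) / sizeof(array[0]);
    /* prints 3, the index of 16; a missing item would give -1 */
    printf("%d\n", linear_search(array, size, 16, 0));
    return 0;
}
#+END_SRC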
@@ -138,16 +139,21 @@ Recursive approach:
* Max and Min element from array
** Straightforward approach
#+BEGIN_SRC C
struct min_max {int min; int max;};

struct min_max min_max(int array[], int size){
    int max = array[0];
    int min = array[0];

    for(int i = 1; i < size; i++){
        if(array[i] > max)
            max = array[i];
        else if(array[i] < min)
            min = array[i];
    }
    return (struct min_max) {min, max};
}
#+END_SRC
+ *Best case* : array is sorted in ascending order. Number of comparisons is $n-1$. Time complexity is $O(n)$.
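A usage sketch (array values illustrative):
#+BEGIN_SRC C
#include <stdio.h>

int main(void){
    int array[] = {7, 2, 9, 4, 1};
    struct min_max result = min_max(array, 5);
    /* prints: min = 1, max = 9 */
    printf("min = %d, max = %d\n", result.min, result.max);
    return 0;
}
#+END_SRC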

@@ -15,6 +15,8 @@ It is an inplace sorting technique. In this algorithm, we will get the minimum e
}
#+END_SRC
** Time complexity
The total number of comparisons is,
\[ \text{Total number of comparisons} = (n-1) + (n-2) + (n-3) + ... + (1) \]
\[ \text{Total number of comparisons} = \frac{n(n-1)}{2} \]
@@ -48,7 +50,7 @@ It is an inplace sorting algorithm.
}
#+END_SRC
** Time complexity
*Best Case* : The best case is when the input array is already sorted. In this case, we do *(n-1)* comparisons and no swaps. The time complexity will be $\theta (n)$.
\\
@@ -90,4 +92,59 @@ Total number of inversions = 1 + 2 = 3
If the inversion of an array is f(n), then the time complexity of the insertion sort will be $\theta (n + f(n))$.
* Quick sort
It is a divide and conquer technique. It uses a partition algorithm which chooses an element from the array, then places all smaller elements to its left and all larger elements to its right. We then take the two parts of the array and recursively place all their elements in the correct positions. For simplicity, the element chosen by the partition algorithm is either the leftmost or the rightmost element.
#+BEGIN_SRC C
void quick_sort(int array[], int low, int high){
    if(low < high){
        int x = partition(array, low, high);
        quick_sort(array, low, x-1);
        quick_sort(array, x+1, high);
    }
}
#+END_SRC
As we can see, the main component of this algorithm is the partition algorithm.
** Lomuto partition
The partition algorithm will work as follows:
#+BEGIN_SRC C
/* Will return the index where the array is partitioned */
int partition(int array[], int low, int high){
    int pivot = array[high];
    /* elements at indices low..i are <= pivot; i+1 is the first element greater than it */
    int i = low - 1;
    for(int j = low; j < high; j++){
        if(array[j] <= pivot){
            i += 1;
            /* swap array[i] and array[j] */
            int temp = array[i];
            array[i] = array[j];
            array[j] = temp;
        }
    }
    /* place the pivot between the two parts */
    int temp = array[i+1];
    array[i+1] = array[high];
    array[high] = temp;
    return (i + 1);
}
#+END_SRC
+ Time complexity
For an array of size *n*, the number of comparisons done by this algorithm is always *n - 1*. Therefore, the time complexity of this partition algorithm is,
\[ T(n) = \theta (n) \]
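A minimal sketch showing the two functions used together (array values illustrative):
#+BEGIN_SRC C
#include <stdio.h>

int main(void){
    int array[] = {5, 2, 9, 1, 7, 3};
    int n = 6;
    quick_sort(array, 0, n - 1);
    /* prints: 1 2 3 5 7 9 */
    for(int i = 0; i < n; i++)
        printf("%d ", array[i]);
    printf("\n");
    return 0;
}
#+END_SRC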
** Time complexity of quicksort
+ *Best Case* : The partition algorithm always divides the array into two equal parts. In this case, the recursive relation becomes
\[ T(n) = 2T(n/2) + \theta (n) \]
where $\theta (n)$ is the time complexity of creating the partition.
\\
Using the master theorem,
\[ T(n) = \theta( n.log(n) ) \]
+ *Worst Case* : The partition algorithm always creates the partition at one of the extreme positions of the array. This leaves a single partition with *n-1* elements, so quicksort has to be called again on the remaining *n-1* elements of the array.
\[ T(n) = T(n-1) + \theta (n) \]
Again, $\theta (n)$ is the time complexity of creating the partition.
\\
The master theorem does not apply to this recurrence; expanding it instead gives the sum $n + (n-1) + ... + 1$, so
\[ T(n) = \theta (n^2) \]
+ *Average Case* : On average, the partition can land anywhere in the array, and the expected time complexity comes out to the same order as the best case,
\[ T(n) = \theta (n.log(n)) \]
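As a sketch of where this comes from (assuming every partition position is equally likely), the average-case recurrence is
\[ T(n) = (n-1) + \frac{1}{n} \sum_{x=1}^{n} \left( T(x-1) + T(n-x) \right) \]
which solves to $\theta (n.log(n))$.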
