Sorting algorithms are among the first things you learn if you start taking algorithm classes or courses. I learned them some time ago, but like most knowledge that isn’t used regularly, it’s doomed to be forgotten. That’s why I decided to revisit them and refresh my knowledge.

I also want to start playing with `Rust` a bit, so this is a great way to get into a new language. Therefore, I will write my examples in Rust and either Python or TypeScript.

Also, I know that the term `array` may be technically incorrect, but I will still use it here.

## What are sorting algorithms?

At some point during development, it’s necessary to sort an unsorted array from lowest to highest. You could of course use built-in functions, but where’s the fun in that? If you want to learn these algorithms yourself, the best way to practice is to sort arrays of numbers.

Speaking of built-in functions: it’s not really specified which algorithm the JavaScript version of `sort` uses, because it’s up to the implementation in the browser or the underlying engine (such as V8) what algorithm to use. In practice, it’s most likely a variation of quicksort or mergesort. Some engines switch between algorithms based on the type and size of the array being sorted, as well as other factors, to balance performance and efficiency.

What’s great about learning sorting algorithms is that you gain knowledge in recursion and asymptotic notation.

### Recursion

One of the best easter eggs on Google can be found if you search for recursion. I won’t give it away; try it out yourself. To sum it up: you will actually learn how recursion works!

A recursive function is a function that calls itself (with a different parameter each time, otherwise you’ll get a stack overflow).
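
To illustrate, here’s a minimal Rust sketch of a recursive function, using the classic factorial example (the function name is my own):

```rust
// A recursive function calls itself with a smaller parameter
// until it reaches a base case that stops the recursion.
fn factorial(n: u64) -> u64 {
    if n == 0 {
        1 // base case: without it, the calls would never stop
    } else {
        n * factorial(n - 1) // recursive call with a smaller input
    }
}
```

`factorial(5)` expands to `5 * 4 * 3 * 2 * 1 * 1 = 120`; calling the function without shrinking the input is exactly how you end up with the stack overflow mentioned above.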

Recursive functions are often used in sorting algorithms, especially if they divide the given array in some way.

### Asymptotic notation

If you want to compare the efficiency of one algorithm to another, it’s not possible to just compare the time it takes them to sort an array. There are just too many factors that could skew the outcome:

- how many elements are in the array?
- how fast is the CPU?
- how many other programs are currently running?

And much more… Therefore, the standard way of comparing algorithms is asymptotic notation, although you will most likely have heard of it as `Big O notation` instead.

Here’s a quick rundown of the different notations:

| Notation | Description | Giveaway feature | Example |
|---|---|---|---|
| O(1) | constant | Independent of input size | Accessing an array element at a given index |
| O(log n) | logarithmic | Halves the search range in every iteration | Binary search |
| O(n) | linear | Loops over the complete array at least once | Linear search |
| O(n * log n) | loglinear | Loops over the complete array once, also halves it in each iteration | Mergesort |
| O(n^2) | quadratic | Loops over the complete array, loops over the array again in each iteration | Bubblesort |
| O(2^n) | exponential | Repeatedly solves subtasks with multiple recursive calls each | Towers of Hanoi |
| O(n!) | factorial | Branches into one recursive call per remaining element | Generating all permutations of an array |
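
To make the `O(log n)` row a bit more concrete, here’s a hedged Rust sketch of binary search (assuming a sorted slice; the function name is my own): every iteration halves the remaining search range.

```rust
// Binary search on a sorted slice: each iteration halves the
// search range, which is the giveaway feature of O(log n).
fn binary_search(arr: &[i32], target: i32) -> Option<usize> {
    let (mut low, mut high) = (0, arr.len());
    while low < high {
        let mid = low + (high - low) / 2;
        if arr[mid] == target {
            return Some(mid); // found the target
        } else if arr[mid] < target {
            low = mid + 1; // discard the lower half
        } else {
            high = mid; // discard the upper half
        }
    }
    None // target is not in the slice
}
```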

With this knowledge in mind, let’s get into it!

## Bubblesort

Bubblesort is a simple and straightforward (if not the easiest) sorting algorithm that repeatedly steps through a list of elements, compares adjacent items, and swaps them if they are in the wrong order (you could also say that the higher value bubbles up). This process continues until no more swaps are needed, indicating that the list is sorted.

The swap happens in place, meaning that no new array has to be created. The runtime of this algorithm is `O(n^2)`, because we loop over the array for every element of the array, even in the optimized version we use here.
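
As a sketch of the idea in Rust (with the early-exit optimization; the function name is my own):

```rust
// Bubblesort: repeatedly compare adjacent elements and swap them
// if they are in the wrong order; the larger value "bubbles up".
fn bubble_sort(arr: &mut [i32]) {
    let n = arr.len();
    for i in 0..n {
        let mut swapped = false;
        // after pass i, the last i elements are already in place
        for j in 0..n - i - 1 {
            if arr[j] > arr[j + 1] {
                arr.swap(j, j + 1); // the swap happens in place
                swapped = true;
            }
        }
        if !swapped {
            break; // a full pass without swaps: already sorted
        }
    }
}
```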

## Insertion sort

Let’s look at `Insertion sort` next. It is a simple comparison-based sorting algorithm that works by dividing the input list into a sorted and an unsorted region. It repeatedly takes an element from the unsorted region and moves it into its correct position within the sorted region.

In the examples I’m providing, it checks if the element is smaller than the elements to its left in the sorted region. If it is, the algorithm shifts those larger elements one position to the right to make space for the current element, and then inserts the current element in its correct sorted position.
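
A Rust sketch of that shifting logic might look like this (the function name is my own):

```rust
// Insertion sort: the region left of `i` is always sorted; each
// new element is shifted left until it sits in the right spot.
fn insertion_sort(arr: &mut [i32]) {
    for i in 1..arr.len() {
        let current = arr[i];
        let mut j = i;
        // shift larger sorted elements one position to the right
        while j > 0 && arr[j - 1] > current {
            arr[j] = arr[j - 1];
            j -= 1;
        }
        arr[j] = current; // insert into its correct sorted position
    }
}
```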

Similar to `bubblesort`, this algorithm has a runtime of `O(n^2)`.

## Mergesort

The smaller an array is, the faster we can sort it. What is the shortest possible array? One with a single element, which is automatically sorted. But if the given array is longer than that, how do we get there?

Enter `Mergesort`! This algorithm is the prime example of a `divide and conquer` algorithm (alongside binary search). It splits the array into two halves, checks whether the length is smaller than or equal to one (in which case it returns the array: base case!), and repeats the splitting until each subarray has only one element in it.

In the next step, we merge the short arrays back together by comparing the first elements of the left and right arrays.
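
As a sketch in Rust (the separate-array variant; the function names are my own), the split and the merge could look like this:

```rust
// Mergesort: split until slices of length <= 1 (base case),
// then merge the sorted halves back together.
fn merge_sort(arr: &[i32]) -> Vec<i32> {
    if arr.len() <= 1 {
        return arr.to_vec(); // base case: one element is sorted
    }
    let mid = arr.len() / 2;
    let left = merge_sort(&arr[..mid]);
    let right = merge_sort(&arr[mid..]);
    merge(&left, &right)
}

// Merge by repeatedly comparing the first remaining element
// of each half and taking the smaller one.
fn merge(left: &[i32], right: &[i32]) -> Vec<i32> {
    let mut result = Vec::with_capacity(left.len() + right.len());
    let (mut i, mut j) = (0, 0);
    while i < left.len() && j < right.len() {
        if left[i] <= right[j] {
            result.push(left[i]);
            i += 1;
        } else {
            result.push(right[j]);
            j += 1;
        }
    }
    // one half is exhausted; append the rest of the other
    result.extend_from_slice(&left[i..]);
    result.extend_from_slice(&right[j..]);
    result
}
```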

Mergesort can also sort the array in place; in the TS version, I created a separate array.

Compared to `bubblesort` and `insertion sort`, using `mergesort` is a huge speed increase when looking at the runtime: `O(n * log n)`. Mergesort is also special in this regard: the runtime is always `O(n * log n)`, even if the array is already sorted.

## Quicksort

Quicksort is another efficient sorting algorithm based on the divide-and-conquer strategy. It selects a ‘pivot’ element from the list and partitions the other elements into two sublists – those less than the pivot and those greater than the pivot. The algorithm then recursively sorts the sublists.

Quicksort is known for its average-case time complexity of O(n log n) and is widely used in practice due to its speed. However, its worst-case time complexity can be O(n^2) if the pivot is chosen poorly, for example always picking the first element of an already sorted array.
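
Here’s a hedged Rust sketch using a Lomuto partition with the last element as pivot (one of several common pivot strategies; the function names are my own):

```rust
// Quicksort: partition around a pivot, then recurse into the
// sublist of smaller elements and the sublist of larger ones.
fn quick_sort(arr: &mut [i32]) {
    if arr.len() <= 1 {
        return; // base case: nothing left to sort
    }
    let pivot_index = partition(arr);
    let (left, right) = arr.split_at_mut(pivot_index);
    quick_sort(left);
    quick_sort(&mut right[1..]); // skip the pivot, it's already in place
}

// Lomuto partition: use the last element as pivot and move every
// smaller element in front of it; returns the pivot's final index.
fn partition(arr: &mut [i32]) -> usize {
    let pivot = arr[arr.len() - 1];
    let mut store = 0;
    for i in 0..arr.len() - 1 {
        if arr[i] < pivot {
            arr.swap(i, store);
            store += 1;
        }
    }
    let last = arr.len() - 1;
    arr.swap(store, last); // place the pivot between the two sublists
    store
}
```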

## What did I learn?

Writing Rust code for the first time was more complicated than I had anticipated, but that’s no surprise coming from high-level languages that don’t deal with pointers, ownership and similar concepts. I’m excited to keep learning it!