Sorting Algorithms Primer

Written by richardknop | Published 2017/06/27

A sorting algorithm is an algorithm that takes a list or an array and reorders its elements into a certain order. There are dozens of different sorting algorithms, and if you have studied computer science, you are probably familiar with at least a couple of them. They are also a popular interview question, so it doesn’t hurt to refresh your memory before an important interview.

Here’s a small primer on the most common sorting algorithms, each with an example Golang implementation.

Bubble Sort

Bubble sort is the most basic of the in-place sorting algorithms and the one almost everybody is familiar with. It has O(n²) worst-case and average time complexity, which makes it inefficient on large lists. The implementation is very simple.

In a loop, iterate over the array from the first element to the nth one, where n = len(items). Compare adjacent values and, if they are in the wrong order, swap them. You can optimize the algorithm by decrementing n by one after each pass, since the largest remaining value settles at the end each time.
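A minimal sketch of this in Go (the items slice of ints and the function name are illustrative choices, not code from the original article):

```go
// bubbleSort sorts items in place by repeatedly swapping adjacent
// out-of-order values. After every pass the largest remaining value
// has bubbled to the end, so the scan range n shrinks by one.
func bubbleSort(items []int) {
	n := len(items)
	for n > 1 {
		swapped := false
		for i := 1; i < n; i++ {
			if items[i-1] > items[i] {
				items[i-1], items[i] = items[i], items[i-1]
				swapped = true
			}
		}
		// No swaps means the slice is already sorted; this early
		// exit is what gives the O(n) best case.
		if !swapped {
			return
		}
		n--
	}
}
```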

Time complexity:

  • Worst case: O(n²)
  • Average case: O(n²)
  • Best case: O(n)

Space complexity:

  • Worst case: O(1)

Selection Sort

Selection sort is another simple in-place sorting algorithm with average-case O(n²) time complexity. The algorithm divides the list into two sublists: one for sorted items, which starts empty and is built up left to right from the beginning of the list, and a second sublist for the remaining unsorted items.

This can be achieved with two nested for loops. The outer loop iterates n times over the list, where n = len(items). The inner loop always starts at the current index of the outer loop (so in each iteration it begins one position further right in the list) and finds the minimum value of the unsorted sublist. Swap the first item of the unsorted sublist with the found minimum, as in the sketch below.
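One way this might look in Go (a sketch; sorting an []int slice is an assumption):

```go
// selectionSort grows a sorted prefix from the left: each pass of the
// outer loop finds the minimum of the unsorted remainder and swaps it
// into position i.
func selectionSort(items []int) {
	for i := 0; i < len(items); i++ {
		min := i
		for j := i + 1; j < len(items); j++ {
			if items[j] < items[min] {
				min = j
			}
		}
		items[i], items[min] = items[min], items[i]
	}
}
```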

Time complexity:

  • Worst case: O(n²)
  • Average case: O(n²)
  • Best case: O(n²)

Space complexity:

  • Worst case: O(1)

Insertion Sort

Insertion sort is a simple in-place quadratic O(n²) sorting algorithm. Again, it is less efficient on large lists, but it has a few advantages:

  • Adaptive: time complexity is decreased with lists which are already substantially sorted — O(nk) if each element is no more than k places away from its final sorted position
  • Stable: the relative order of elements with equal values is not changed
  • In-place: only requires a constant O(1) additional memory space
  • In practice more efficient than bubble or selection sort

Time complexity:

  • Worst case: O(n²)
  • Average case: O(n²)
  • Best case: O(n)

Space complexity:

  • Worst case: O(1)

The implementation is quite natural, as it works similarly to how you would sort cards in your hand when playing a card game.
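A possible Go version of that card-sorting idea (again a sketch over an []int slice):

```go
// insertionSort picks up the next value and shifts larger values of
// the sorted prefix one place to the right until the value fits,
// much like inserting a card into a sorted hand.
func insertionSort(items []int) {
	for i := 1; i < len(items); i++ {
		key := items[i]
		j := i - 1
		for j >= 0 && items[j] > key {
			items[j+1] = items[j]
			j--
		}
		items[j+1] = key
	}
}
```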

Shellsort

Shellsort is a generalization of insertion sort. It is an interesting sorting algorithm that works by arranging the list into a set of interleaved sorted sublists.

First, choose a sequence of gaps. There are many different formulas for generating a gap sequence, and the average time complexity of the algorithm depends on this choice. For example, let’s choose 2^k − 1 for k = 1, 2, 3, …, which gives us [1, 3, 7, 15, 31, 63, …]. Reverse the sequence: […, 63, 31, 15, 7, 3, 1].

Now iterate over the reversed list of gaps and run an insertion sort on each gapped sublist. So in the first iteration, sort the sublists formed by elements 63 positions apart. In the second iteration, do the same with a gap of 31, and so on all the way down to 1. The last iteration runs insertion sort on the entire list.
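A sketch in Go using the 2^k − 1 gap sequence described above (choosing a different gap sequence would change only how the gaps slice is built):

```go
// shellSort runs a gapped insertion sort for each gap in a descending
// 2^k-1 sequence (…, 63, 31, 15, 7, 3, 1). The final gap of 1 is a
// plain insertion sort over an almost sorted list.
func shellSort(items []int) {
	// Build the gap sequence 1, 3, 7, 15, … while it stays below len(items).
	var gaps []int
	for k := 1; (1<<uint(k))-1 < len(items); k++ {
		gaps = append(gaps, (1<<uint(k))-1)
	}
	// Iterate the gaps from largest to smallest.
	for g := len(gaps) - 1; g >= 0; g-- {
		gap := gaps[g]
		for i := gap; i < len(items); i++ {
			key := items[i]
			j := i
			for j >= gap && items[j-gap] > key {
				items[j] = items[j-gap]
				j -= gap
			}
			items[j] = key
		}
	}
}
```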

Time complexity:

  • Worst case: O(n(log(n))²)
  • Average case: depends on gap sequence
  • Best case: O(n(log(n)))

Space complexity:

  • Worst case: O(1)

Comb Sort

Comb sort is an improvement on the bubble sort algorithm. While bubble sort always compares adjacent elements (gap = 1), comb sort starts with gap = n/1.3, where n = len(items), and shrinks the gap by a factor of 1.3 on every iteration.

The idea behind this improvement is to eliminate so-called turtles (small values near the end of the list). The last iteration is identical to a simple bubble sort, as gap = 1.
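A sketch in Go (the 1.3 shrink factor comes straight from the description above; the rest is an illustrative implementation):

```go
// combSort is bubble sort with a shrinking gap: it starts near
// len(items)/1.3 and divides the gap by 1.3 each pass, which moves
// small values ("turtles") away from the end of the list early.
// Once the gap reaches 1 it behaves exactly like bubble sort.
func combSort(items []int) {
	gap := len(items)
	swapped := true
	for gap > 1 || swapped {
		gap = int(float64(gap) / 1.3)
		if gap < 1 {
			gap = 1
		}
		swapped = false
		for i := 0; i+gap < len(items); i++ {
			if items[i] > items[i+gap] {
				items[i], items[i+gap] = items[i+gap], items[i]
				swapped = true
			}
		}
	}
}
```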

Time complexity:

  • Worst case: O(n²)
  • Average case: O(n²/2^p) (p is the number of increments)
  • Best case: O(n(log(n)))

Space complexity:

  • Worst case: O(1)

Merge Sort

Merge sort is a very efficient general-purpose sorting algorithm. It is a divide-and-conquer algorithm, which means the list is recursively broken into smaller lists that are sorted and then recursively combined to form the full sorted list.

Conceptually, a merge sort works as follows:

1. Divide the unsorted list into n sublists, each containing 1 element (a list of 1 element is considered sorted).

2. Repeatedly merge sublists to produce new sorted sublists until there is only 1 sublist remaining. This will be the sorted list.

Wikipedia
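Those two steps translate fairly directly into Go. This sketch returns a new slice rather than sorting in place, which is where the O(n) extra space noted below comes from:

```go
// mergeSort recursively splits the slice in half, sorts both halves
// and merges the two sorted halves back together.
func mergeSort(items []int) []int {
	if len(items) <= 1 {
		return items
	}
	mid := len(items) / 2
	left := mergeSort(items[:mid])
	right := mergeSort(items[mid:])
	return merge(left, right)
}

// merge combines two already sorted slices into one sorted slice.
func merge(left, right []int) []int {
	result := make([]int, 0, len(left)+len(right))
	i, j := 0, 0
	for i < len(left) && j < len(right) {
		if left[i] <= right[j] {
			result = append(result, left[i])
			i++
		} else {
			result = append(result, right[j])
			j++
		}
	}
	// Append whatever remains of either half (at most one is non-empty).
	result = append(result, left[i:]...)
	result = append(result, right[j:]...)
	return result
}
```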

Time complexity:

  • Worst case: O(n(log(n)))
  • Average case: O(n(log(n)))
  • Best case: O(n(log(n)))

Space complexity:

  • Worst case: O(n)

That’s it for now. However, I will be covering more sorting algorithms in the future, so bookmark this page :)


