A sorting algorithm takes a list or an array and reorders its elements into a certain order. There are dozens of different sorting algorithms, and if you have studied computer science, you are probably familiar with at least a couple of them. They are also a popular interview question, so it doesn’t hurt to refresh your memory before an important interview. Here’s a small primer on the most common sorting algorithms, with example Golang implementations.

Bubble sort

Bubble sort is the most basic of sorting algorithms and the one almost everybody is familiar with. It has O(n²) worst-case and average time complexity, which makes it inefficient on large lists. The in-place implementation is very simple: in a loop, iterate over the array from the first element to the nth one, where n = len(items). Compare adjacent values and, if they are in the wrong order, swap them. You can optimize the algorithm by decrementing n by one after each iteration, because each pass moves the largest remaining element into its final position.

Time complexity:
  Worst case: O(n²)
  Average case: O(n²)
  Best case: O(n)
Space complexity:
  Worst case: O(1)

Selection sort

Selection sort is another simple O(n²) average-case in-place sorting algorithm. The algorithm divides the list into two sublists: one for sorted items, which starts empty and is built up left to right from the beginning of the list, and a second sublist for the remaining unsorted items.

This can be achieved with two nested for loops. The outer loop iterates n times over the list, where n = len(items). The inner loop always starts at the current iterator value of the outer loop (so in each iteration it begins from a position further right in the list) and finds the minimum value of the unsorted sublist. Swap the first item of that sublist with the found minimum.

Time complexity:
  Worst case: O(n²)
  Average case: O(n²)
  Best case: O(n²)
Space complexity:
  Worst case: O(1)

Insertion sort

Insertion sort is a simple in-place quadratic O(n²) sorting algorithm.
Again, it is less efficient on large lists, but it has a few advantages:

  Adaptive: time complexity decreases for lists that are already substantially sorted (O(nk) if each element is no more than k places away from its final sorted position)
  Stable: the relative order of elements with equal values is not changed
  In-place: only requires a constant O(1) amount of additional memory
  In practice, more efficient than bubble or selection sort

Time complexity:
  Worst case: O(n²)
  Average case: O(n²)
  Best case: O(n)
Space complexity:
  Worst case: O(1)

The implementation is quite natural, as it works similarly to how you would sort the cards in your hand if you were playing a card game.

Shellsort

Shellsort is a generalization of insertion sort. It is an interesting sorting algorithm which works by arranging the list into a set of interleaved sorted sublists.

First, choose a gap sequence. There are many different formulas for generating a gap sequence, and the average time complexity of the algorithm depends on this choice. For example, let’s choose (2^k)−1, which gives us [1, 3, 7, 15, 31, 63, …]. Reverse the sequence: […, 63, 31, 15, 7, 3, 1]. Now iterate over the reversed list of gaps and use insertion sort on each sublist: in the first iteration take every 63rd element and apply insertion sort; in the second iteration take every 31st element and apply insertion sort; and so on, all the way down to 1. The last iteration runs insertion sort on the entire list.

Time complexity:
  Worst case: O(n log²(n))
  Average case: depends on the gap sequence
  Best case: O(n log(n))
Space complexity:
  Worst case: O(1)

Comb sort

Comb sort is an improvement on the bubble sort algorithm. While bubble sort always compares adjacent elements (gap = 1), comb sort starts with gap = n/1.3, where n = len(items), and shrinks the gap by a factor of 1.3 on every iteration. The idea behind this improvement is to eliminate so-called turtles (small values near the end of the list).
The last iteration is identical to a simple bubble sort, as gap = 1.

Time complexity:
  Worst case: O(n²)
  Average case: O(n²/2^p) (p is the number of increments)
  Best case: O(n log(n))
Space complexity:
  Worst case: O(1)

Merge sort

Merge sort is a very efficient general-purpose sorting algorithm. It is a divide and conquer algorithm, which means the list is recursively broken into smaller lists which are sorted and then recursively combined to form the full sorted list. Conceptually, a merge sort works as follows:

1. Divide the unsorted list into n sublists, each containing 1 element (a list of 1 element is considered sorted).
2. Repeatedly merge sublists to produce new sorted sublists until there is only 1 sublist remaining. This will be the sorted list.

— Wikipedia

Time complexity:
  Worst case: O(n log(n))
  Average case: O(n log(n))
  Best case: O(n log(n))
Space complexity:
  Worst case: O(n)

This is it for now. However, I will be covering more sorting algorithms in the future, so bookmark this page :)