# Is it possible to sort an array faster than O(n*log(n)) time?

This question probably arises in the mind of everyone who has come across the subject of Design and Analysis of Algorithms. Generally we assume that the best possible time to sort an n-element array is O(n*log(n)), and it turns out that this assumption can be proved.

It turns out that sorting faster is indeed not possible when we restrict ourselves to sorting algorithms that are based on comparing array elements.
The lower bound on the time complexity can be proved by considering sorting as a process where each comparison of two elements of the array gives more information about its contents. We can visualise the process as a chain of sub-processes in the form of a tree.

Let us assume that “x < y?” means that some elements x and y in the array are compared. If the expression is true, the process continues to the left, and otherwise to the right. The leaves of the tree are the possible results of the process: the possible orderings of the array elements, and there are n! of them.
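To make the tree concrete, here is a small hand-written comparison tree for n = 3 (the function name `sort3` is just for illustration). Each `if` is one “x < y?” node, each `return` is a leaf, and all 3! = 6 orderings end at distinct leaves:

```python
from itertools import permutations

# A comparison tree for n = 3: every comparison "x < y?" branches one
# way on True and the other on False, and each of the 3! = 6 possible
# orderings of the input must end at its own leaf.
def sort3(a, b, c):
    """Return the three values in sorted order using only comparisons."""
    if a < b:
        if b < c:
            return (a, b, c)      # leaf: a <= b <= c
        elif a < c:
            return (a, c, b)      # leaf: a <= c < b
        else:
            return (c, a, b)      # leaf: c <= a < b
    else:
        if a < c:
            return (b, a, c)      # leaf: b <= a < c
        elif b < c:
            return (b, c, a)      # leaf: b <= c <= a
        else:
            return (c, b, a)      # leaf: c <= b <= a

# Every permutation of (1, 2, 3) reaches a leaf that sorts it correctly.
for p in permutations((1, 2, 3)):
    assert sort3(*p) == (1, 2, 3)
```

This tree has depth 3, and indeed no comparison tree of depth 2 can sort 3 elements, since 2^2 = 4 < 3! = 6 leaves.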

1 * 2 * 3 * 4 * 5 * … * n leaves = n! ways.

Now, taking log2 of both sides:

log2(1) + log2(2) + log2(3) + … + log2(n) = log2(n!)
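The identity above is just “the log of a product is the sum of the logs”; a quick numeric sanity check (for n = 10, an arbitrary choice):

```python
import math

# log2(1) + log2(2) + ... + log2(n) equals log2(n!) because the
# logarithm of a product is the sum of the logarithms.
n = 10
sum_of_logs = sum(math.log2(k) for k in range(1, n + 1))
log_of_factorial = math.log2(math.factorial(n))
assert math.isclose(sum_of_logs, log_of_factorial)
print(sum_of_logs, log_of_factorial)  # both ≈ 21.79
```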

Now we keep only the last n/2 terms of this sum, and each of those terms is at least log2(n/2). Thus we can write the above expression as
log2(n!) ≥ (n/2) * log2(n/2)

so the height of the tree, and hence the minimum possible number of comparisons a sorting algorithm must make in the worst case, is at least of the order n*log(n).

Thus we can summarise the above calculation in a simplified manner:

The process tree must have at least n! leaves, so its height is at least log2(n!), which is of the order n*log(n). Algorithms such as merge sort match this bound: their recursion tree has height log(n) with about n work at each level, again giving n*log(n) steps. Hence no comparison-based sorting algorithm can do better than O(n*log(n)) in the worst case.
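Tabulating the minimum tree height ceil(log2(n!)) next to n*log2(n) shows that both grow at the same rate, which is the content of the lower bound (the values of n here are arbitrary):

```python
import math

# The worst-case number of comparisons is at least the tree height
# ceil(log2(n!)); it grows at the same rate as n * log2(n).
for n in (8, 64, 1024):
    height = math.ceil(math.log2(math.factorial(n)))
    print(n, height, round(n * math.log2(n)))
```

For n = 8, for example, the height is 16 while n*log2(n) = 24; the ratio approaches 1 as n grows.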

Reference: Competitive Programmer’s Handbook