Computational complexity is a tricky subject to wrap one’s head around. Understanding runtime allows one to see how well a particular algorithm or data structure can scale. Unfortunately, it is a bit abstract to try to think like a computer — for example, what does it mean to say that accessing a node in a linked list is O(n) while inserting a node is the preferable O(1)? To truly get these concepts, let’s use real life (err.. Super Nintendo) examples to figure out what these terms mean.

Here is a graph that depicts the scalability of different runtime complexities. While all runtimes begin at the same point in the bottom left-hand corner, adding more elements changes how efficient each one is. It might not make a difference whether you use an array to store some data in your mini side project, but that same project might not scale as well once you finally gain the attention of some wealthy investors.

### O(1)

O(1), pronounced “oh of one”, is simply the best. If it were personified, it would be Michael Jordan. Unless you had access to a time machine, there is no way to have a better runtime.

How does it work? Imagine the year is 1991 and you are getting your first experience playing a brand new game called Street Fighter II. Your friend tells you that you should try Guile, because his Sonic Boom is pretty awesome. Unfortunately, the only information you have is the name Guile. You have no idea what he looks like, and it’s another seven years before Google is founded. You could do a bit of deducing, sure, but you decide there is a quicker way. All you have to do is ask your friend, “Which one of these characters is Guile?”

Your friend responds by pointing to Guile after just one question. O(1) means just that — your problem is solved in constant time. It doesn’t matter whether there are 8 characters or 800; the answer always takes the same amount of time, no matter the size of the input.
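Your friend’s pointing finger works a lot like a dictionary lookup. A minimal sketch (the roster and grid positions below are stand-ins, not game data):

```python
# Hypothetical roster mapping each fighter's name to a (row, column)
# slot on the character-select screen.
roster = {
    "Ryu": (0, 0), "Honda": (0, 1), "Blanka": (0, 2), "Guile": (0, 3),
    "Ken": (1, 0), "Chun-Li": (1, 1), "Zangief": (1, 2), "Dhalsim": (1, 3),
}

# A dict lookup is O(1) on average: one question, one answer,
# whether the roster holds 8 characters or 800.
position = roster["Guile"]
print(position)  # (0, 3)
```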

### O(log n)

As effective as the previous method was, the real world is usually not as smooth. The next best thing may be O(log n).

O(log n) algorithms are best described as never needing to examine the entire input. As in a binary search, an O(log n) algorithm repeatedly splits the data in half, keeps the half that could contain the answer, and discards the rest. This continues until you reach your answer.

To use the Street Fighter example, imagine you ask your friend, “Is Guile in the top row?” This drops the original eight fighters to four. Another question drops four to two, and one more leaves a single fighter. By cutting the total number of choices in half each time, you reach Guile in at most log₂(8) = 3 questions.
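The halving questions above are exactly what a binary search does over sorted data. A sketch, using the same hypothetical roster:

```python
# Binary search halves the candidates each step: 8 -> 4 -> 2 -> 1,
# so at most log2(8) = 3 comparisons are ever needed.
def binary_search(sorted_names, target):
    lo, hi = 0, len(sorted_names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # look at the middle element
        if sorted_names[mid] == target:
            return mid                # found it
        elif sorted_names[mid] < target:
            lo = mid + 1              # discard the left half
        else:
            hi = mid - 1              # discard the right half
    return -1                         # not on the roster

names = sorted(["Ryu", "Honda", "Blanka", "Guile",
                "Ken", "Chun-Li", "Zangief", "Dhalsim"])
print(binary_search(names, "Guile"))  # 3 (his index in the sorted list)
```

Note the prerequisite: the data must already be sorted, which is why O(log n) lookups usually come from structures that keep themselves ordered.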

### O(n)

O(n) is perhaps the easiest of runtimes to comprehend. The bigger the input, the longer this takes, and the two grow at the same rate. In the O(1) section, we discovered the most effective way of figuring out which character is Guile by asking one question. To use O(n), we would ask our friend about each character individually. Starting with the first character, we would ask, “Is this Guile?” If he says yes, we are done (this is the best-case scenario). If not, we move on to the next character. With just eight characters to consider, it would take at most eight questions. However, if the input becomes larger — say we add eight MORE characters — the worst case doubles to sixteen questions.
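Asking about each character in turn is a linear search. A sketch that counts the questions (the `find_character` helper and roster are illustrative, not from the original text):

```python
# Linear search: ask about each character, one at a time, until a match.
def find_character(names, target):
    questions = 0
    for name in names:        # worst case: n questions for n characters
        questions += 1
        if name == target:
            return questions  # how many questions it took
    return -1                 # not on the roster

names = ["Ryu", "Honda", "Blanka", "Guile",
         "Ken", "Chun-Li", "Zangief", "Dhalsim"]
print(find_character(names, "Guile"))   # 4 — Guile is the 4th asked
print(find_character(names, "Dhalsim")) # 8 — worst case: the last one
```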

### O(n²)

The complexity of O(n²) is where things get a bit trickier. If you take a look at the Big O graph at the top of this article, you’ll notice that the line depicting this particular runtime curves upward, because the work grows quadratically: doubling the input quadruples the runtime.

Quadratic or polynomial runtimes usually come from nested for-loops. One loop nested inside another implies O(n²), a third level of nesting would mean O(n³), and so on (with occasional exceptions).

How does this work in our Street Fighter example? Imagine you are able to ask the actual characters themselves questions. You approach the first character and ask if his name is Guile. If he responds “yes, I am” (the best-case scenario), then you are done. However, if his response is “no,” this begins a nested for-loop. This character (let’s say Ryu) then becomes the question master and approaches a second character to do the asking for you. Similarly, he asks this second character (let’s say Ken), one at a time, whether or not he is Guile. The process continues until the original question’s response is “yes.” As the graph implies, larger inputs make things take considerably longer.
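The relay of questions above boils down to a loop inside a loop. A simpler stand-in with the same quadratic growth — every character asking every character — makes the count explicit:

```python
# A loop nested inside a loop: each of the n characters asks all n
# characters a question, producing n * n questions in total.
names = ["Ryu", "Honda", "Blanka", "Guile",
         "Ken", "Chun-Li", "Zangief", "Dhalsim"]

questions = 0
for asker in names:        # outer loop: n iterations
    for asked in names:    # inner loop: n iterations per asker
        questions += 1

print(questions)  # 64 — double the roster to 16 and this jumps to 256
```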

### O(2^n) and O(n!)

Two less common runtimes are the exponential O(2^n) and the seemingly endless factorial O(n!). If a program is running in either of these, it would be wise to consider a new approach.

O(2^n) is usually seen in recursive functions that call themselves twice, such as the Fibonacci example here:

```python
def fib(n):
    if n < 2:                       # base case the sketch above omits
        return n
    return fib(n - 1) + fib(n - 2)  # two recursive calls per call → O(2^n)
```
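What would that “new approach” look like? One common fix (an addition beyond the original text) is caching results so each value is computed only once, dropping the runtime from O(2^n) to O(n):

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # remember every result already computed
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(30))  # 832040 — each fib value is computed exactly once
```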

Big O is a lot more complex than a Street Fighter selection screen, but understanding the basics is a good start. Like with Super Nintendo, understanding the more complex aspects of Big O requires plenty of practice and perhaps a bit of reading the instructional booklet.

Thanks for reading. Please subscribe if you enjoyed this article.