Too Long; Didn't Read
Big O notation is the language we use to describe the time complexity of an algorithm; it is how we compare the efficiency of different approaches to solving a problem. Algorithms with constant time complexity take the same amount of time to run regardless of the size of the input n. Their run time does not grow with the input data, which makes them the fastest class of algorithms. The idea behind time complexity is to measure an algorithm's cost in a way that depends only on the algorithm itself: it describes how the number of operations grows with the input size, rather than timing a particular run on particular hardware.
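As a quick illustration, here is a minimal Python sketch of constant-time operations (the function names are illustrative, not from the original text):

```python
def get_first(items):
    """Return the first element of a list.

    O(1): indexing a Python list is a single operation,
    so the cost does not depend on len(items).
    """
    return items[0]


def is_even(n):
    """Check parity with one modulo operation, also O(1)."""
    return n % 2 == 0


print(get_first([7, 8, 9]))  # 7
print(is_even(10))           # True
```

Whether the list holds ten elements or ten million, `get_first` performs the same amount of work, which is exactly what constant time complexity means.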