When we talk about data structures and algorithms, it is very important to know concepts like Big O notation and time complexity. These are the metrics we use to choose the right data structure, and understanding things like how memory, linked lists, and arrays work gives us a better feel for the speed and performance of a data structure.

## Why Do We Need to Know Time Complexities?

A developer must choose the right data structure for the job. This choice is often ignored, and it ends up badly impacting the performance of the application. In this tutorial, I want to teach you some important data structure concepts, starting with Big O notation.

## What Does Big O Notation Mean?

Big O notation is a special notation that tells you how fast an algorithm is. It's called Big O notation because you put a "big O" in front of the number of operations (it sounds like a joke, but it's true!).

Big O notation is a way to measure the performance of an operation based on the input size: the number of operations performed in an algorithm. These operations could include iterating over a collection, copying an item or the entire collection, appending an item to a collection, inserting an item at the start or end of a collection, or deleting or updating an item in a collection.

## Common Big O Notations

Let's consider n to be the size of the input collection.

O(1): The time it takes to perform the operation is constant and does not depend on the input size. Example: getting the length of a Python list.

```python
>>> numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
>>> len(numbers)
15
```

O(log n): When the size of the collection increases, the time it takes to perform the operation increases logarithmically. Example: binary search.

O(n): The time it takes to perform the operation is directly related to the number of items in the collection. Example: the max and min operations on a Python list.

```python
>>> numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
>>> max(numbers)
15
```

O(n²): Quadratic. A slow sorting algorithm. Example: selection sort.

O(n!): A really slow algorithm. Example: the traveling salesperson problem.

For example, looking through a Python list for a particular item is O(n). To better understand this concept, let's take a look at the following function:

```python
>>> def function(n):
...     for i in range(n):
...         print(i)
...
```

In the preceding function, the print statement will be executed n (input size) times. The loop's speed depends on n, so its complexity, expressed using Big O notation, is O(n).

If the function has conditions, the correct notation to keep is the highest one, as follows:

```python
>>> def function(n, print_count=False):
...     if print_count:
...         print('count: ', n)
...     else:
...         for i in range(n):
...             print(i)
...
```

In this example, the function could be O(1) or O(n), depending on the value of the print_count argument. The worst case is O(n), so the whole function's complexity is O(n).

O(1) is fast. O(n²) is slow. O(n!) is very slow.

## Big O Establishes a Worst-Case Run Time

Suppose you're using simple search to look for a person in the phone book. You know that simple search takes O(n) time to run, which means in the worst case, you'll have to look through every single entry in your phone book. In this case, you're looking for Ayoub.
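Here is a rough sketch of what simple search could look like in Python; the `simple_search` function and `phone_book` list are made-up names used purely for illustration.

```python
def simple_search(phone_book, name):
    """Check every entry, one by one, until the name is found."""
    for entry in phone_book:  # in the worst case this loop visits every entry: O(n)
        if entry == name:
            return True
    return False

phone_book = ["Ayoub", "Adit", "Maggie", "Jesse"]
print(simple_search(phone_book, "Ayoub"))  # True -- found on the very first try
```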
This handsome guy is the first entry in your phone book, so you didn't have to look at every entry; you found him on the first try. Did this algorithm take O(n) time? Or did it take O(1) time because you found the person on the first try?

Simple search still takes O(n) time. In this case, you found what you were looking for instantly. That's the best-case scenario. But Big O notation is about the worst-case scenario. So you can say that, in the worst case, you'll have to look at every entry in the phone book once. That's O(n) time. It's a reassurance: you know that simple search will never be slower than O(n) time.

## How Does Memory Work?

Imagine you have a big chest of drawers at home, each drawer can hold one item, and you want to store two things. You ask your mother which drawers you can put your items in, and she tells you which ones to use. This is basically how your computer's memory works. Your computer looks like a giant set of drawers, and each drawer has an address. Whenever you want to store an item in memory, you ask the computer for some space, and it gives you an address where you can store your item.

If you want to store multiple items, there are two basic ways to do so: arrays and linked lists. I'll talk about arrays and linked lists next, as well as the pros and cons of each. There isn't one right way to store items for every use case, so it's important to know the differences.

## What Are Linked Lists and Arrays?

Sometimes you need to store a list of elements in memory. Suppose you're writing an app to manage your todos. You'll want to store the todos as a list in memory. Should you use an array or a linked list?

Let's store the todos in an array first, because it's easier to grasp. Using an array means all your tasks are stored contiguously (right next to each other) in memory. Now suppose you have added three tasks and want to add a fourth, but the next drawer is taken up by someone else's stuff! It's like going to a movie with your friends and finding a place to sit, and then another friend joins you and there's no seat for them: you all have to move to a new spot where you fit. In this case, you need to ask your computer for a different chunk of memory that can fit four tasks and then move all your tasks there. If another friend comes by, you're out of room again, and you all have to move a second time! What a pain. Similarly, adding new items to an array can be a big pain. If you're out of space and need to move to a new spot in memory every time, adding a new item will be really slow.

One easy fix is to "hold seats": even if you have only 3 items in your task list, you can ask the computer for 10 slots, just in case. Then you can add 10 items to your task list without having to move. This is a good workaround, but you should be aware of a couple of downsides:

- You may not need the extra slots that you asked for, and then that memory will be wasted. You aren't using it, but no one else can use it either.
- You may add more than 10 items to your task list and have to move anyway.

So it's a good workaround, but it's not a perfect solution. Linked lists solve this problem of adding items.

## Linked Lists

With linked lists, your items can be anywhere in memory. Each item stores the address of the next item in the list, so a bunch of random memory addresses are linked together. It's like a treasure hunt. You go to the first address, and it says, "The next item can be found at address 123." So you go to address 123, and it says, "The next item can be found at address 847," and so on.
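A minimal sketch of this idea in Python might look like the following; the `Node` class and the variable names are my own illustration, not a standard library type.

```python
class Node:
    """One item in a linked list: a value plus a reference to the next item."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

# Build three todo items and link each one to the next.
third = Node("walk the dog")
second = Node("buy milk", next_node=third)
first = Node("write blog post", next_node=second)

# The "treasure hunt": follow each next reference until there is none left.
current = first
while current is not None:
    print(current.value)
    current = current.next
```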
Adding an item to a linked list is easy: you stick it anywhere in memory and store its address with the previous item. With linked lists, you never have to move your items.

You also avoid another problem. Let's say you go to a popular movie with five of your friends. The six of you are trying to find a place to sit, but the theater is packed; there aren't six seats together. Well, sometimes this happens with arrays. Let's say you're trying to find 10,000 slots for an array. Your memory has 10,000 slots, but it doesn't have 10,000 slots together. You can't get space for your array! A linked list is like saying, "Let's split up and watch the movie." If there's space in memory, you have space for your linked list.

If linked lists are so much better at inserts, what are arrays good for?

## Arrays

Websites with top-10 lists use a scummy tactic to get more page views. Instead of showing you the list on one page, they put one item on each page and make you click Next to get to the next item in the list. For example, Top 10 Best TV Villains won't show you the entire list on one page. Instead, you start at #10 (Newman), and you have to click Next on each page to reach #1 (Gustavo Fring). This technique gives the websites 10 whole pages on which to show you ads, but it's boring to click Next 9 times to get to #1. It would be much better if the whole list was on one page and you could click each person's name for more info.

Linked lists have a similar problem. Suppose you want to read the last item in a linked list. You can't just read it, because you don't know what address it's at. Instead, you have to go to item #1 to get the address for item #2. Then you have to go to item #2 to get the address for item #3, and so on, until you get to the last item. Linked lists are great if you're going to read all the items one at a time: you can read one item, follow the address to the next item, and so on. But if you're going to keep jumping around, linked lists are terrible.

Arrays are different. You know the address for every item in your array. For example, suppose your array contains five items, and you know it starts at address 00. What is the address of item #5? Simple math tells you: it's 04. Arrays are great if you want to read random elements, because you can look up any element in your array instantly. With a linked list, the elements aren't next to each other, so you can't instantly calculate the position of the fifth element in memory: you have to go to the first element to get the address of the second element, then go to the second element to get the address of the third element, and so on until you get to the fifth element (see the short sketch at the end of this article).

## Summary

This article was about the Big O notation of the key operations of data structures. Big O notation is essentially a way to measure the time complexity of an operation. The article also illustrated linked lists and arrays and how they manage data in memory.
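To close, here is the sketch referenced in the Arrays section: a rough, illustrative comparison of array-style indexing (using a Python list, which is array-backed) with linked-list-style traversal. The `Node` class is the same made-up helper used earlier.

```python
# Array-style access: the position of any item is computed directly, so reads are O(1).
tasks = ["task 1", "task 2", "task 3", "task 4", "task 5"]
print(tasks[4])  # jumps straight to the fifth item

# Linked-list-style access: walk from the first node, one hop per item, so reads are O(n).
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

head = Node("task 1", Node("task 2", Node("task 3", Node("task 4", Node("task 5")))))
current = head
for _ in range(4):  # four hops to reach the fifth item
    current = current.next
print(current.value)
```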