## How Quantum Computers Will Evolve

by @yonatan-cohen

March 31st 2020

Quantum computers are advancing faster than ever, but what are the challenges we still need to overcome?

The new White House budget calls for an increase in annual spending on quantum computing (QC) to reach $860 million within two years. The announcement came just a few days after India declared that they'll be investing more than a billion dollars in QC and follows similar investments by Russia, the EU and many tech giants.

The massive investments and interest in QC are well deserved. Quantum computers hold great promise for unprecedented computational power well beyond the reach of current, non-quantum technology, and will be able to help solve currently unsolvable challenges in important fields including materials science, computational chemistry, optimization, AI, and cryptography. Moreover, as the technology continues to develop as rapidly as it has over the last couple of decades, the once faraway dream of harnessing this power seems closer now than ever. However, despite significant progress, building quantum computers powerful enough to solve real-world problems remains one of the most difficult tasks at today's technological frontier.

So what makes it so difficult to build a reliable, scalable quantum computer, and what are the key challenges we need to overcome to make it a reality? This is what we will try to answer in this article. To do that, we must first establish how quantum computers are different from classical computers, and then examine how these differences give rise both to the extra power of quantum computers and to the challenges ahead.

To understand the power of quantum computers, we must first understand regular, non-quantum computers, also commonly called classical computers. A classical computer stores information in bits. Each bit in the computer is like a coin sitting on a table - it is in one of two binary states: heads or tails (more commonly, the two states of a bit are called “0” and “1” rather than heads or tails). Since there are many bits in the computer, it can store A LOT of information. For example, with only 3 bits we can create an alphabet with 8 letters: ‘a’=000 (all bits are zero, or all coins are heads), ‘b’=001, ‘c’=010, ‘d’=011, etc., so that with 300 bits, we can write a text with 100 characters from this alphabet. In 1GB (gigabyte) of computer memory, there are about 8 billion bits, so you could write a library of thousands of books with such a huge number of bits.
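The 3-bit alphabet above is easy to play with in code. This is just an illustrative sketch (the letter assignments follow the article's ‘a’=000, ‘b’=001 convention; the function names are ours):

```python
# The 8-letter alphabet from the text: each letter maps to one of the
# 8 possible 3-bit strings ('a'=000, 'b'=001, 'c'=010, ...).
alphabet = {format(i, "03b"): chr(ord("a") + i) for i in range(8)}

def decode(bitstring):
    """Split a bit string into 3-bit chunks and look up each letter."""
    return "".join(alphabet[bitstring[i:i + 3]]
                   for i in range(0, len(bitstring), 3))

# 300 bits would encode a 100-character text; here, 9 bits encode 3 letters.
print(decode("000001010"))  # -> "abc"
```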

A computer is much more than just a big storage device for information. What is really unique about computers is their ability to process this information. This is done in the heart of the computer - the processor. If we continue with our coin analogy, you can think of the processor as a super-fast coin-flipping machine which flips bits according to particular programs that we feed it. For example, when you are reading an article in your browser, a program is running on your computer in which the processor receives bits of information sent from your mouse and flips the bits that encode the image on your screen accordingly. All programs are eventually made of a sequence of very small and basic coin-flipping operations, called gates. The computer’s processing power is measured in how many such gate operations it can perform in one second. A single-core 3GHz processor can perform billions of gates every second.
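The "coin-flipping machine" picture can be sketched in a few lines. This is a toy model, not any real instruction set; the gate names are illustrative:

```python
# A toy model of a processor as a coin-flipping machine: a "program" is
# just a sequence of tiny gate operations applied to bits.
def not_gate(bit):
    """Flip a single bit ('coin')."""
    return bit ^ 1

def xor_gate(a, b):
    """A basic two-bit logic gate."""
    return a ^ b

# A tiny "program": flip bit 0, then XOR it into bit 1.
bits = [0, 0]
bits[0] = not_gate(bits[0])
bits[1] = xor_gate(bits[0], bits[1])
print(bits)  # -> [1, 1]
```

A real processor does nothing fundamentally different, just billions of such operations per second in hardware.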

But there is one thing that can destroy the computer’s operation - errors. Imagine that wind is blowing on the table full of coins and flips one of the coins. If a bit flips during the run time of a computer program, it can cause the program to give the wrong results, get stuck, or even lead to the dreaded blue screen. Fortunately, this does not happen very often. John Martinis, head of Google’s QC development, gave a great analogy that explains why the errors in classical computers are rare (and why those in quantum computers are frequent, as we will see below). In this analogy, bits in a classical computer are similar to heavy coins – it is very difficult for random environmental changes (like the wind or like electrical noise in the power supply of the computer) to flip them over. The average bit in a modern computer would work for about a billion years before it accidentally flipped.

Quantum computers replace the bits and gates with quantum bits and quantum gates. While a regular bit can be either ‘0’ or ‘1’ at any given point in time, a quantum bit, also called a **qubit**, can be in BOTH ‘0’ and ‘1’ at the same time. In fact, it can be in any combination of ‘0’ and ‘1’. (This phenomenon is called superposition.) A good way to think about it is that instead of a coin sitting on a table, we are now dealing with a coin floating in outer space where there is no gravity to fix it to the table. It does not have to be heads or tails anymore, but instead can be exactly on its side, or tilted at a 45-degree angle, etc. And it gets even crazier. If we have three qubits and we, again, use our 8-letter alphabet (‘a’=000, ‘b’=001, etc.), then the three qubits can take the form of ALL 8 letters at the same time! And the more qubits we add, the more powerful this parallelism is - it grows exponentially with the number of qubits in the quantum computer.
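A standard way to make superposition concrete is to write a qubit's state as two amplitudes whose squared magnitudes must sum to 1. A minimal numeric sketch (the variable names are ours):

```python
import math

# A single qubit's state is a pair of complex amplitudes (alpha, beta)
# for |0> and |1>, with |alpha|^2 + |beta|^2 = 1.
# An equal superposition - the "coin on its side" in the analogy:
alpha = beta = 1 / math.sqrt(2)

prob_0 = abs(alpha) ** 2  # probability of measuring '0'
prob_1 = abs(beta) ** 2   # probability of measuring '1'

# Three qubits are described by 2**3 = 8 amplitudes - one per letter
# of the 8-letter alphabet - and the count doubles with every qubit.
n_amplitudes = 2 ** 3
```

This doubling of amplitudes per added qubit is exactly the exponential growth the paragraph describes.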

Now, just like with classical computers, the story does not end at storing data. Quantum computers use quantum gates to process the quantum information stored in qubits. A quantum program, or quantum algorithm, is a sequence of quantum gates designed to solve a certain problem. And due to the massive parallelism of the quantum information stored in qubits, quantum computers can, in some situations, do “exponential multi-tasking” and very quickly solve difficult problems that even the strongest supercomputers of today can never solve. For example, in 1994 Peter Shor developed a quantum algorithm that can find the prime factors of a very large number in a very short amount of time. Most internet security today relies on the assumption that our computers, no matter how fast, cannot solve this problem quickly enough to pose a security concern - it would take the biggest computer in the world billions of years to decrypt a simple message, let alone advanced security. Shor’s algorithm thus demonstrated just how disruptive quantum computers could be once they become operational: Shor essentially proved that a large-scale quantum computer could decrypt today’s security protocols in minutes or hours.
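Shor's algorithm itself is far beyond a few lines, but the classical problem it attacks is easy to show. A naive trial-division factorizer illustrates why factoring is slow classically - the search space grows rapidly with the size of the number, which is exactly what RSA-style security relies on (this sketch only shows the classical task, not the quantum speedup):

```python
# Naive classical factoring by trial division. Its running time grows
# quickly with the size of n; Shor's quantum algorithm solves the same
# problem exponentially faster.
def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(factor(15))    # -> (3, 5)
print(factor(3127))  # -> (53, 59)
```

For the thousand-bit numbers used in real cryptography, this kind of search is hopeless on any classical machine.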

Although able to perform functions that traditional computers cannot, QCs are not without their challenges. Going back to Martinis’ analogy, while classical bits are akin to heavy coins stabilized by gravity on the table, qubits are floating in outer space and can be affected by almost anything. Any small disturbance can rotate the qubit, and even a one-degree rotation can cause errors in the computation. And while there are many challenges in scaling up quantum computers, the high error rate, which stems from this sensitivity of qubits to their environment, is by far the greatest.
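How bad is a one-degree tilt? For a qubit that should be in state |0>, a rotation by angle θ gives probability sin²(θ/2) of being read as '1' - a standard single-qubit result, evaluated here for the one-degree disturbance mentioned above:

```python
import math

# Error probability from tilting a qubit that should be |0> by theta:
# p_error = sin^2(theta / 2). For a 1-degree rotation:
theta = math.radians(1.0)
error_prob = math.sin(theta / 2) ** 2
print(error_prob)  # ~7.6e-5
```

A chance of roughly 1 in 13,000 per disturbance sounds tiny, but over the millions of gate operations in a useful computation, such errors accumulate quickly.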

To move quantum computers forward to the point where they start living up to expectations, their propensity for errors must be overcome. Here are four important strategies that can be used to accomplish this.

Quantum error correction (QEC) is considered the ultimate way of dealing with errors, because it can be proven that once the basic error rate of the physical qubits (i.e. the qubits that make up the logical qubit) is low enough, using enough physical qubits in the logical qubit allows us to reduce the error rate of the logical qubit as much as we want. And in fact, the current error rate of physical qubits is already low enough to start doing QEC. However, for QEC to really be effective, one would need thousands of physical qubits for each logical qubit, which is impractical due to the large overhead as well as other limiting bottlenecks that exist today. Thus, it is essential that in the next several years, the other three error-fighting strategies continue to reduce the error rates of physical qubits, so that the overhead of QEC can be reduced to the point where error correction — and thus useful quantum computing — can become a reality.
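The core idea of QEC - redundancy suppressing the logical error rate - has a simple classical analogue: encode one logical bit into several physical bits and decode by majority vote. (Real QEC is subtler, since it must correct errors without directly measuring the quantum data; this sketch only illustrates the redundancy principle.)

```python
from collections import Counter

# Classical repetition code: one logical bit -> n physical bits,
# decoded by majority vote.
def encode(bit, n=3):
    return [bit] * n

def decode(bits):
    return Counter(bits).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1         # one physical bit flips...
print(decode(codeword))  # -> 1 (...but the logical bit survives)
```

With physical error rate p, the 3-bit code fails only when two or more bits flip, giving a logical error rate of roughly 3p², which is smaller than p whenever p < 1/3 - and adding more physical bits suppresses it further, just as the paragraph describes for logical qubits.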

*About the author*

The CTO and Co-Founder of Quantum Machines, Israeli native Dr. Yonatan Cohen has had two great passions in his life: quantum computing and entrepreneurship. Before he received his PhD studying quantum computing fragility at the Weizmann Institute of Science, Yonatan co-founded and led the Weizmann Institute of Science Entrepreneurship.