Quantum computing promises ultra-fast computation and the ability to solve complex problems that even a supercomputer is no match for. What exactly is quantum computing? How does a quantum computer work? How close are we to experiencing one? Before we dive into the mysteries of quantum mechanics, it will be helpful to understand how the computers we're currently using work.
A bit is the smallest unit of information on a traditional computer. A modern CPU contains tens of billions of transistors, each acting as a switch that is either on or off: 1 or 0. The computer languages we use to manipulate these bits are all based on boolean logic. Not surprisingly, boolean logic is also binary. Our traditional programs have many complex operations, but they are all built from true or false states. An application does one thing if an expression is true, and another if it is false.
Every feature of your computer is the result of ones, zeros, true or false questions, and variables, along with "AND," "OR," and "NOT" operations. In the binary world, everything is black and white. It's all very concrete and predictable. One always equals one, and zero always equals zero. That makes programming relatively straightforward and consistent. With these basic building blocks, developers have implemented an astonishing variety of applications.
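The boolean building blocks described above can be sketched in a few lines of Python (an illustrative example; the variable names are invented for this sketch):

```python
# Classical bits: every value is ultimately built from 0s and 1s,
# combined with AND, OR, and NOT.
a, b = 1, 0

and_result = a & b   # AND: 1 only if both bits are 1
or_result = a | b    # OR: 1 if either bit is 1
not_result = 1 - a   # NOT: flips the bit

print(and_result, or_result, not_result)  # -> 0 1 0

# The same true/false logic drives program flow:
logged_in = True
if logged_in:
    message = "welcome back"
else:
    message = "please sign in"
```

Everything from spreadsheets to video games reduces, at the bottom, to enormous numbers of operations like these.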
In classical physics, everything follows a rigid set of mathematical laws that map neatly onto boolean logic. On its surface, reality appears to be a giant machine operating in a deterministic fashion, like Isaac Newton's ideal of a clockwork universe. This type of reasoning is useful for understanding and negotiating the world at a macroscopic level. In our day-to-day experience, everything has a cause and an effect, behaving in accordance with a strict set of laws. However, these laws begin to break down when we examine reality at the atomic scale.
One of the earliest discoveries leading to the development of quantum theory was that of wave-particle duality. Scientists found that light exhibits properties of both a particle and a wave. In fact, they subsequently discovered that all particles possess wave properties, and all waves possess particle properties. But they only display one of these properties at a time, depending on how they are measured. Although it's convenient to use traditional categorizations, it seems that all matter is both particle and wave at once.
Other researchers found that when wave-particles interact they can become entangled, and any action made to one of them will affect the other, regardless of the distance between them. Additionally, quantum particles are said to exist in all possible states at once, which is known as superposition. In the example of a light wave, it is thought to exist in both wave-form and particle-form at the same time. As soon as we make a measurement to determine its qualities, however, this superposition breaks down, and it takes on one form or the other.
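The collapse described above can be sketched in a toy simulation (my own illustration, not a real device; the amplitude pair and the `measure` helper are simplified assumptions):

```python
import random
from math import sqrt

# A toy single-qubit model: the state is a pair of amplitudes
# (alpha, beta) over |0> and |1>; superposition means both are nonzero.

def measure(state):
    """Born rule: return 0 with probability |alpha|^2, otherwise 1.
    Measuring collapses the superposition onto the observed outcome."""
    alpha, _beta = state
    if random.random() < abs(alpha) ** 2:
        return 0, (1.0, 0.0)   # collapsed to |0>
    return 1, (0.0, 1.0)       # collapsed to |1>

# An equal superposition: either outcome is equally likely...
plus = (1 / sqrt(2), 1 / sqrt(2))
outcome, collapsed = measure(plus)

# ...but that first measurement destroys the superposition: every
# later measurement of the collapsed state repeats the same answer.
for _ in range(10):
    repeat, collapsed = measure(collapsed)
    assert repeat == outcome
```

The point of the sketch: before measurement the state holds both possibilities; afterwards, only one remains.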
Scientists do not yet understand how or why quantum effects behave the way they do. In fact, the limitations of classical computers for simulating quantum systems are what gave birth to the idea of quantum computing. In 1982, Nobel Prize winner Richard Feynman proposed that if we want to be able to understand quantum effects, we need to build a computer utilizing those effects. That, in and of itself, is an incredible thought. Our machines are not powerful enough to model nature at a quantum level. To create a computer capable of modeling quantum behavior, we need to make use of those very properties we don't understand!
For years, the idea of a quantum computer remained mostly theoretical. Recent developments, however, brought life to the quest for creating one. The most pressing of these developments came in 1994, when Peter Shor designed an algorithm for factoring large numbers with a quantum computer. In theory, this algorithm could break widely used public-key cryptography, such as RSA, whose security rests on factoring being hard. Shor's algorithm gave rise to an entirely new field of study dedicated to quantum-resistant cryptography, and inspired a race across the globe to create a computer capable of running it.
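Shor's insight was that factoring reduces to finding the order (period) of a number modulo N, and only that order-finding step needs a quantum computer. Here is a sketch of the surrounding classical arithmetic, with the quantum step replaced by a brute-force loop (my own simplified illustration; a real implementation also picks random bases and retries):

```python
from math import gcd

def find_order(a, n):
    """Find the period r with a^r = 1 (mod n). This is the step a
    quantum computer speeds up; here we brute-force it classically."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Given a base a coprime to n, try to recover factors of n from
    the order of a mod n (the classical half of Shor's algorithm)."""
    r = find_order(a, n)
    if r % 2 != 0:
        return None              # need an even order; retry with a new base
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None              # trivial square root; retry with a new base
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical_part(15, 7))  # -> (3, 5)
```

For cryptographic key sizes, the brute-force loop is hopeless; the quantum period-finding subroutine is exactly what makes the whole scheme fast.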
Part of the problem of working with quantum bits (qubits) is that when we measure them, the unique quantum effects are lost. Without superposition (being in multiple states at once), one of the most valuable properties for calculation disappears. That makes obtaining results from a quantum computer just as tricky as making calculations with it. Another challenge is that boolean logic is useless with quantum properties, so we must create an entirely new set of programming languages based on quantum logic.
Our traditional bits hold a value of either one or zero. Superposition allows a qubit to carry one, zero, or both values at once! When combined with entanglement (two quantum particles becoming tied to the same state regardless of distance), it is possible to transmit two classical bits of information by sending a single qubit, provided the sender and receiver already share an entangled pair. This technique is known as superdense coding, and it is not merely a theoretical notion: it has been demonstrated in the lab, and the methods of achieving it are improving.
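Superdense coding can be illustrated with a tiny state-vector simulation (a hand-rolled sketch; the amplitude ordering and helper names are my own, and real protocols run on hardware, not lists of floats):

```python
from math import sqrt

# Two-qubit state as 4 amplitudes over |00>, |01>, |10>, |11>,
# where the first bit is Alice's qubit and the second is Bob's.
BELL_PHI_PLUS = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]

def apply_x_alice(state):
    """Pauli-X (bit flip) on Alice's qubit: swaps |0b> <-> |1b>."""
    return [state[2], state[3], state[0], state[1]]

def apply_z_alice(state):
    """Pauli-Z (phase flip) on Alice's qubit: negates |1b> amplitudes."""
    return [state[0], state[1], -state[2], -state[3]]

def encode(bits, state):
    """Alice encodes two classical bits using only local gates."""
    if bits[1] == 1:
        state = apply_x_alice(state)
    if bits[0] == 1:
        state = apply_z_alice(state)
    return state

# The four Bell states Bob measures against, keyed by the decoded bits.
BELL_BASIS = {
    (0, 0): [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)],   # Phi+
    (0, 1): [0.0, 1 / sqrt(2), 1 / sqrt(2), 0.0],   # Psi+
    (1, 0): [1 / sqrt(2), 0.0, 0.0, -1 / sqrt(2)],  # Phi-
    (1, 1): [0.0, 1 / sqrt(2), -1 / sqrt(2), 0.0],  # Psi-
}

def decode(state):
    """Bell-basis measurement: return the bits whose Bell state matches."""
    for bits, bell in BELL_BASIS.items():
        overlap = abs(sum(b * s for b, s in zip(bell, state)))
        if overlap > 0.99:   # |<bell|state>| = 1, up to a global phase
            return bits
    raise ValueError("state is not a Bell state")

# All four two-bit messages travel via a single transmitted qubit.
for message in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert decode(encode(message, BELL_PHI_PLUS)) == message
```

The key design point the sketch shows: Alice touches only her half of the entangled pair, yet her choice of gate steers the joint state into one of four distinguishable Bell states.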
There is no single way to work with qubits, and there are many different efforts towards making quantum computing a reality. Scientists commonly cool individual molecules to nearly absolute zero; otherwise, the molecule has too much thermal energy and any data it stored is lost.
The first commercial quantum computing device was introduced by D-Wave, when Lockheed Martin signed a contract with them in 2011. Unfortunately, D-Wave has attracted a great deal of controversy throughout its existence. That isn't surprising when you consider that they were the first private company to work on the problem. D-Wave's fourth-generation quantum computer was recently installed at the Quantum Artificial Intelligence Lab run by Google, NASA, and the Universities Space Research Association. They also plan to offer quantum computing on the public cloud sometime this year. Their systems include as many as 2,000 qubits but are designed for a precise and limited function. D-Wave's machines are typically considered inferior to the models from IBM and Google, which utilize far fewer qubits.
Fifty years ago, IBM began laying the foundations for developments in quantum computing with their advances in materials science. In 2016, they were the first to release quantum functionality to the cloud, making a five-qubit machine available to researchers around the globe. The next year, IBM released an API making it possible for developers to begin designing programs for their public systems without requiring in-depth knowledge of quantum logic. In February, they unveiled a 50-qubit chip that was said to make them a front-runner in the race for practical quantum computing. Not to be outdone, on March 6th, Google released a 72-qubit processor they call Bristlecone.
These developments have led to a great deal of excitement, as well as fear. Although there has been research into quantum-resistant encryption methods, none of the algorithms in popular use employ them. That fact alone is cause for concern, and gives an incredible, and scary, amount of power to the first creator of a practical quantum computer. The question isn't "should I be scared?" The real question is "how scared should I be?"
There are many challenges to developing a practical quantum computer. For one, the unique features of a qubit break down incredibly easily. The larger your collection of qubits becomes, the more possibilities for interference are introduced. Another issue is that qubits are incredibly prone to noise and error: the more qubits you bring together, the higher your error rate is likely to become. There are techniques for addressing these issues, but they are still rudimentary. To date, the longest a qubit has maintained its quantum state (its coherence time) is 90 seconds.
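The compounding effect of scale can be seen with a back-of-the-envelope model (assuming, hypothetically, that each qubit errs independently at a fixed rate p; real error processes are more complicated):

```python
def prob_any_error(p, n_qubits):
    """Probability that at least one of n independent qubits errs,
    given a per-qubit error probability p: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n_qubits

# Even a modest 1% per-qubit error rate compounds quickly:
for n in (1, 10, 50, 100):
    print(n, round(prob_any_error(0.01, n), 3))
```

Under this simple model, a 100-qubit register with 1% per-qubit errors fails well over half the time, which is why error correction is such an active area of research.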
Many people are optimistic about a quantum breakthrough happening soon. That's not surprising considering all of the positive media attention quantum research is getting. However, it seems just as likely, if not more, that we won't experience practical quantum computing any time soon.