For years we have been promised that a commercial quantum computer would be developed to solve complex problems at unprecedented speeds. It appears that “the future is now,” with the first generation of commercial quantum computers arriving.
But first we need to establish what a quantum computer is and how it works.
First, in quantum computing, a qubit (quantum bit, sometimes written qbit) is the unit of quantum information, the analogue of the classical bit.
A qubit is a two-state quantum-mechanical system; the two states can be, for example, vertical and horizontal polarization, and these states are used to encode information as 0s, 1s, or both simultaneously.
One of the problems with current quantum computers is that they must be kept near 0 kelvin and shielded from any electromagnetic or mechanical disturbance. For quantum effects to play a role in the computation, the quantum processor must operate in an extremely isolated environment: refrigeration and many layers of shielding create an internal environment with a temperature close to absolute zero, isolated from external magnetic fields, vibration, and external RF signals of any form.
The current commercial system implements a quantum annealing algorithm, which solves problems by searching for the global minimum of a function. This is fundamentally different from the familiar framework of classical computing built on logical operations, but it is relevant to many high-value problems, such as minimizing the error of a voice-recognition system, training learning algorithms, breaking symmetric cryptographic keys, controlling risk in a financial portfolio, or reducing energy loss in an electrical grid.
While there are different ways in which users can submit problems to the system, at the machine-instruction level the quantum processor solves a Quadratic Unconstrained Binary Optimization (QUBO) problem, in which binary variables are mapped to qubits and correlations between variables are mapped to couplings between qubits. The system of interacting qubits is evolved quantum mechanically via the annealing algorithm to find optimal or near-optimal solutions.
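To make the QUBO formulation concrete, here is a minimal sketch in Python. The problem matrix below is invented for illustration and this is not D-Wave's actual API; a classical brute-force search stands in for the annealer, which is designed to find the same minimum without enumerating every assignment:

```python
import itertools

# A QUBO is defined by coefficients Q[i][j]; the goal is the binary
# vector x minimizing sum(Q[i,j] * x[i] * x[j]).
# Diagonal entries are per-qubit biases, off-diagonal entries are
# couplings between qubits. This 3-variable instance is made up.
Q = {
    (0, 0): -1.0,  # bias on qubit 0
    (1, 1): -1.0,  # bias on qubit 1
    (2, 2): 2.0,   # bias on qubit 2
    (0, 1): 2.0,   # coupling between qubits 0 and 1
    (1, 2): -3.0,  # coupling between qubits 1 and 2
}

def energy(x):
    """Evaluate the QUBO objective for a binary assignment x."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Classical brute force over all 2^3 assignments.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # lowest-energy assignment and its energy
```

The lowest-energy assignment returned by the annealer would correspond to `best` here; on real hardware the brute-force loop is replaced by the physical annealing process.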
Solving problems with a quantum system can be thought of as trying to find the lowest point on a landscape of peaks and valleys. Every possible solution is mapped to coordinates on the landscape, and the altitude of the landscape is the “energy” or “cost” of the solution at that point. The aim is to find the lowest point or points on the map and read off the coordinates, as this gives the lowest-energy, or optimal, solution to the problem.
The special properties of quantum physics, such as quantum tunneling, allow the quantum computer to explore this landscape in ways that have never before been possible with classical systems. Quantum tunneling is like a layer of water that covers the entire landscape. As well as running over the surface, water can tunnel through the mountains as it looks for the lowest valley. The water is an analogy for the probability that a given solution will be returned. When the quantum computations occur, the “water” or probability is pooled around the lowest valleys. The more water in a valley, the higher the probability of that solution being returned. A classical computer, on the other hand, is like a single traveler exploring the surface of a landscape one point at a time.
Founded in 1999, D-Wave Systems is a quantum computing company that builds quantum processors based on superconducting circuits. Its latest processor, the D-Wave 2X, with 1000 qubits, can evaluate all 2^1000 possible solutions at the same time.
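To appreciate the size of that search space, a quick arithmetic check (plain Python, not D-Wave code) shows how astronomically large 2^1000 is:

```python
# The solution space of a 1000-qubit annealer: every binary
# assignment of 1000 variables is a candidate solution.
n_solutions = 2 ** 1000

# Python handles arbitrary-precision integers, so we can count
# the decimal digits of this number directly.
print(len(str(n_solutions)))  # a number with over 300 digits
```

For comparison, the number of atoms in the observable universe is usually estimated at around 10^80, far smaller than this solution space; no classical machine could enumerate it exhaustively.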
The physical footprint of the D-Wave 2X is approximately 10′ × 7′ × 10′ (L × W × H). It houses a cryogenic refrigeration system, shielding, and I/O systems that support a single thumbnail-sized quantum processor. Most of the physical volume of the current system is taken up by the refrigeration system. The adjoining cabinets contain the control subsystems and the front-end servers that provide connectivity to the system.
The D-Wave 2X system can be deployed as part of a High Performance Computing (HPC) data center using standard interfaces and protocols.
The I/O subsystem is responsible for passing information from the user to the processor and back. After the system receives a problem from the user via standard web protocols, the data is converted to analog signals and carried on normal conducting wires that transition to superconducting wires at low temperatures.
The only path for signals between the inside and outside of the shielded enclosure is a digital optical channel carrying programming information in, and results of computations out.
The processor resides in a high-vacuum environment in which the pressure is 10 billion times lower than atmospheric pressure and the temperature is 15 millikelvin, approximately 180 times colder than interstellar space. The magnetic shielding subsystem achieves fields of less than 1 nanotesla across the processor in each axis, which is approximately 50,000 times weaker than the Earth’s magnetic field.
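These isolation figures can be sanity-checked with back-of-the-envelope arithmetic. The reference values below are my assumptions, not from the source: 1 atm = 101325 Pa, interstellar temperature taken as the ~2.7 K cosmic microwave background, and Earth’s surface field taken as a typical ~50 microtesla:

```python
# Back-of-the-envelope checks of the quoted isolation figures.
# Reference values are assumed typical constants, not D-Wave specs.

atmosphere_pa = 101325.0                  # assumed: 1 standard atmosphere
vacuum_pa = atmosphere_pa / 1e10          # "10 billion times lower"
print(f"chamber pressure ~ {vacuum_pa:.1e} Pa")

interstellar_k = 2.7                      # assumed: CMB temperature
processor_k = 0.015                       # 15 millikelvin
print(f"~{interstellar_k / processor_k:.0f}x colder than interstellar space")

earth_field_t = 50e-6                     # assumed: ~50 uT surface field
shielded_field_t = 1e-9                   # < 1 nanotesla achieved
print(f"~{earth_field_t / shielded_field_t:.0f}x weaker than Earth's field")
```

Both ratios come out close to the article’s figures of 180 and 50,000, so the quoted numbers are internally consistent with these typical reference values.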
Some researchers argue that the D-Wave 2X is not a quantum computer but merely a quantum annealer, which is only one part of a computer. The annealer’s role is to specify interactions for its qubits so that they can settle into their lowest energy states.
After reading the papers about how a D-Wave 2X computer works and is programmed, it is easy to understand that it is not a general-purpose computer: you cannot expect to buy a quantum version of the universal processor in your PC and play games at quantum speeds. The D-Wave 2X is designed to solve one very specific problem, finding the global minimum of a very complicated function, and cannot be programmed to perform a range of tasks.
At this moment, quantum computer technology is at the same level binary computers were at during World War II, when they could solve only specific problems, such as ballistic trajectory calculations for projectiles, or when Turing’s Bombe was built to break the Enigma code.
In conclusion, in my opinion and with these technical facts in mind, it is possible to infer that we are still a few decades away from creating a universal quantum processor.
Julian Bolivar-Galeno is an Information and Communications Technologies (ICT) architect whose expertise is in telecommunications, security, and embedded systems. He works at BolivarTech, focused on decision making, leadership, management, and execution of projects oriented toward developing strong security algorithms, researching artificial intelligence (AI), and applying it to smart solutions in mobile and embedded technologies, always producing resilient and innovative applications.