Quantum Computing for HEP? (1/2)

Quantum Computing (QC) is by now an “old” research thread, initiated back in the 1980s. It long remained a mere academic endeavor, with only a faint hope of becoming of any practical use, at least in the short term. But today, with Google, Microsoft, IBM and others entering the game, the field has attracted much publicity, and recent technical progress, like the 1000-qubit D-Wave 2X or the 5-qubit IBM chip, has raised new expectations. Europe is also joining the trend, planning a giant billion-euro quantum technologies project (see here and the quantum manifesto).

Will High-Energy Physics (HEP) research benefit from this budding but highly disruptive technology? After all, it was Richard Feynman, famous for inventing a graphical method to compute particle interactions (today at the heart of HEP simulation), who in 1981 proposed a basic QC model to evaluate quantum processes to a precision unattainable on classical computers.

Could QC deliver the very high precision needed both to match the coming collider measurements and to probe possible new physics, as in Feynman’s dream? Could the more mundane QC algorithms, like large-number factorization or ultra-fast database search, find applications in the simulation or data analysis of HEP frontier experiments?

The state of the art of QC in 2006 was covered at the ACAT workshop in a talk, “Quantum Computing in Physics Research”, by B. Georgeot (CNRS) (slides). But a lot has happened since then, in both the hardware and algorithmic sectors, and the impact on physics research, and more particularly on HEP, needs to be revisited.

The basic principle of QC is simple. Instead of using series of bits, each valued 0 or 1, to form binary-coded data as in a conventional CPU, quantum computing uses qubits held in a two-state (0,1) quantum superposition. Quantum superposition, along with quantum entanglement and quantum tunneling, enables quantum computers to consider and manipulate all combinations of bits simultaneously, making quantum computation powerful and fast1. When n qubits are entangled, they are in a superposition of 2^n states simultaneously, whereas a classical computer can only be in one of those 2^n states at a time. The main difficulty lies in how to execute an algorithm on a set of entangled qubits while keeping them entangled, i.e. highly isolated from any external perturbation (noise). The final observation of the n qubits collapses the entangled state and reveals the result, at least in a probabilistic way.
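To make this concrete, here is a minimal classical sketch of the picture above (plain NumPy; the register size and gates are illustrative, not tied to any particular hardware). An n-qubit register is a vector of 2^n complex amplitudes, and “observing” it draws one basis state with probability |amplitude|^2. Note that the classical simulation itself must store all 2^n amplitudes, which is precisely what becomes intractable as n grows.

```python
import numpy as np

# A toy classical simulation of an n-qubit register: a vector of 2**n
# complex amplitudes, one per basis state (|000>, |001>, ... for n = 3).
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                          # initialize the register in |000>

# A Hadamard gate on every qubit puts the register into an equal
# superposition of all 2**n basis states at once.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = H
for _ in range(n - 1):
    U = np.kron(U, H)
state = U @ state

# "Observing" the register collapses it: one basis state is drawn with
# probability |amplitude|**2 -- the probabilistic readout described above.
probs = np.abs(state)**2
outcome = np.random.choice(2**n, p=probs)
print(f"collapsed to |{outcome:03b}>, this outcome had p = {probs[outcome]:.3f}")
```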

This approach makes it efficient to track many possible solutions in parallel and should help solve problems of high complexity. It is similar to what happens with photons in a double-slit experiment: each photon may follow different trajectories to reach the screen, although for each individual photon only one is selected (probabilistically, according to the Copenhagen interpretation). Even though the observer does not know which, in the end they all contribute to the observed interference pattern, in which the possible trajectories are integrated.

Quantum systems

Many physical implementations of QC are being developed in atomic physics, quantum optics, nuclear and electron magnetic resonance spectroscopy, superconducting electronics and quantum-dot physics. Across these various physical systems, one may distinguish three main types: two kinds of quantum computers, and quantum simulators.

Quantum computers

  1. In the “universal approach”, represented by the IBM 5-qubit chip, a quantum register of n qubits (here n = 5) is initialized to represent the problem to be solved. Then a sequence of operations (quantum gates) is performed on single qubits or pairs of qubits of the register. When the full sequence of operations (the algorithm) has been applied, the register state is collapsed to reveal the result of the computation.
IBM helium fridge feedthroughs for the 5-qubit chip (IBM / Jay Gambetta).

In the IBM chip, each qubit is a superconducting quantum interference device (SQUID) operating at 20 mK.
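As a hedged illustration of the “initialize, apply gates, collapse” recipe above, the following toy simulation (plain NumPy, not IBM’s actual interface) builds a 2-qubit entangled pair with a Hadamard gate followed by a CNOT, then samples the readout:

```python
import numpy as np

# Gate-model sketch: initialize a 2-qubit register, apply a short sequence
# of gates (Hadamard on qubit 0, then CNOT), then sample the collapsed
# result. Only |00> and |11> survive: the two qubits end up entangled.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.array([1.0, 0.0, 0.0, 0.0])    # register initialized to |00>
state = np.kron(H, I2) @ state            # step 1: H on the first qubit
state = CNOT @ state                      # step 2: entangle the pair

probs = np.abs(state)**2
samples = np.random.choice(4, size=10, p=probs)
print([f"{s:02b}" for s in samples])      # e.g. ['00', '11', '11', '00', ...]
```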

The difficulty of setting up the hardware for a large and stable set of qubits has been (and still is) a serious roadblock. Qubits are extremely sensitive to even the slightest perturbations, and error correction must also be implemented; in fact, quantum computing would be impossible without it. A major breakthrough was achieved last year by an IBM team who demonstrated “a quantum error detection code using a square lattice of four superconducting qubits” (Nature Communications).

A 5-qubit chip has now been made available for free to the quantum computing community through a cloud interface (IBM Research Frontier Institute). In principle the system can be scaled up to more qubits, but even here, scaling to large sizes may prove difficult.

“A true universal quantum computer would require somewhere between a million and 100 million qubits, and that could take decades to build”, Jay Gambetta (IBM)

However, IBM has shorter-term ambitions:

A universal quantum computer does not exist today, but IBM envisions medium-sized quantum processors of 50-100 qubits to be possible in the next decade. Even a quantum computer built of just 50 qubits would be beyond what any of today’s TOP500 supercomputers could successfully emulate, reflecting the tremendous potential of this technology.
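A quick back-of-envelope calculation (assuming double-precision complex amplitudes; real emulators have further overheads) shows why ~50 qubits is out of reach for direct state-vector emulation:

```python
# Why ~50 qubits escapes classical emulation: just storing the state
# vector requires 2**n complex amplitudes (16 bytes each in double
# precision), before doing any computation on it.
for n in (30, 40, 50):
    gib = 2**n * 16 / 2**30
    print(f"n = {n:2d} qubits -> {gib:>12,.0f} GiB of state vector")
```

At n = 50 this already amounts to about 16 PiB of memory for the state vector alone.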

2. The other approach to QC is based on quantum annealing, as developed by the D-Wave company. There, a larger number of qubits, some strongly and others weakly entangled, represent a function possibly bearing multiple minima. Through an adiabatic process on the qubits, and thanks to quantum tunneling effects, the function reaches its true minimum. Or, as advertised by D-Wave:

A useful analogy is to think of a landscape with mountains and valleys.
Solving optimization problems can be thought of as trying to find the lowest point on this landscape. Every possible solution is mapped to coordinates on the landscape, and the altitude of the landscape is the “energy” or “cost” of the solution at that point. The aim is to find the lowest point on the map and read the coordinates, as this gives the lowest energy, or optimal solution to the problem. (D-Wave)
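As a classical stand-in for this landscape picture, here is a minimal simulated-annealing sketch over a random Ising cost function. The couplings and cooling schedule are invented for illustration, and D-Wave’s hardware explores such landscapes with quantum fluctuations (tunneling) rather than the thermal moves coded here:

```python
import numpy as np

# Classical analogy of the "energy landscape" picture: minimize a random
# Ising cost function E(s) = -1/2 * s.J.s over spins s_i = +/-1 using
# simulated annealing. The couplings J and the schedule are illustrative.
rng = np.random.default_rng(0)
n = 20
J = rng.normal(size=(n, n))
J = (J + J.T) / 2                        # symmetric couplings
np.fill_diagonal(J, 0)

s = rng.choice([-1, 1], size=n)          # random starting configuration
for T in np.geomspace(5.0, 0.01, 5000):  # slowly lower the "temperature"
    i = rng.integers(n)
    dE = 2 * s[i] * (J[i] @ s)           # energy change if spin i flips
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] *= -1                       # accept downhill (or lucky uphill) moves
print("final energy:", -0.5 * s @ J @ s)
```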

The 512-qubit D-Wave Two (a previous model) in the lab, and the D-Wave 2X.
The 2015 D-Wave 2X used by Google contains 1000+ qubits. But not all the qubits are fully entangled, and the system remains somewhat controversial. Although quantum effects are certainly at work (quantum tunneling), some question its true efficiency. Not only is it limited to a subset of universal QC applications but, even for those, the speedup remains small once the right algorithm is chosen on the conventional computer serving as reference. The search for those specific applications is still in its early days.

Quantum simulators

“Quantum simulators” are analog, special-purpose quantum setups built to simulate other quantum systems that are too complex or not accessible enough to be studied directly. They cannot really be considered “quantum computers”, but they provide new tools to understand and evaluate complex quantum systems in ways that would be impossible on a classical computer.

What are QC and “quantum simulators” good for?

As we have seen, on the hardware side a great number of new platforms are now available and are currently being scaled up in terms of number of qubits. But for what applications?

The Quantum Algorithm Zoo lists most of the quantum algorithms developed over the last decades, under three main sections.

The first two sections list applications that are possible on conventional computers but often intractable when the combinatorics get large, the so-called NP2 problems (NP-complete for the hardest of them). Once a solution has been found by a QC, it can easily be checked by a classical computer.

Algebraic and number theoretic algorithms (13 algo.)
It includes one of the most advertised QC applications: factoring large products of prime numbers, the basis of modern cryptography. Peter Shor’s breakthrough discovery of a polynomial-time quantum algorithm based on the quantum Fourier transform opened the way to a renewed interest in QC. Its time complexity is \mathcal{O}(n^3) in the number of bits n, compared to the sub-exponential \exp(\mathcal{O}(n^{1/3} (\log n)^{2/3})) of the best known classical algorithm (for n = 500, the gain in computing time is huge: of order 10^{42} for QC). The number-theoretic trick at its core is sketched just below.
Other algorithms deal with subset sums, decoding or constraint satisfaction.
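Here is the promised toy sketch of the period-finding trick behind Shor’s algorithm, done classically for N = 15. The brute-force loop stands in for the quantum Fourier transform, which is where all of Shor’s speedup lives:

```python
from math import gcd

# A toy classical version of the number theory behind Shor's algorithm:
# to factor N, pick a < N coprime to N, find the period r of a^x mod N,
# and take gcd(a^(r/2) - 1, N). The brute-force period search below is
# what scales exponentially with the bit length of N.
def factor_from_period(N, a):
    g = gcd(a, N)
    if g != 1:
        return g                         # lucky: a already shares a factor
    r, x = 1, a % N
    while x != 1:                        # brute-force period finding
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None                      # odd period: retry with another a
    f = gcd(pow(a, r // 2, N) - 1, N)
    return f if 1 < f < N else None

print(factor_from_period(15, 7))         # period r = 4 -> factor 3
```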

Oracular Algorithms (ask an Oracle) (33 algo.)
The most famous is searching large unindexed and unsorted databases using Grover’s algorithm. For n database items, the time complexity drops from \mathcal{O}(n) in the classical case to \mathcal{O}(\sqrt{n}) (a classical toy version appears after this list). HEP databases are usually indexed, at least for the most-used keywords; for these data, QC would be of little help, if any.
Other applications of possible interest for HEP include optimization, pattern matching (track finding), linear systems, least-squares curve-fitting, graph theory, integral evaluation and machine learning.
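And the promised classical simulation of Grover’s amplitude-amplification loop, for an invented 16-item search with one marked entry:

```python
import numpy as np

# Classical simulation of Grover's search over N = 2**4 unsorted items.
# Each iteration is an oracle sign-flip on the marked item followed by a
# reflection about the mean ("diffusion"); about (pi/4) * sqrt(N)
# iterations pump almost all the probability into the marked item.
N, target = 16, 11
state = np.full(N, 1 / np.sqrt(N))       # uniform superposition over N items

for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
    state[target] *= -1                  # oracle: mark the sought item
    state = 2 * state.mean() - state     # diffusion: reflect about the mean

print(f"P(target) = {state[target]**2:.3f}")   # ~0.96 after 3 iterations
```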

The third section covers “exact” quantum simulations, possible only on quantum computers or quantum simulators, a regime also called “beyond NP” or “quantum supremacy”. Their results cannot even be checked by a classical computer, but only by another QC (for more details see Quantum simulation).

Approximation and Simulation Algorithms (11 algo.)

Applications in quantum chemistry, materials science, nuclear and particle physics have been demonstrated for low-energy phenomena in many-body assemblies.

This is what R. Feynman was most interested in. He was looking for “the possibility that there is to be an exact simulation, that the computer will do exactly the same as Nature”, namely, exactly the same as the quantum behavior of nature, “because Nature isn’t classical, dammit!”

But the way to reach this goal is not yet established and more work is needed, as quoted here:

The exponential complexity of classically simulating quantum systems led Feynman to first propose that quantum computers might outperform classical computers on certain tasks. Although the problem of finding ground energies of local Hamiltonians is QMA3-complete and therefore probably requires exponential time on a quantum computer in the worst case, quantum algorithms have been developed to approximate ground and thermal states for some classes of Hamiltonians.
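To see the exponential wall first-hand, here is a small sketch that finds ground energies of a transverse-field Ising chain by dense diagonalization (the couplings are chosen arbitrarily for illustration). The 2^n x 2^n Hamiltonian matrix is exactly what a quantum simulator would sidestep:

```python
import numpy as np

# Dense diagonalization of a small transverse-field Ising chain,
# H = -J * sum_i sz_i sz_{i+1} - h * sum_i sx_i, to find its ground
# energy exactly. The Hamiltonian is a 2**n x 2**n matrix, so this
# brute-force approach hits the exponential wall very quickly.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
I2 = np.eye(2)

def op_at(op, site, n):
    """Embed a single-spin operator at `site` in an n-spin chain."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, op if i == site else I2)
    return out

def ising_hamiltonian(n, J=1.0, h=0.5):
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H -= J * op_at(sz, i, n) @ op_at(sz, i + 1, n)
    for i in range(n):
        H -= h * op_at(sx, i, n)
    return H

for n in (4, 8, 10):                     # matrix dimension: 16, 256, 1024
    E0 = np.linalg.eigvalsh(ising_hamiltonian(n))[0]
    print(f"n = {n:2d}  dim = {2**n:5d}  E0 = {E0:.4f}")
```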

However, this is certainly where physics can benefit most from these studies. Frontier physics can be split into three domains: long distances (astrophysics, cosmology, …), short distances (nuclear and atomic physics, HEP) and complexity (entanglement, quantum systems and computing, …).

This will be addressed in the second part of this post.

Further reading:

  1. from D-Wave
  2. NP: nondeterministic polynomial time
  3. QMA: the quantum equivalent of NP
