Sunway TaihuLight (ST) is the new No. 1 system of the June 2016 TOP500 list of the most powerful supercomputers in the world.
The exascale (1000 Pflop/s) supercomputer race is roaring ahead, but does HEP need supercomputers?
Quantum Computing (QC) is by now an “old” research thread, initiated back in the 1980s. It long remained a mere academic endeavor, with only a faint hope of becoming of any practical use, at least in the short term. But today, with Google, Microsoft, IBM and others entering the game, the field has attracted much publicity, and recent technical progress, such as the 1000-qubit D-Wave 2X or the 5-qubit IBM chip, has raised new expectations. Europe is also joining the trend, planning a giant €1 billion quantum technologies project (see here and the Quantum Manifesto).
Will High-Energy Physics (HEP) research benefit from this budding but highly disruptive technology? After all, it was Richard Feynman, famous for inventing a graphical method to compute particle interactions (today at the heart of HEP simulation) who, in 1981, proposed a basic QC model to evaluate quantum processes to a precision unattainable on classical computers.
Could QC deliver the high precision needed both to match the coming collider measurements and to probe possible new physics, as in Feynman’s dream? And could the more mundane QC algorithms, such as large-number factorization or fast database search, find applications in the simulation or data analysis of HEP frontier experiments?
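The speedup behind the “ultra-fast database search” mentioned above is Grover’s algorithm, which locates a marked item among N candidates with roughly π/4·√N oracle queries instead of the ~N/2 a classical scan needs. As a purely illustrative sketch (not tied to any HEP code or real quantum hardware), the algorithm can be simulated classically with a plain NumPy state vector; the function name and parameters below are of course just for this example:

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Classical state-vector simulation of Grover's search:
    amplify the amplitude of `marked` among N = 2**n_qubits items
    using ~pi/4 * sqrt(N) oracle queries."""
    N = 2 ** n_qubits
    # start in the uniform superposition over all N basis states
    state = np.full(N, 1.0 / np.sqrt(N))
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        # oracle: flip the phase of the marked item
        state[marked] *= -1
        # diffusion: reflect every amplitude about the mean amplitude
        state = 2 * state.mean() - state
    return np.abs(state) ** 2  # measurement probabilities

# N = 256 items are searched with only 12 oracle queries,
# versus ~128 expected queries for a classical linear scan
probs = grover_search(8, marked=42)
print(probs[42])
```

Running this prints a probability very close to 1 for the marked item, which is exactly the quadratic query advantage that makes such algorithms interesting for searching large HEP data sets, should suitable hardware ever materialize.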