The next ACAT will be held in Saas-Fee, one of the greatest ski resorts in Switzerland.
From March 11 to 15, 2019, you are invited to attend and contribute to the morning and evening sessions of the workshop. The afternoons will be free for informal discussions, or for ski and mountain lovers.
ACAT 2019 is foreseen to be a landmark in the series, as we stand at a dramatic moment in the history of computing and physics research: a time when advances in AI, deep learning (DL), quantum computing (QC), and high-performance computing (HPC), as well as dedicated chips (GPUs, TPUs, neuromorphic processors), separately and in combination, will drastically change the way physics research is done.
This ACAT workshop, to be held August 21-25 in Seattle, traces its origins back to 1990, when it was called AIHENP (Artificial Intelligence in High Energy and Nuclear Physics). It has since been extended to other research fields and other advanced computing topics, but this year a strong emphasis is put on AI again.
There is still time to register here, so join us …
A great program with great speakers: many plenary talks, 3 parallel tracks, and 2 round tables (detailed program).
Sunway TaihuLight (ST) is the new No. 1 system on the June 2016 TOP500 list of the most powerful supercomputers in the world.
The exascale (1000 Pflop/s) supercomputer race is roaring, but does HEP need supercomputers?
Quantum Computing (QC) is by now an “old” research thread, initiated back in the 1980s. It long remained a mere academic endeavor, with only a faint hope of becoming of any practical use, at least in the short term. But today, with Google, Microsoft, IBM and others entering the game, the field has attracted much publicity, and recent technical progress, like the 1000-qubit D-Wave 2X or the 5-qubit IBM chip, has raised new expectations. Europe is also joining the trend, planning a giant billion-euro quantum technologies project (see here and the Quantum Manifesto).
Will High-Energy Physics (HEP) research benefit from this budding but highly disruptive technology? After all, it was Richard Feynman, famous for inventing a graphical method to compute particle interactions (today at the heart of HEP simulation), who, in 1981, proposed a basic QC model to evaluate quantum processes to a precision unattainable on classical computers.
Could QC, as in Feynman’s dream, provide the very high precision needed both to match the coming collider measurements and to probe possible new physics? Could the more mundane QC algorithms, like large-number factorization (Shor) or fast unstructured search (Grover), find applications in the simulation or data analysis of HEP frontier experiments?
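To give a sense of scale for the search speedup mentioned above: Grover’s algorithm finds a marked item among N candidates with roughly (π/4)√N oracle queries, versus the ~N queries an unstructured classical search needs. A minimal sketch of that query-count arithmetic (the helper names here are illustrative, not from any library):

```python
import math

def classical_queries(n):
    # Worst case for unstructured classical search: inspect every item.
    return n

def grover_queries(n):
    # Grover's algorithm needs about (pi/4) * sqrt(n) oracle queries.
    return math.ceil((math.pi / 4) * math.sqrt(n))

# Compare query counts for event-sample sizes typical of HEP datasets.
for n in (10**6, 10**9, 10**12):
    print(f"N = {n:>13}: classical ~{classical_queries(n)}, "
          f"Grover ~{grover_queries(n)}")
```

For a trillion items the quadratic speedup turns a trillion queries into under a million, which is why this "mundane" algorithm keeps coming up in discussions of large-dataset analysis.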