Fundamental sciences cover nuclear physics, laser-plasma physics, nuclear fusion, quantum chemistry, soft-matter physics, materials science and astrophysics/cosmology. These sciences rely strongly on fast algorithms and efficient software implementations that track developments in hardware technology.
There is no consensus yet on whether the transition from petascale to exascale can be achieved smoothly from current implementations or whether a revolutionary step is necessary.
Challenges include programming models, communication schemes, memory management and algorithms, as well as fault tolerance and energy efficiency.
Europe has a very strong position in the global scientific community and is leading in several fields of astrophysics/cosmology, nuclear/hadron physics, fusion research and materials science. Software development is very advanced in these domains.
Exaflop capacity in 2020 means being able to deliver a petaflop in a box for $200,000 and 20 kW of power consumption. This would have a huge impact on academic and industrial structures, large and small, including SMEs, which would be able to take advantage of “exascale” technology, not just on a few hero applications.
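The “Petaflop in a box” target above implies concrete efficiency figures. The following back-of-the-envelope sketch derives them from the quoted numbers (1 PFLOP/s, $200,000, 20 kW); the derived ratios are illustrative and are not stated in the source.

```python
# Back-of-the-envelope arithmetic for the "Petaflop in a box" target:
# 1 PFLOP/s delivered for $200,000 and 20 kW of power.
PFLOPS = 1e15          # floating-point operations per second
PRICE_USD = 200_000    # target system price in dollars
POWER_W = 20_000       # target power draw (20 kW)

# Energy efficiency and price/performance implied by the target:
flops_per_watt = PFLOPS / POWER_W        # 5e10 = 50 GFLOP/s per watt
flops_per_dollar = PFLOPS / PRICE_USD    # 5e9  = 5 GFLOP/s per dollar

print(f"{flops_per_watt / 1e9:.0f} GFLOP/s per watt")
print(f"{flops_per_dollar / 1e9:.0f} GFLOP/s per dollar")
```

The 50 GFLOP/s-per-watt figure is what makes the target demanding: reaching it requires the energy-efficiency advances the roadmap lists among its challenges.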
These needs pose crucial issues for handling very large amounts of data to be processed on the fly. In this area, exascale definitely means both exaflops and exabytes. The challenge is twofold: to compute at a speed of 10^18 floating-point operations per second AND to manage, explore and extract information and knowledge from data sets of 10^18 bytes.
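The twofold challenge above can be made concrete with some scale arithmetic. In the sketch below, the 1 TB/s sustained I/O bandwidth is a hypothetical figure chosen for illustration, not a number from the source.

```python
# Illustrative scale arithmetic for the exaflop/exabyte challenge:
# computing at 10^18 FLOP/s while handling 10^18-byte data sets.
EXAFLOPS = 1e18     # floating-point operations per second
EXABYTE = 1e18      # bytes in one exabyte (decimal)
BANDWIDTH = 1e12    # assumed sustained I/O bandwidth: 1 TB/s (hypothetical)

# Time to perform 10^21 operations at a sustained exaflop rate:
compute_time_s = 1e21 / EXAFLOPS            # 1000 seconds

# Time merely to stream one exabyte through a 1 TB/s pipe:
stream_time_s = EXABYTE / BANDWIDTH         # 10^6 seconds
stream_time_days = stream_time_s / 86_400   # about 11.6 days

print(f"10^21 ops at 1 EFLOP/s: {compute_time_s:.0f} s")
print(f"1 EB streamed at 1 TB/s: {stream_time_days:.1f} days")
```

The mismatch between the two times is the point: at exascale, moving and managing the data dominates, which is why on-the-fly processing is singled out above.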
Path: developing multiscale software frameworks would position Europe ahead of the competition.
Roadmap: massive simulations in the fields of dark matter and galaxy collisions have been performed by US, Japanese and European research teams.
The GYSELA and PEPC fusion codes have been successfully scaled to the full JUQUEEN machine at Jülich (nearly half a million cores, running 1,835,008 hardware threads).