Uncertainty analysis methodologies in academic and industrial studies require new competences, distinct from those needed for simulation code development.
Numerical methods need to integrate model errors beyond traditional parameter uncertainty studies.
Tools and middleware will facilitate a wider usage of uncertainty methodologies.
Additionally, efficient parallelisation requires multiple levels of parallelism, for which new development tools are needed.
It is important to adopt uncertainty analysis in academic and industrial studies. The use of uncertainty analysis methodologies requires competences that are somewhat different from those needed to develop a simulation code, so a key issue is training.
Investment is also required in numerical methods. To deploy uncertainty analysis on highly CPU-intensive codes, two strategies should be followed: improving adaptive designs of experiments and advancing surrogate models.
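As a toy sketch of how the two strategies combine (all function names and numbers here are hypothetical, not from any particular library), an adaptive design of experiments can place each new run of an expensive code where two surrogate fits of different flexibility disagree most, so that the budget of costly evaluations is spent in the least-trusted regions:

```python
import numpy as np

def expensive_model(x):
    # stand-in for a costly simulation run (hypothetical toy function)
    return np.sin(3.0 * x) + 0.5 * x

def adaptive_doe(n_init=4, n_add=6, lo=0.0, hi=2.0):
    xs = list(np.linspace(lo, hi, n_init))       # initial space-filling design
    ys = [expensive_model(x) for x in xs]
    cand = np.linspace(lo, hi, 201)              # cheap candidate points
    for _ in range(n_add):
        # two surrogates of different flexibility; their disagreement
        # flags regions where the cheap approximation is unreliable
        coarse = np.polyfit(xs, ys, 2)
        fine = np.polyfit(xs, ys, min(len(xs) - 1, 5))
        gap = np.abs(np.polyval(coarse, cand) - np.polyval(fine, cand))
        idx = int(np.argmax(gap))
        xs.append(float(cand[idx]))              # run the code only there
        ys.append(expensive_model(cand[idx]))
        cand = np.delete(cand, idx)              # never pick a point twice
    return np.array(xs), np.polyfit(xs, ys, 5)   # final surrogate

xs, surrogate = adaptive_doe()
```

Once built, the surrogate replaces the expensive code inside the uncertainty propagation loop, where thousands of evaluations may be needed.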
Furthermore, traditional uncertainty analysis deals mostly with parameter uncertainty; major progress in the validation of scientific codes would be achieved by also taking model errors into account.
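A minimal numerical sketch (toy data, hypothetical model) of why this matters: when the model is structurally incomplete, tuning its parameters alone leaves systematic residuals, which an additive model-error (discrepancy) term can capture:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
truth = 2.0 * x + 0.3 * np.sin(2.0 * np.pi * x)   # "real" process
y_obs = truth + rng.normal(0.0, 0.02, x.size)     # noisy observations

# parametric model eta(x, theta) = theta * x misses the oscillation,
# so adjusting theta (parameter uncertainty only) cannot explain the data
theta = np.sum(x * y_obs) / np.sum(x * x)         # least-squares slope
resid = y_obs - theta * x

# additive model-error term delta(x), here a degree-5 polynomial fit
delta = np.polyval(np.polyfit(x, resid, 5), x)

rms_param_only = np.sqrt(np.mean(resid ** 2))         # structured misfit left
rms_with_delta = np.sqrt(np.mean((resid - delta) ** 2))  # near the noise level
```

The residual misfit drops to roughly the observation noise once the discrepancy term is included, whereas parameter calibration alone cannot get below the structural error of the model.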
Software tools are also very important for disseminating uncertainty analysis in the numerical simulation community. Investment in tools and middleware that address resilience to failures is needed to make uncertainty tools more robust and thereby facilitate their wider usage.
Last, modern multiphysics computations involve multiple levels of parallelism (domain decomposition, code coupling, multiscale, etc.). Support should be given to developing tools that combine these levels of parallelism with the one arising from the design of experiments, for efficient parallelisation of the ensemble.
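The outer, ensemble-level parallelism can be sketched as follows (a minimal illustration using only Python's standard library; function names are hypothetical). Each design-of-experiments point is an independent job, and in a real setting each job would itself launch a parallel multiphysics solver, giving the nested levels of parallelism described above:

```python
import math
from concurrent.futures import ThreadPoolExecutor

def run_simulation(params):
    # stand-in for launching one coupled multiphysics run; in practice
    # this would start an external solver that is itself parallel
    # (domain decomposition, code coupling), so the pool only schedules
    a, b = params
    return a * math.exp(-b)

def run_ensemble(design, max_workers=4):
    # outer level of parallelism: independent design-of-experiments
    # points run concurrently; a resilient middleware would also detect
    # and resubmit failed runs at this level
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_simulation, design))

design = [(a, b) for a in (1.0, 2.0) for b in (0.1, 0.5)]
results = run_ensemble(design)
```

The design choice here is that the ensemble scheduler stays oblivious to the inner parallelism of each run; coordinating the two levels (e.g. how many cores each run gets) is exactly where the supporting tools discussed above are needed.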