Rethinking models for High-Performance Computing

Architectures evolve more rapidly than numerical methods and operational codes, which require several years of simulation and validation. We are now at a real turning point, however, with the growing number of computing units and the rise of massively parallel processors: simple adaptations of existing codes are no longer enough to gain computing time.
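
What "simple adaptation" means here can be made concrete with a small sketch. The example below is purely illustrative and not taken from any operational code: it assumes a toy one-dimensional Jacobi-style relaxation sweep in C and parallelises the existing loop with a single OpenMP pragma. On a massively parallel machine, such a minimal change rarely delivers the expected gain in computing time, because memory layout, data movement and often the algorithm itself must be rethought.

```c
/* Illustrative sketch only: a toy relaxation sweep, "adapted" to
 * parallel hardware by adding one OpenMP pragma to the legacy loop.
 * Compile with e.g. gcc -fopenmp; without OpenMP the pragma is ignored. */
#include <stdio.h>
#include <stdlib.h>

#define N 1000000

int main(void) {
    double *u = malloc(N * sizeof *u);
    double *v = malloc(N * sizeof *v);
    if (!u || !v) return 1;

    /* Initial field (arbitrary values for the example). */
    for (int i = 0; i < N; ++i)
        u[i] = (double)i / N;

    /* The "simple adaptation": parallelise the existing loop as-is. */
    #pragma omp parallel for
    for (int i = 1; i < N - 1; ++i)
        v[i] = 0.5 * (u[i - 1] + u[i + 1]);   /* one relaxation sweep */

    printf("v[N/2] = %f\n", v[N / 2]);
    free(u);
    free(v);
    return 0;
}
```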

Is substantial modification of the codes or methods inevitable? To answer this question, greater priority must be given to exploratory research: tests on simple methods and models can check the feasibility and the contribution of new techniques. From a mathematical point of view, much remains to be done on the analysis of coupled systems, multi-scale analysis, and the development of new methods, new grids and new algorithms. Many deterministic and stochastic models can only be redefined by treating each problem as a whole (mathematics/computing/application), which requires a joint effort from mathematicians, computational scientists, information technologists and environmental scientists.

Furthermore, to validate this exploratory research, it is of course vital to support computing centres and to help maintain the operational codes.
