**Numerical mathematics maintains close links with theory. Going from one to the other is a daily sport for some mathematicians. Such mathematicians often focus on problems arising from applications: coupling of processes, model validation, parameterizations, high-performance computing (supercomputing)… Forming a bridge between the abstract and the concrete, numerical mathematics struggles to find its place in the world of research in France.**

Numerical mathematics focuses on solving problems from other disciplines (see The point of view of computational scientists). Problems mentioned in the section Mathematics in the real world therefore also appear in this section, but addressed from a different viewpoint. The role of mathematics here is not only to make continuous theory more useful, but also to better define concrete problems by anchoring them in mathematical formalism. In this chapter we will see why this discipline "between disciplines" is a discipline in itself, by reviewing a number of different themes: numerical methods, data analysis, and the integrated-systems approach.

The characteristics of models depend a lot on applications. Thus, while **ocean-atmosphere modelling** is characterised by constraints related to stratification, rotation or the turbulent character of flows (direct and inverse cascades of energy), the **modelling of shallow flows** often tends to be constrained by physical effects still poorly described by models, such as erosion mechanisms, and numerous uncertain parameters, such as topography or the coefficients of the rheological laws. Models must evolve as knowledge about applications progresses. The advent of supercomputers, however, has naturally led to a race to improve resolution. The result is that numerous geophysical numerical models are now used outside their original field of validity or with inadequate forcings. Thus, at finer resolutions, certain effects such as non-hydrostatic effects can no longer be discounted, and the simplifying hypotheses of the models need to be re-examined. Modifying these hypotheses has a direct impact on the numerical methods to be used (different conservation properties and/or stability constraints). It should not be forgotten, however, that additional computing means could also be used to increase the complexity of numerical methods or to evolve towards statistical approaches!
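As an illustrative instance of such a shallow-flow model (not drawn from the source), the one-dimensional Saint-Venant (shallow water) equations make explicit where topography enters:

$$
\partial_t h + \partial_x(hu) = 0, \qquad
\partial_t(hu) + \partial_x\!\left(hu^2 + \tfrac{1}{2}\,g h^2\right) = -\,g h\,\partial_x b,
$$

where $h(x,t)$ is the water depth, $u(x,t)$ the depth-averaged velocity, $g$ the gravitational acceleration and $b(x)$ the topography. The topography appears only through the source term on the right-hand side, so its uncertainty feeds directly into the momentum balance; friction and rheological closures, when added, bring in the uncertain coefficients mentioned above.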

The choice of discretization methods has an important impact on the physical solution of the models. For example, to manage the singularities at the Earth’s poles in climate models, new approaches are considered, like the unstructured and icosahedral meshes or the systems of hybrid coordinates based on ALE (Arbitrary Lagrangian Eulerian) methods. To evaluate their potential, it is important to set up a systematic development of inter-comparison exercises in cases that range from the ideal to the operational so as to rationalise the different physical and numerical choices.
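To make the pole singularity of regular latitude-longitude grids concrete, the following sketch (illustrative Python, not from the source) shows how cell areas collapse towards the poles, tightening the time-step constraints that motivate icosahedral and unstructured alternatives:

```python
import numpy as np

# Cell areas on a regular latitude-longitude grid shrink like cos(latitude),
# which concentrates grid points near the poles and tightens CFL time-step
# limits there -- the "pole problem" that icosahedral meshes avoid.
R = 6.371e6                      # Earth radius in metres
dlat = dlon = np.deg2rad(1.0)    # 1-degree grid spacing in radians

def cell_area(lat_deg):
    """Approximate area (m^2) of a 1x1-degree cell centred at lat_deg."""
    return R**2 * dlon * dlat * np.cos(np.deg2rad(lat_deg))

for lat in (0.0, 60.0, 89.5):
    print(f"lat {lat:5.1f}: area = {cell_area(lat):.3e} m^2")
```

At 60° the cells are already half the equatorial size, and near 89.5° they are two orders of magnitude smaller, so a zonally uniform grid wastes resolution and forces tiny time steps at high latitudes.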

Statistics come into play whenever questions are asked of data sets (observations and models): estimating quantities or parameters, making comparisons, validating scientific hypotheses, or confronting models with measurements. Here are a few of the new **trends and new challenges in statistics for climate, the environment and ecology**, or for the **evolution of population genomics**.

**From data rarity to data abundance**. Current research is addressing, among other things, the selection of relevant covariables, the testing of scientific hypotheses, and multi-dependency due to heterogeneity. Thus, the abundance of data is leading to a need to refine the techniques used to extract available information, particularly by basing this in part on less abundant but more standardised data (see Must we follow the "Big data" method?). For this development, methodological reflection is needed on the optimality of dimension reduction techniques. In addition, there is a real need for more discussion and exchanges between computational scientists and statisticians.
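As a minimal example of the dimension-reduction techniques mentioned above, the following sketch (illustrative, with synthetic data) uses principal component analysis via the SVD to check how much variance a low-dimensional summary retains:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "abundant" data: 500 samples of 50 covariables that in fact
# live near a 3-dimensional subspace, plus a small amount of noise.
latent = rng.standard_normal((500, 3))
mixing = rng.standard_normal((3, 50))
X = latent @ mixing + 0.05 * rng.standard_normal((500, 50))

# PCA via the SVD of the centred data matrix: the squared singular
# values give the variance carried by each principal component.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance explained by first 3 components:", explained[:3].sum())
```

Here three components recover almost all the variance, and the optimality question raised in the text is precisely how far such truncations can be trusted on real, heterogeneous data.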

**From the local to the regional or global scale.** Multi-scale analysis is essential in climate and population studies and calls for research, for example, on the definition of relevant models of spatio-temporal and multivariate random fields. Another challenge involves getting the different scales and levels of organisation to communicate. The hierarchical modelling generally used is creating numerous needs with respect to inference algorithms and efficient and rapid simulations.

**Combining models and observations.** Recent developments in so-called **data assimilation** methods concern improving the way in which nonlinearities are taken into account and adapting methods to model hierarchies, model coupling, and multi-scale observations (images, Lagrangian data). We are aware of the need to enhance theoretical understanding and are encouraging closer collaboration with optimisation specialists. Finally, with a view to accelerating the implementation of **territorial planning models** and also to improving the models themselves, data assimilation methods are being studied to estimate physical parameters and external forcings.
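The core idea of data assimilation, correcting a model forecast with an observation weighted by their respective uncertainties, can be sketched in the simplest (scalar, linear, Gaussian) case as one Kalman analysis step; the numbers below are purely illustrative:

```python
# One analysis step of the scalar Kalman filter: the analysis state is a
# variance-weighted compromise between the model forecast and the observation.
def kalman_analysis(x_f, var_f, y_obs, var_obs):
    """Return the analysis state and its variance."""
    gain = var_f / (var_f + var_obs)       # Kalman gain in [0, 1]
    x_a = x_f + gain * (y_obs - x_f)       # corrected state
    var_a = (1.0 - gain) * var_f           # reduced uncertainty
    return x_a, var_a

# Illustrative numbers: an uncertain forecast corrected by a precise observation.
x_a, var_a = kalman_analysis(x_f=10.0, var_f=4.0, y_obs=12.0, var_obs=1.0)
print(x_a, var_a)
```

The analysis lands closer to the more certain observation, with a variance smaller than either input; the nonlinear, high-dimensional and multi-scale settings mentioned in the text are what make the realistic problem hard.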

**Validation and decision-making under uncertainty.** Two strategies may be envisaged in parallel. On the one hand, there is model simplification or reduction (simplified and interpretable models, sensitivity analysis, meta-models), which must come with an estimate of the error between the initial model and the approximate model. On the other, a multi-model approach needs to be set up, with a comparison and **quantification of uncertainties** of the different models, possibly by mixing determinism and stochastics (see *Mathematics in the real world*). Thus, sensitivity analysis and uncertainty quantification methods must be able to manage a large number of strongly correlated parameters, take into account any physical constraints, and handle interactions and feedback between models.
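As an illustration of variance-based sensitivity analysis, the following sketch estimates first-order Sobol indices for a toy linear model with independent inputs, using the standard pick-freeze (Saltelli-type) Monte Carlo estimator; the correlated-parameter case discussed above requires more elaborate methods:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear model with three independent standard-normal inputs;
# the first input dominates the output variance (16 of 17.01 in total).
def model(x):
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2]

n, d = 100_000, 3
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))

# Pick-freeze estimator: S_i = E[y_A * (y_{AB_i} - y_B)] / Var(y),
# where AB_i is B with column i replaced by the matching column of A.
yA, yB = model(A), model(B)
var_y = yA.var()
S = []
for i in range(d):
    AB = B.copy()
    AB[:, i] = A[:, i]
    S.append(np.mean(yA * (model(AB) - yB)) / var_y)

print("first-order Sobol indices:", [f"{s:.3f}" for s in S])
```

The estimates should sit near the analytic values (about 0.94, 0.06 and 0.0006), identifying at Monte Carlo cost which parameters are worth constraining first.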

**Coupling and interactions.** Boundary conditions or conditions at the **interfaces** have a strong impact on the physical, numerical and mathematical quality of the overall system, as well as on its feasibility. It is important to reflect on different strategies, take into account both intrusive and non-intrusive methods, and show interest in global-in-time methods that limit interactions between different processors and thus optimise the performance of **computations on complex architectures**. In geophysics, packages like AGRIF (*Adaptive Grid Refinement In Fortran*) or the OASIS coupler (*Ocean Atmosphere Sea Ice Soil*) are very popular. These approaches should be generalised! Finally, an integrated approach to system Earth cannot ignore human impact and adaptation to change. These interactions, with complex feedback, call for better communication between applied mathematics, the geosciences and the human and social sciences in order to link **decisional mathematics** and **sustainable development**. An integrated vision of a system also requires setting up permanent posts (engineers, administrative personnel) to manage interaction between researchers and the promotion and exploitation of research findings (see Working together: time and means and Promoting work at the interface).
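As a toy illustration of coupling through interface conditions, the following sketch runs an alternating Schwarz iteration for a 1-D Poisson problem split into two overlapping subdomains that exchange only their interface values (a deliberately simplified setting; real couplers such as OASIS handle far more general exchanges):

```python
import numpy as np

# Alternating Schwarz iteration for -u'' = 1 on (0, 1) with u(0) = u(1) = 0,
# split into two overlapping subdomains that exchange only interface values.
n = 101
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
u = np.zeros(n)

def solve_subdomain(f, a, b, m):
    """Solve -u'' = f on m interior points with Dirichlet values a, b."""
    T = (np.diag(2.0 * np.ones(m))
         - np.diag(np.ones(m - 1), 1)
         - np.diag(np.ones(m - 1), -1)) / h**2
    rhs = np.full(m, f)
    rhs[0] += a / h**2
    rhs[-1] += b / h**2
    return np.linalg.solve(T, rhs)

for _ in range(30):
    # Subdomain 1: interior points 1..58, right interface value at point 59.
    u[1:59] = solve_subdomain(1.0, u[0], u[59], 58)
    # Subdomain 2: interior points 41..99, left interface value at point 40.
    u[41:100] = solve_subdomain(1.0, u[40], u[100], 59)

exact = 0.5 * x * (1.0 - x)    # analytic solution of -u'' = 1
print("max error after 30 Schwarz sweeps:", np.abs(u - exact).max())
```

Each sweep solves the two subproblems independently except for the two interface values exchanged, which is exactly the kind of limited communication that global-in-time variants push further to reduce synchronisation on parallel machines.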