Efficient algorithms and Data Assimilation for Computationally Efficient Constrained Advanced Learning

Chair objectives

The objective of the chair is to promote synergy between data assimilation and machine learning, studying new algorithms as well as their efficient implementation on modern computer architectures. On the AI side, our goal is to study methods for introducing physical constraints into machine learning algorithms, and to establish their performance both theoretically and in practice.
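As one way of making the idea of physically constrained learning concrete, the sketch below fits a surrogate to a few noisy observations while penalizing the residual of a governing equation. It is a minimal illustration, not the chair's method: the toy ODE du/dt = -k u, the polynomial surrogate, and the weight lam are all assumptions chosen so the constrained fit reduces to a single least-squares solve.

```python
import numpy as np

# Toy physics-constrained regression (illustrative only):
# learn u(t) obeying du/dt = -k*u (k known) from a few noisy
# observations, using a polynomial surrogate u(t) = sum_j c_j t^j.
# Both the data misfit and the physics residual are linear in the
# coefficients c, so the constrained fit is one least-squares solve.

rng = np.random.default_rng(0)
k = 1.5
t_obs = np.array([0.0, 0.4, 1.1, 1.9])
u_obs = np.exp(-k * t_obs) + 0.01 * rng.standard_normal(t_obs.size)

deg = 5
t_col = np.linspace(0.0, 2.0, 40)          # collocation points for the ODE

def vander(t):                              # rows [1, t, t^2, ..., t^deg]
    return np.vander(t, deg + 1, increasing=True)

def dvander(t):                             # rows of d/dt of the monomials
    V = np.zeros((t.size, deg + 1))
    for j in range(1, deg + 1):
        V[:, j] = j * t ** (j - 1)
    return V

lam = 10.0                                  # weight on the physics residual
A = np.vstack([vander(t_obs),                               # data misfit rows
               lam * (dvander(t_col) + k * vander(t_col))]) # ODE residual rows
b = np.concatenate([u_obs, np.zeros(t_col.size)])
c, *_ = np.linalg.lstsq(A, b, rcond=None)

u_fit = float(vander(np.array([1.5])) @ c)  # prediction away from the data
print(u_fit, np.exp(-k * 1.5))
```

With only four observations, the physics penalty is what keeps the prediction at t = 1.5 close to the true decay curve; dropping the ODE rows lets the polynomial oscillate freely between data points.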

Program Acceptable AI
Themes: Safe design and embeddability | Data & anomaly | AI & physical models | Optimization and game theory for AI

Chair holder: Serge Gratton (Toulouse INP – IRIT)

Collaborators
Pierre Boudier (ANITI) 
Alfredo Buttari (CNRS) 
Selime Gürol (Cerfacs) 
C. Lapeyre (Cerfacs)

Website
http://gratton.perso.enseeiht.fr/

Data Assimilation (DA) is a ubiquitous tool for making predictions in complex systems, whether large-scale systems such as those arising in Earth observation or systems with few degrees of freedom, such as autonomous ground or space vehicles. Prediction accuracy must be achieved while using the available computing resources as sparingly as possible, whether those resources are hosted in computing centers or embedded in autonomous systems.

It is important to note the central difference with Machine Learning (ML): we are talking about spatial and temporal predictions that take advantage of knowledge of time-dependent physical models. The question could be rephrased this way: how can ML improve the performance obtained by the well-established technique of DA, both in accuracy and in the use of computational resources (High-Performance Computing)?

Data assimilation involves making predictions from evolution equations that are fitted using observational data. The discipline has grown enormously in recent years, notably through the use of statistical sampling techniques.

This growth has been driven by strong societal needs in advanced applications such as meteorology and the geosciences in general, but also in other sectors such as neutronics and mechanics, to name a few. At the same time, the mathematical complexity of modern evolution models is increasing to the point that monitoring systems in real time is a challenge in many applications. In this regard, the important open questions concern the proper handling of model nonlinearity, the relaxation of the linear and Gaussian assumptions at the heart of the most recent algorithms, and the efficient use of computing resources.
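A standard example of the sampling techniques mentioned above is the ensemble Kalman filter, which propagates a sample of states through a (possibly nonlinear) model and uses the sample covariance in the update, rather than assuming an exactly linear model. The sketch below runs it on a scalar toy system; the sin(x) dynamics, noise levels, and ensemble size are all illustrative choices, not a reference implementation.

```python
import numpy as np

# Minimal stochastic ensemble Kalman filter (EnKF) on a scalar toy
# system: nonlinear dynamics x_{t+1} = sin(x_t) + model noise, with
# direct observation of the state under Gaussian noise.

rng = np.random.default_rng(42)
n_ens, n_steps = 50, 30
q, r = 0.05, 0.1                    # model / observation noise std devs

def model(x):
    return np.sin(x)

# Synthetic truth and noisy observations of it
truth = np.zeros(n_steps)
x = 1.0
for t in range(n_steps):
    x = model(x) + q * rng.standard_normal()
    truth[t] = x
obs = truth + r * rng.standard_normal(n_steps)

# Filter loop: forecast the ensemble, then update with perturbed obs
ens = rng.standard_normal(n_ens)    # deliberately poor initial ensemble
err_prior, err_post = [], []
for t in range(n_steps):
    ens = model(ens) + q * rng.standard_normal(n_ens)   # forecast step
    err_prior.append(abs(ens.mean() - truth[t]))
    P = ens.var(ddof=1)             # sample forecast variance
    K = P / (P + r**2)              # Kalman gain (scalar observation)
    y_pert = obs[t] + r * rng.standard_normal(n_ens)    # perturbed obs
    ens = ens + K * (y_pert - ens)  # analysis update
    err_post.append(abs(ens.mean() - truth[t]))

print(np.mean(err_prior), np.mean(err_post))
```

The sample covariance replaces the tangent-linear model of the classical Kalman filter, which is why the ensemble approach tolerates mild nonlinearity; its residual reliance on Gaussian statistics is exactly the kind of assumption the chair aims to relax.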

Backed by a very general statistical theory, deep learning algorithms make predictions on a wide variety of data sets and achieve impressive performance, in both precision and execution time, in many applications such as image and speech processing. The key ingredients are the expressiveness of deep neural networks, which makes it possible to represent highly nonlinear functions, the use of rich databases, and the implementation of algorithms on computational architectures dedicated to these prediction tasks.

Learn more

Artificial intelligence to boost forecasts of all kinds.

What is the relationship between data assimilation, weather forecasting and artificial intelligence?

What are the links between oceanography, a prediction confidence index and machine learning?

Read the article in French
