PhD position: Scalable Training Algorithms for Scientific Machine Learning Applications – September 25, 2024

Research project

Scientific machine learning (SciML) has significantly enhanced traditional numerical methods by streamlining computational modeling and offering cost-effective surrogate models. Despite these advantages, the training phase of SciML surrogates remains computationally expensive, limiting their applicability to real-world, multi-scale, and multi-physics engineering problems.

This PhD project aims to address this limitation by developing novel training algorithms for SciML surrogates that leverage multilevel techniques and/or domain decomposition methods. The focus will be on designing innovative optimizers, investigating various network decompositions, and implementing efficient parallelization strategies to improve scalability and reduce computational costs.
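To give a concrete flavor of the class of algorithms in scope, the sketch below shows a toy, layer-wise additive-Schwarz-style training step in PyTorch, in the spirit of references 1 and 2 listed under "Related literature": the network's parameters are partitioned into "subdomain" blocks, each block is corrected by solving a small local subproblem, and the corrections are combined additively. This is a minimal illustration only, not the project's actual method; the two-block layer-wise split, the number of local steps, the learning rate, and the damping factor are all hypothetical choices.

    import copy
    import torch

    torch.manual_seed(0)
    model = torch.nn.Sequential(
        torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
    )
    x = torch.linspace(-1.0, 1.0, 64).unsqueeze(1)
    y = torch.sin(torch.pi * x)      # toy 1-D regression target
    loss_fn = torch.nn.MSELoss()

    # Layer-wise decomposition: each "subdomain" owns a subset of the
    # parameter tensors (here: weight and bias of one Linear layer each).
    blocks = [[0, 1], [2, 3]]
    damping = 0.5                    # damped additive combination (a standard safeguard)

    def subdomain_correction(block):
        """Local subproblem: a few SGD steps that update only one block,
        with all other parameters frozen at their current values."""
        local = copy.deepcopy(model)
        local_params = list(local.parameters())
        opt = torch.optim.SGD([local_params[i] for i in block], lr=0.05)
        for _ in range(3):           # 3 local iterations (arbitrary choice)
            opt.zero_grad()
            loss_fn(local(x), y).backward()
            opt.step()
        ref = list(model.parameters())
        return {i: (local_params[i] - ref[i]).detach() for i in block}

    for it in range(100):
        # Local corrections are independent, hence parallelizable across blocks.
        corrections = [subdomain_correction(b) for b in blocks]
        with torch.no_grad():        # additive (Jacobi-like) combination
            for corr in corrections:
                for i, delta in corr.items():
                    list(model.parameters())[i].add_(damping * delta)
        if it % 25 == 0:
            print(f"iter {it:3d}  loss {loss_fn(model(x), y).item():.4f}")

In the referenced work, overlapping blocks and a coarse-level (multilevel) correction are layered on top of such local updates to improve convergence and scalability; both are omitted here for brevity.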

Scientific environment

The successful candidate will join the international chair HAILSED at ANITI and the IRIT Laboratory (APO team) at ENSEEIHT (Toulouse-INP). The HAILSED chair, which focuses on hybridizing AI and large-scale numerical simulations for engineering design, offers valuable opportunities to engage with experts in (scientific) machine learning, applied mathematics, scientific computing, numerical simulations, and high-performance computing (HPC). The candidate will also collaborate actively with researchers from ISAE-SUPAERO and IRT Saint Exupéry, enhancing their scientific development and interdisciplinary research profile.

Candidate’s profile

The candidate should possess a Master's or Engineering degree in computational science, applied mathematics, computer science, informatics, or a related field. Additionally, experience in the following areas is highly beneficial:

• Programming in Python
• Practical deployment of (Sci)ML applications using PyTorch, TensorFlow, JAX, etc.
• Knowledge of numerical optimization and standard training algorithms
• Knowledge of multilevel and domain-decomposition methods
• HPC programming
• Working proficiency in English

Related literature

  1. Youngkyu Lee, Alena Kopaničáková, and George Em Karniadakis. Two-level overlapping additive Schwarz preconditioning for training of scientific machine-learning applications. Under review, 2024. Preprint
  2. Alena Kopaničáková, Hardik Kothari, George Karniadakis, and Rolf Krause. Enhancing training of physics-informed neural networks using domain-decomposition-based preconditioning strategies. SIAM Journal on Scientific Computing, 2024. Postprint
  3. Serge Gratton, Alena Kopaničáková, and Philippe L. Toint. Multilevel objective-function-free optimization with an application to neural networks training. SIAM Journal on Optimization, 33(4):2772–2800, 2023. Article
  4. Alena Kopaničáková and Rolf Krause. Globally convergent multilevel training of deep residual networks. SIAM Journal on Scientific Computing, 44(3):S254–S280, 2022. Article

The application

Interested candidates are required to submit an application that includes the following:

1. A comprehensive CV
2. A motivation letter detailing the applicant's research interests and reasons for applying
3. Copies of relevant diplomas and certificates
4. Two letters of recommendation from academic referees, who may be contacted for further information

Please send your complete application as a single PDF file to Alena Kopaničáková (alena.kopanicakova@toulouse-inp.fr). The call remains open until the position is filled.



