Hello! I am an Assistant Professor at Université Paris Cité and LPSM.

Previously, I was a postdoctoral researcher in the DAO team at LJK, Université Grenoble Alpes, where I worked on statistical aspects of Wasserstein distributionally robust models. I completed my PhD (October 2023) at Toulouse School of Economics (TSE) and ANITI, supervised by Jérôme Bolte and Edouard Pauwels, working on stochastic, nonsmooth, and nonconvex optimization in machine learning. Questions I studied there included convergence guarantees for stochastic algorithms that use automatic differentiation (e.g., SGD in deep learning), and nonsmooth implicit differentiation applied to optimization layers and hyperparameter selection.

[CV (fr)].

## Pre-prints

- Universal Generalization Guarantees for Wasserstein Distributionally Robust Models, T. Le and J. Malick, under review [preprint].
- Inexact subgradient methods for semialgebraic functions, J. Bolte, T. Le, E. Moulines and E. Pauwels, [preprint].

## Publications

- Nonsmooth Implicit Differentiation for Machine Learning and Optimization, J. Bolte, T. Le, E. Pauwels and A. Silveti-Falls, NeurIPS 2021. [paper]
- Subgradient sampling for nonsmooth nonconvex minimization, J. Bolte, T. Le and E. Pauwels, SIAM Journal on Optimization, 2023. [paper]
- Nonsmooth nonconvex stochastic heavy ball, T. Le, to appear in Journal of Optimization Theory and Applications, 2024. [paper]

## Thesis

Nonsmooth calculus and optimization for machine learning: first-order sampling and implicit differentiation, T. Le, PhD Thesis, 2023. Advised by Jérôme Bolte and Edouard Pauwels. [manuscript] [slides]

Awarded the PGMO PhD Award 2024!

## Communications

**Generalization guarantees of Wasserstein robust models**

- LAMSADE-MILES seminar, Université Paris Dauphine, Paris (talk), 2024.
- Journées SMAI-MODE, Lyon (talk), 2024.

**Nonsmooth nonconvex stochastic heavy ball**

- Mathematical Optimization research seminar, University of Tübingen (online talk), 2024.

**Nonsmooth implicit differentiation in machine learning and optimization**

- ANITI-PRAIRIE workshop, Toulouse (poster), 2023.
- NeurIPS (online poster) and NeurIPS Paris event (poster), 2021.
- Stat-Eco-ML seminar, CREST (talk), 2021.

**Subgradient sampling in nonconvex minimization** (talks)

- EUROPT, Budapest, 2023.
- SIAM Conference on Optimization, Seattle, 2023.
- PGMO Days, Paris, 2022.
- GdR MOA Days, Nice, 2022.
- Mathematical Optimization research seminar, University of Tübingen (online), 2022.
- ICCOPT, Bethlehem (Pennsylvania), 2022.
- French-German days Inria, Le Chesnay-Rocquencourt, 2021.
- Toulouse School of Economics, PhD students seminar.

## Teaching

I taught several tutorial classes at Université Toulouse 1 Capitole and TSE (64 h/year):

**2022**

- R for data science and statistics (M1 Data science for social sciences)
- Optimization for big data (M1 Data science for social sciences)
- PyTorch tutorial for Deep Learning (M2 Data science for social sciences)
- Optimization (L3 Economics)

**2021**

- Mathematics for Economics and Management (L1 Economics and Management)
- Mathematics (L1 Economics and Mathematics)
- Analysis and Optimization (L3 Economics and Mathematics)

**2020**

- Support course in mathematics (L1)
- Descriptive statistics (L1)
- Mathematics for Management (L1)

## Reviewer

I served as a reviewer for AISTATS (2023), SIAM Journal on Optimization, and Mathematical Programming.

## Education

- Ph.D. in Applied Mathematics, Toulouse School of Economics, 2020 - 2023
- MSc in Machine Learning and Computer Vision, ENS Paris-Saclay, 2019 - 2020
- MSc in Statistics and Machine Learning, ENSAE Paris, 2017 - 2020