Address: IMAG, c.c. 051, Université de Montpellier, Place Eugène Bataillon, 34095 Montpellier Cedex 5 (office 415, building 9)
- April 2023: Organizer of STATLEARN 2023.
- November 2022: Organizer of the ML4LifeSciences workshop (https://ml4lifesciences.sciencesconf.org/), November 15-17
- September 2022: Benchopt paper (url) accepted at NeurIPS 2022!
- March 2022: visitor at the Simons Institute for the Theory of Computing
- August 2021: New dataset available! PlantNet-300K is a subset of the Pl@ntNet database (https://plantnet.org/), with more than 300k images and thousands of classes (plant species): https://github.com/plantnet/PlantNet-300K. See the NeurIPS Datasets and Benchmarks Track paper (pdf).
- July 2021: IUF Nomination (junior member): https://www.iufrance.fr/detail-de-lactualite/247.html
- November 2020: Launching ML-MTP: Machine Learning in Montpellier, Theory & Practice.
- October 2020: talk at the GdR MIA thematic day on non-convex sparse optimization (Toulouse, France): "Screening Rules for Lasso with Non-Convex Sparse Regularizers" (slides), plus a short presentation of BenchOpt, a new package that simplifies comparisons between optimization algorithms and makes them more transparent and reproducible.
- June 2020: New paper accepted to ICML "Implicit differentiation of Lasso-type models for hyperparameter optimization" with Q. Bertrand, Q. Klopfenstein, M. Blondel, S. Vaiter and A. Gramfort, code
- January 2020: New paper accepted to AISTATS "Support recovery and sup-norm convergence rates for sparse pivotal estimation" with Q. Bertrand, M. Massias and A. Gramfort, code
- December 2019: The AI chair proposal CaMeLOt (CooperAtive MachinE Learning and OpTimization) has been selected by the ANR. Open Ph.D. and postdoc positions will be published soon (location: Montpellier). Do not hesitate to contact me for more information.
- September 2019: New paper accepted at NeurIPS 2019, joint work with Q. Bertrand, M. Massias and A. Gramfort. "Handling correlated and repeated measurements with the smoothed Multivariate square-root Lasso" arxiv, code
- May 2019: workshop in Montpellier: "Graph signals: learning and optimization perspectives"
- "Celer: a Fast Solver for the Lasso with Dual Extrapolation", accepted at ICML 2018; code and companion paper (pdf, slides). Joint work with Mathurin Massias and Alexandre Gramfort.
- classgrade: an open-source project for peer grading, mostly developed by C. Marini, in the process of being merged into Peeramid.
- Antoine Simoes [Ph.D. 2022-2025?], co-supervised by Yohann de Castro
- Amélie Vernay [Ph.D. 2022-2025?], co-supervised by Nicolas Meyer
- Tanguy Lefort [Ph.D. 2021-2024?], co-supervised by Benjamin Charlier
- Camille Garcin [Ph.D. 2020-2023?], co-supervised by Alexis Joly and Maximilien Servajean
- Emmanuel Pilliat [Ph.D. 2020-2023?], co-supervised by Nicolas Verzelen and Alexandra Carpentier
- Hashem Ghanem [Ph.D. 2020-2023?], co-supervised by Samuel Vaiter and Nicolas Keriven
- Damien Blanc [Ph.D. 2019-2022], co-supervised by Benjamin Charlier and funded by Quantacell
- Cassio Fraga Dantas [Post-doc, 2022]
- Florent Bascou [Ph.D. 2019-2022], co-supervised by Sophie Lèbre, Manuscript: "Sparse linear model with quadratic interactions"
- Quentin Bertrand [Ph.D. 2018-2021], co-supervised by Alexandre Gramfort (now at Mila), Manuscript: "Hyperparameter selection for high dimensional sparse learning: application to neuroimaging"
- Nidham Gazagnadou [Ph.D. 2018-2021] co-supervised by Robert Gower (now at Sony AI), Manuscript: "Expected smoothness for stochastic variance-reduced methods and sketch-and-project methods for structured linear systems"
- Pierre-Antoine Bannier [Intern 2021], co-supervised by Alexandre Gramfort
- Jérôme-Alexis Chevalier [Ph.D. 2017-2020], co-supervised by Bertrand Thirion (Senior Data Scientist at Emerton Data), Manuscript: "Statistical control of sparse models in high dimension"
- Mathurin Massias [Ph.D. 2016-2019], co-supervised by Alexandre Gramfort (now CR INRIA, Lyon), Manuscript: "Sparse high dimensional regression in the presence of colored heteroscedastic noise: application to M/EEG source imaging"
- Evgenii Chzhen [Ph.D. 2016-2019], co-supervised by Mohamed Hebiri (now CR CNRS, Saclay), Manuscript: "Plug-in methods in classification"
- Eugene Ndiaye [Ph.D., 2015-2018], co-supervised by Olivier Fercoq (now post-doctorate at GeorgiaTech), Manuscript: "Safe optimization algorithms for variable selection and hyperparameter tuning"
- Jean Lafond [Ph.D. 2013-2016], co-supervised by Éric Moulines (now at Cubist Systematic, UK), Manuscript: "Complétion de matrice : aspects statistiques et computationnels" ("Matrix completion: statistical and computational aspects")
- Igor Colin [Ph.D., 2013-2016] co-supervised by Stéphan Clémençon and funded by Streamwide (now at Huawei), Manuscript: "Adapting machine learning methods to U-statistics"
- Jair Montoya [Post Doc, 2016-2017], co-supervised by Olivier Fercoq
- Thierry Guillemot [Engineer, 2016] (now at ARIADNEXT), co-supervised by Alexandre Gramfort
Since 2018, I have been a full professor at Université de Montpellier and an associate member of the INRIA Parietal team. During the spring and summer quarters of 2018, I was a visiting assistant professor in the Statistics Department at UW. From 2012 to 2018, I was an assistant professor at Telecom ParisTech. In 2011 and 2012, I was a post-doctoral associate at Duke University, working with Rebecca Willett.
In 2010, I completed my Ph.D. in statistics and image processing under the supervision of Dominique Picard and Erwan Le Pennec at the Laboratoire de Probabilités et de Modélisation Aléatoire (now LPSM) at Université Paris Diderot.
- BenchOpt: a package that simplifies comparisons between optimization algorithms and makes them more transparent and reproducible
- sparse-ho: a package for fast hyperparameter selection for Lasso-type models; code for the associated ICML 2020 paper "Implicit differentiation of Lasso-type models for hyperparameter optimization" (pdf)
- MATLAB toolboxes for statistics and image processing (legacy)
More on my GitHub page.
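The core idea behind sparse-ho (differentiating through the inner solver's optimality conditions to obtain the gradient of a validation loss with respect to the regularization parameter) can be illustrated on ridge regression, where the inner problem is smooth and the implicit gradient has a closed form. This is only a sketch: all data below are synthetic, the function names are made up for illustration, and the Lasso case treated in the paper requires extra care because of non-smoothness.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 20
X = rng.standard_normal((n, p))
beta_star = np.zeros(p)
beta_star[:3] = 1.0
y = X @ beta_star + 0.1 * rng.standard_normal(n)
X_val = rng.standard_normal((n, p))
y_val = X_val @ beta_star + 0.1 * rng.standard_normal(n)


def ridge_solution(lam):
    # Inner problem: beta(lam) = argmin 0.5 ||y - X beta||^2 + 0.5 lam ||beta||^2,
    # whose optimality condition is (X^T X + lam I) beta = X^T y.
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)


def val_loss(lam):
    # Outer criterion: held-out least-squares loss at beta(lam).
    r = X_val @ ridge_solution(lam) - y_val
    return 0.5 * r @ r


def val_grad(lam):
    # Implicit differentiation of the optimality condition w.r.t. lam:
    # (X^T X + lam I) dbeta + beta = 0  =>  dbeta = -(X^T X + lam I)^{-1} beta,
    # then the chain rule gives the hypergradient of the validation loss.
    beta = ridge_solution(lam)
    dbeta = -np.linalg.solve(X.T @ X + lam * np.eye(p), beta)
    return (X_val @ beta - y_val) @ (X_val @ dbeta)


lam, eps = 1.0, 1e-6
fd = (val_loss(lam + eps) - val_loss(lam - eps)) / (2 * eps)
print(val_grad(lam), fd)  # the implicit gradient matches finite differences
```

The hypergradient computed this way can be fed to any first-order method to optimize the regularization parameter, which is the strategy the paper scales up to Lasso-type estimators.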
Joining? Open positions in my lab
I am always looking for outstanding and highly motivated people to join my team as interns, Ph.D. students, postdocs, or research engineers in the following areas:
- optimization for machine learning (including federated learning, privacy, etc.)
- high dimensional and robust statistics
I always have open positions for outstanding applicants (post-doc, Ph.D. thesis, internship).
The application process is light:
- Email me your CV and a transcript of your most recent grades (for interns and Ph.D. students), and explain in a paragraph why you are interested in joining my group.
- After preliminary feedback from me, I will ask you to secure two reference letters (one is enough for interns and Ph.D. students), to be sent directly to me.
- At this stage, an interview (possibly online) will be arranged to assess your skills and fit.