Recent Publications
A two-head loss function for deep Average-K classification
C. Garcin, M. Servajean, A. Joly and J. Salmon (2023).
[www]
Supervised learning of analysis-sparsity priors with automatic differentiation
H. Ghanem, J. Salmon, N. Keriven and S. Vaiter (2023).
IEEE Signal Process. Lett.
Local linear convergence of proximal coordinate descent algorithm
Q. Klopfenstein, Q. Bertrand, A. Gramfort, J. Salmon and S. Vaiter (2023).
Optimization Letters.
Peerannot: classification for crowd-sourced image datasets with Python
T. Lefort, B. Charlier, A. Joly and J. Salmon (2023).
[www]
High-Dimensional Private Empirical Risk Minimization by Greedy Coordinate Descent
P. Mangold, A. Bellet, J. Salmon and M. Tommasi (2023).
AISTATS
[www]
Implicit differentiation for fast hyperparameter selection in non-smooth convex learning
Q. Bertrand, Q. Klopfenstein, M. Massias, M. Blondel, S. Vaiter, A. Gramfort and J. Salmon (2022).
J. Mach. Learn. Res.
23: (1--43).
Spatially relaxed inference on high-dimensional linear models
J.-A. Chevalier, T. B. Nguyen, B. Thirion and J. Salmon (2022).
Statistics and Computing
32: (1--15).
Stochastic smoothing of the top-K calibrated hinge loss for deep imbalanced classification
C. Garcin, M. Servajean, A. Joly and J. Salmon (2022).
ICML
[Code]
Identify ambiguous tasks combining crowdsourced labels by weighting Areas Under the Margin
T. Lefort, B. Charlier, A. Joly and J. Salmon (2022).
Differentially Private Coordinate Descent for Composite Empirical Risk Minimization
P. Mangold, A. Bellet, J. Salmon and M. Tommasi (2022).
ICML
Full list of publications.
Contact
Email: joseph"dot"salmon"dot"taff@gmail"dot"com
Address:
IMAG, c.c. 051
Université de Montpellier
Place Eugène Bataillon
34095 Montpellier Cedex 5
(office 415, building 9)
News
- April 2023: Organizer of STATLEARN 2023.
- November 2022: Organizer of the ML4LifeSciences workshop (https://ml4lifesciences.sciencesconf.org/), November 15-17.
- September 2022: BenchOpt paper accepted at NeurIPS 2022!
- March 2022: Visitor at the Simons Institute for the Theory of Computing.
- August 2021: New dataset available: PlantNet-300K (https://github.com/plantnet/PlantNet-300K), a subset of the Pl@ntNet database (https://plantnet.org/) with more than 300k images and thousands of classes (plant species). See the NeurIPS Datasets and Benchmarks Track paper (pdf).
- July 2021: IUF Nomination (junior member): https://www.iufrance.fr/detail-de-lactualite/247.html
- November 2020: Launching ML-MTP: Machine Learning in Montpellier, Theory & Practice.
- October 2020: Talk at the GdR MIA thematic day on non-convex sparse optimization (Toulouse, France): "Screening Rules for Lasso with Non-Convex Sparse Regularizers" (slides), with a short introduction to BenchOpt, a new package that simplifies comparisons between optimization algorithms and makes them more transparent and reproducible.
- June 2020: New paper accepted at ICML: "Implicit differentiation of Lasso-type models for hyperparameter optimization", with Q. Bertrand, Q. Klopfenstein, M. Blondel, S. Vaiter and A. Gramfort (code).
- January 2020: New paper accepted at AISTATS: "Support recovery and sup-norm convergence rates for sparse pivotal estimation", with Q. Bertrand, M. Massias and A. Gramfort (code).
- December 2019: The AI chair proposal CaMeLOt (CooperAtive MachinE Learning and OpTimization) has been selected by the ANR. Open Ph.D. and postdoc positions will be published soon (location: Montpellier). Do not hesitate to contact me for additional information.
- September 2019: New paper accepted at NeurIPS 2019, joint work with Q. Bertrand, M. Massias and A. Gramfort: "Handling correlated and repeated measurements with the smoothed Multivariate square-root Lasso" (arXiv, code).
- May 2019: Workshop in Montpellier: "Graph signals: learning and optimization perspectives".
- "Celer: a Fast Solver for the Lasso with Dual Extrapolation", accepted to ICML2018; code and companion paper called (pdf, slides).
Joint work with Mathurin Massias and Alexandre Gramfort
- classgrade: an open-source project for peer grading, mostly developed by C. Marini; in the process of being merged into Peeramid.
Team
Ph.D. Students
Alumni
- Damien Blanc [Ph.D. 2019-2022], co-supervised by Benjamin Charlier and funded by Quantacell
- Cassio Fraga Dantas [Post Doc, 2022]
- Florent Bascou [Ph.D. 2019-2022], co-supervised by Sophie Lèbre,
Manuscript: "Sparse linear model with quadratic interactions"
- Quentin Bertrand [Ph.D. 2018-2021] (now at Mila), co-supervised by Alexandre Gramfort,
Manuscript: "Hyperparameter selection for high dimensional sparse learning: application to neuroimaging"
- Nidham Gazagnadou [Ph.D. 2018-2021] (now at Sony AI), co-supervised by Robert Gower,
Manuscript: "Expected smoothness for stochastic variance-reduced methods and sketch-and-project methods for structured linear systems"
- Pierre-Antoine Bannier [Intern 2021], co-supervised by Alexandre Gramfort
- Jérôme-Alexis Chevalier [Ph.D. 2017-2020] (now Senior Data Scientist at Emerton Data), co-supervised by Bertrand Thirion,
Manuscript: "Statistical control of sparse models in high dimension"
- Mathurin Massias [Ph.D. 2016-2019] (now CR INRIA, Lyon), co-supervised by Alexandre Gramfort,
Manuscript: "Sparse high dimensional regression in the presence of colored heteroscedastic noise: application to M/EEG source imaging"
- Evgenii Chzhen [Ph.D. 2016-2019] (now CR CNRS, Saclay), co-supervised by Mohamed Hebiri,
Manuscript: "Plug-in methods in classification"
- Eugene Ndiaye [Ph.D. 2015-2018] (now post-doctorate at Georgia Tech), co-supervised by Olivier Fercoq,
Manuscript: "Safe optimization algorithms for variable selection and hyperparameter tuning"
- Jean Lafond [Ph.D. 2013-2016] (now at Cubist Systematic, UK), co-supervised by Éric Moulines,
Manuscript: "Complétion de matrice : aspects statistiques et computationnels" (Matrix completion: statistical and computational aspects)
- Igor Colin [Ph.D. 2013-2016] (now at Huawei), co-supervised by Stéphan Clémençon and funded by Streamwide,
Manuscript: "Adapting machine learning methods to U-statistics"
- Jair Montoya [Post Doc, 2016-2017], co-supervised by Olivier Fercoq
- Thierry Guillemot [Engineer, 2016] (now at ARIADNEXT), co-supervised by Alexandre Gramfort
Short Bio
I am a statistician and an applied mathematician, with a strong interest in machine learning, optimization and data science. In terms of applications, I focus on citizen science, crowdsourcing and high dimensional statistics.
Since 2018, I have been a full professor at Université de Montpellier and an associate member of the INRIA Parietal team.
For the spring and summer quarters of 2018, I was a visiting assistant professor at UW (Statistics Department).
From 2012 to 2018 I was an assistant professor at Telecom ParisTech.
Back in 2011 and 2012, I was a post-doctoral associate at Duke University, working with Rebecca Willett.
In 2010, I finished my Ph.D. in statistics and image processing under the supervision of Dominique Picard and Erwan Le Pennec at the Laboratoire de Probabilités et de Modélisation Aléatoire (now LPSM) at Université Paris Diderot.
Software
- BenchOpt: a package that simplifies comparisons between optimization algorithms and makes them more transparent and reproducible
- Celer: a fast Lasso solver, the code of the associated ICML 2018 paper "Celer: a Fast Solver for the Lasso with Dual Extrapolation" (pdf, slides); a usage sketch follows this list
- sparse-ho: a package for fast hyperparameter selection (e.g., efficiently tuning the Lasso regularization parameter), the code of the associated ICML 2020 paper "Implicit differentiation of Lasso-type models for hyperparameter optimization" (pdf)
- Matlab toolboxes for statistics and image processing (legacy)
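As a quick illustration, here is a minimal usage sketch for Celer, assuming it is installed (pip install celer) and relying on its scikit-learn-compatible interface; the synthetic data and the value of alpha below are made up for the example, not taken from this page.

import numpy as np
from celer import Lasso  # scikit-learn-compatible Lasso estimator

# Synthetic sparse regression problem (illustrative only).
rng = np.random.RandomState(0)
n_samples, n_features = 100, 500
X = rng.randn(n_samples, n_features)
w_true = np.zeros(n_features)
w_true[:5] = 1.0  # only 5 active features
y = X @ w_true + 0.01 * rng.randn(n_samples)

# alpha is the regularization strength; larger values yield sparser models.
model = Lasso(alpha=0.1)
model.fit(X, y)
print("number of nonzero coefficients:", np.sum(model.coef_ != 0))

Since the estimator follows the scikit-learn API, it can be combined with the usual model-selection tools (pipelines, cross-validation) without extra glue code.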
More on my GitHub page.
Joining? Open positions in my lab
I am always looking for outstanding and highly motivated people to join my team as interns, Ph.D. students, postdocs or research engineers in the following areas:
- citizen science and crowdsourcing
- optimization for machine learning (including federated learning, privacy, etc.)
- high dimensional and robust statistics
I always have open positions for outstanding applicants (post-doc, Ph.D. thesis, internship). The application process is light:
- Email me your CV and a transcript of your most recent grades (for interns and Ph.D. students), and explain in a paragraph why you are interested in joining my group.
- After preliminary feedback on my side, I will ask you to secure two reference letters (one is enough for interns or Ph.D. students) to be sent directly to me.
- At this stage, an interview (possibly online) will be arranged to assess your skills and fit with the group.
Talks
Full list of talks, with slides and videos when recorded.
Miscellaneous
Here (Miscellaneous), you will find some (math)art and other distractions.