I am a research scientist at DeepMind, which I joined at the end of 2020. I work in the Paris office.
I was previously a post-doctoral researcher at École Normale Supérieure, Paris, in Gabriel Peyré’s lab. I hold a Ph.D. in machine learning, prepared in the Inria Parietal team from 2015 to 2018.
I am currently interested in optimization and large-scale deep learning, and I remain interested in structured prediction, optimal transport, and game theory.
My Ph.D. was obtained under the supervision of Gaël Varoquaux, Julien Mairal and Bertrand Thirion. I developed new stochastic algorithms and multi-task models for the analysis of terabyte-scale fMRI datasets.
- 11/20 I joined DeepMind as a Research Scientist.
- 09/20 Our work on estimating optimal transport distances from streams of samples was accepted at NeurIPS 2020.
- 09/20 Our work on estimating mixed Nash equilibria with continuous strategies was accepted at NeurIPS 2020.
- 03/20-06/20 I worked with Inria and AP-HP to help process data from the Covid-19 pandemic in the Paris region.
- 02/20 Our work on accelerating Nash equilibrium finding, with applications to GANs, was accepted at ICML 2020.
- 05/19 Our paper Geometric Losses for Distributional Learning was accepted at ICML 2019 (Paper) (Slides) (Poster)
- 03/19 I am visiting Joan Bruna at NYU CIMS for four months to observe the new moons.
- 01/19 Slides for Differentiable Dynamic Programming for Structured Prediction and Attention (Paper).
- 11/18 I started working as a post-doctoral researcher with Gabriel Peyré at ENS, Paris.
- 09/18 I defended my PhD thesis (Slides, Thesis) at NeuroSpin.
- S. Jelassi, C. Domingo Enrich, D. Scieur, A. Mensch, and J. Bruna, “Extrapolation with player sampling for provable fast convergence in n-player games,” May 2019. PDF
- A. Mensch, M. Blondel, and G. Peyré, “Geometric losses for distributional learning,” in Proceedings of the International Conference on Machine Learning, Feb. 2019. PDF
- A. Mensch, J. Mairal, B. Thirion, and G. Varoquaux, “Extracting universal representations of cognition across brain-imaging studies,” Sep. 2018. PDF
- A. Mensch and M. Blondel, “Differentiable dynamic programming for structured prediction and attention,” in Proceedings of the International Conference on Machine Learning, Jul. 2018, pp. 3462–3471. PDF
- A. Mensch, J. Mairal, B. Thirion, and G. Varoquaux, “Stochastic subsampling for factorizing huge matrices,” IEEE Transactions on Signal Processing, vol. 66, no. 1, pp. 113–128, Jan. 2018. PDF
- A. Mensch, “Learning representations from functional MRI data,” PhD thesis, Université Paris-Saclay, 2018. PDF
- A. Mensch, J. Mairal, D. Bzdok, B. Thirion, and G. Varoquaux, “Learning neural representations of human cognition across many fMRI studies,” in Advances in Neural Information Processing Systems, Dec. 2017, pp. 5885–5895. PDF
- E. Dohmatob, A. Mensch, G. Varoquaux, and B. Thirion, “Learning brain regions via large-scale online structured sparse dictionary learning,” in Advances in Neural Information Processing Systems, Dec. 2016, pp. 4610–4618. PDF
- A. Mensch, J. Mairal, B. Thirion, and G. Varoquaux, “Dictionary learning for massive matrix factorization,” in Proceedings of the International Conference on Machine Learning, Jun. 2016, pp. 1737–1746. PDF
- A. Mensch, G. Varoquaux, and B. Thirion, “Compressed online dictionary learning for fast resting-state fMRI decomposition,” in IEEE International Symposium on Biomedical Imaging, Apr. 2016. PDF
- A. Mensch, E. Piuze, L. Lehnert, A. J. Bakermans, J. Sporring, G. J. Strijkers, and K. Siddiqi, “Connection forms for beating the heart,” in MICCAI Workshop on Statistical Atlases and Computational Models of the Heart, Sep. 2014.
- Geometric losses for distributional learning Paper
- June 2019: ICML (Long Beach, USA) Slides
- Differentiable Dynamic Programming for Structured Prediction and Attention Paper
- April 2019: Courant Institute, NYU (New York, USA) Slides, hosted by Joan Bruna
- January 2019: Google (Zürich, Switzerland) Slides
- July 2018: ICML (Stockholm, Sweden) Slides
- April 2018: Facebook Artificial Intelligence Research (Paris, France) Slides
- March 2018: Deep Learning Meetup (Paris, France) Slides
- Learning representations from functional MRI Data Thesis
- November 2018: Laplace Seminar, ENS, Paris, hosted by Nicolas Keriven Slides
- September 2018: PhD defense Slides
- Stochastic Subsampling for Factorizing Huge Matrices Paper
- July 2018: ISMP 2018 (Bordeaux, France), invited by Robert Gower Slides
- April 2018: Aix-Marseille Université, invited by Caroline Chaux-Moulin Slides
- January 2018: ENS Ulm (Paris, France), invited by Florent Krzakala Slides
- Learning Neural Representations of Human Cognition across Many fMRI Studies
- November 2017: ATR (Kyoto, Japan), invited by Okito Yamashita.
- Massive Matrix Factorization for Resting-State fMRI Decomposition
- July 2017: ISI (Marrakech, Morocco), invited by David Degras. Slides
- June 2016: ICML (New York, USA). Video / SlideShare / Slides
- Massive Matrix Factorization: Application to Collaborative Filtering
- October 2016: RecSys FR (Paris, France). Video / SlideShare / Slides
- ICML 2019: Geometric Losses for Distributional Learning
- ICML 2018: Differentiable Dynamic Programming for Structured Prediction and Attention
- OPT 2017, SMAI-MODE 2018: Stochastic Subsampling for Factorizing Huge Matrices
- ICML 2016: Dictionary Learning for Massive Matrix Factorization
- ISBI 2016: Compressed Online Dictionary Learning for Fast fMRI Decomposition
- OHBM 2016: Compressed Online Dictionary Learning for Fast fMRI Decomposition
I maintain modl, a package that provides fast algorithms for sparse and dense matrix factorization, with a scikit-learn-compatible API.
I maintain cogspaces, a package for multi-study decoding in functional MRI. This new analysis approach aggregates brain-mind associations from many fMRI sources to extract functional networks relevant for mental-state prediction.
I am a regular contributor to the scikit-learn library, a widely used Python library for general machine learning.
I also contribute to nilearn, a Python library that leverages scikit-learn to perform neuroimaging analysis.
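The packages above follow scikit-learn's estimator conventions (`fit` / `transform`, hyperparameters set in the constructor). As a minimal sketch of that API pattern, here is scikit-learn's own `NMF` factorizing a small non-negative matrix; modl's estimator class names are not assumed here, only the shared convention:

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy non-negative data: 6 samples, 4 features.
X = np.abs(np.random.RandomState(0).randn(6, 4))

# Hyperparameters are passed to the constructor; fitting is a method call.
model = NMF(n_components=2, init="random", random_state=0, max_iter=500)
W = model.fit_transform(X)   # per-sample loadings, shape (6, 2)
H = model.components_        # learned dictionary, shape (2, 4)
```

The same `fit_transform` / `components_` idiom applies across scikit-learn-compatible factorization estimators, which is what makes them easy to swap into existing pipelines.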
Deep learning (M2 Data Science - Polytechnique / Paris-Saclay)
- Slides and labs for the course I was a TA for.
- I gave a three-notebook tutorial on PyTorch.
Numerical analysis (ENSAE 1A)
- Python notebooks with (some) corrected exercises are available in this repository: either use git to check it out or download it as a zip file.
- I highly recommend looking at the Python notebooks from Numerical Tours of Data Sciences for those of you who want to dig further into optimization/data science/graphics.
- Scipy Lecture Notes is an excellent resource for improving your skills in numpy and matplotlib, among other useful libraries for numerical analysis and data science.
- If you want to work from your laptop, I recommend that you install Anaconda for your OS.
- If you do not like the Spyder IDE, try working directly from Jupyter notebooks: you may prefer this workflow.