François Lanusse

Cosmologist / Astrostatistician @ CNRS, member of the CosmoStat Laboratory near Paris, France.

I am a CNRS researcher working at the intersection of Deep Learning, Statistical Modeling, and Observational Cosmology. I am particularly interested in combining tools and methodologies from the field of Machine Learning (automatic differentiation, deep generative models, score-based high-dimensional inference) with physical modeling for the analysis of Stage IV Cosmological Surveys.

I am currently co-Chair of the LSST Informatics & Statistics Science Collaboration (LSST ISSC) and actively involved in the LSST Dark Energy Science Collaboration (LSST DESC).

Before my CNRS position, I was a postdoctoral fellow at the Berkeley Center for Cosmological Physics (BCCP) and with the Foundations of Data Analysis (FODA) institute at UC Berkeley, working with Prof. Uroš Seljak. Before that, I was a postdoctoral researcher in the McWilliams Center for Cosmology at Carnegie Mellon University, where I worked with Prof. Rachel Mandelbaum on weak gravitational lensing measurements and systematics, and interacted with both the Statistics and Machine Learning departments while at CMU.

news

Jan 30, 2023

Open PhD position at CEA Saclay to work with me on using generative modeling and autodiff to measure cosmic shear from space 🛰️. Application deadline March 10th.

Dec 12, 2022

Very cool project 😎 led by Yesukhei Jagvaral in collaboration with Rachel Mandelbaum on developing diffusion models for the SO(3) Lie group to model galaxy intrinsic orientations in simulations. Full ML paper on this coming soon.

Nov 26, 2022

Pre-release of jaxDecomp, a JAX library providing bindings to NVIDIA’s cuDecomp adaptive pencil decomposition library for efficient multi-node/multi-GPU 3D FFTs and halo exchanges.

Nov 15, 2022

Multiple papers accepted in the Machine Learning and the Physical Sciences Workshop at NeurIPS 2022 🎉

Nov 7, 2022

New astro paper on Differentiable Halo Occupation Distributions led by Ben Horowitz, demonstrating that a model being stochastic and involving discrete variables does not stop you from backpropagating through it 😉
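To give a flavor of the general idea (this is a generic illustration, not the method of the paper above): relaxation tricks such as Gumbel-Softmax replace a hard categorical draw with a differentiable softmax over Gumbel-perturbed logits, so gradients of a stochastic objective with respect to its parameters remain well defined. A minimal sketch in JAX, with a made-up toy payoff vector:

```python
import jax
import jax.numpy as jnp

def gumbel_softmax_sample(key, logits, temperature=0.5):
    # Perturb the logits with Gumbel noise, then relax the argmax
    # into a softmax: the output is a "soft" one-hot vector.
    gumbel = -jnp.log(-jnp.log(jax.random.uniform(key, logits.shape)))
    return jax.nn.softmax((logits + gumbel) / temperature)

def stochastic_objective(logits, key):
    # Toy objective: expected payoff over 3 discrete outcomes
    # (the payoff values are arbitrary, for illustration only).
    payoffs = jnp.array([1.0, 2.0, 3.0])
    sample = gumbel_softmax_sample(key, logits)
    return jnp.dot(sample, payoffs)

key = jax.random.PRNGKey(0)
logits = jnp.zeros(3)
# Despite the discrete structure of the underlying choice,
# gradients w.r.t. the logits flow through the relaxed sample.
grads = jax.grad(stochastic_objective)(logits, key)
```

Lowering the temperature makes the relaxed samples closer to true one-hot draws, at the cost of noisier gradients.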

selected publications

  1. Differentiable Stochastic Halo Occupation Distribution
    Benjamin Horowitz, ChangHoon Hahn, François Lanusse, and 2 more authors
    arXiv e-prints Nov 2022
  2. The DAWES review 10: The impact of deep learning for the analysis of galaxy surveys
    Marc Huertas-Company, and François Lanusse
    arXiv e-prints Oct 2022
  3. Bayesian uncertainty quantification for machine-learned models in physics
    Yarin Gal, Petros Koumoutsakos, François Lanusse, and 2 more authors
    Nature Reviews Physics Aug 2022
  4. Neural Posterior Estimation with Differentiable Simulators
    Justine Zeghal, François Lanusse, Alexandre Boucaud, and 2 more authors
    arXiv e-prints Jul 2022
  5. Probabilistic Mass Mapping with Neural Score Estimation
    Benjamin Remy, François Lanusse, Niall Jeffrey, and 4 more authors
    arXiv e-prints Jan 2022