Welcome! I am a post-doc at the École normale supérieure in Paris, where I work on problems at the interface of theoretical physics and machine learning with Florent Krzakala and Lenka Zdeborová.

The main theme of my current research is to understand why neural networks are able to generalise well from examples in practice, when classical learning theory would predict that they cannot. My approach is to use concepts and tools from statistical physics to build models for the key drivers of generalisation of neural networks. I am also interested in using machine learning as a tool to handle the vast data sets generated by large-scale experiments, in particular in neuroscience; and as a source of novel theoretical ideas, e.g. for the thermodynamics of computation.

On this site, you will find some recent talks I have given below, a list of my publications, an overview of my teaching and outreach activities, and a brief CV.

Recent News

  • 2019/12 It’s NeurIPS time.
  • 2019/10 One of the things that’s holding back a good theoretical understanding of deep learning is our lack of good models for structured data sets. I had the opportunity to present our proposal with Marc Mézard, Florent Krzakala and Lenka Zdeborová at the Deep Learning Workshop in Princeton (preprint here).

  • 2019/06 I had the chance to talk about our recent work on the dynamics and the performance of neural networks at the “Theoretical Physics For Deep Learning” workshop at this year’s ICML. Here’s a link to the video.
  • Watch the video abstract for our recent article in the New Journal of Physics on the thermodynamic efficiency of learning a rule with neural networks.

  • Or have a look at a short news article on Phys.org about our PRL on the stochastic thermodynamics of learning.

Brief CV