Louis Martin

Research Scientist in Artificial Intelligence

Meta AI

About Me

I am a Research Scientist in Artificial Intelligence working on Natural Language Processing at Meta AI.

I completed my PhD at Facebook AI Research on text simplification and on using AI to help people with cognitive disabilities.

I consider myself part of the Effective Altruism (EA) community and I’m very keen on maximizing my positive impact on the world.

Download my detailed résumé.

  • AI Alignment & Safety
  • Natural Language Processing
  • Language Modeling
  • Text Simplification
  • PhD in Artificial Intelligence, 2021

    Facebook AI Research & Inria

  • Master of Science in Mathematics, Vision and Machine Learning (MVA), 2017

    École Normale Supérieure Paris-Saclay

  • Master of Engineering in Applied Mathematics, 2017

    École Centrale Paris


Research Scientist
Apr 2023 – Present Paris, France
  • Research on the safety of Large Language Models; technical lead.
Research Scientist
Aug 2021 – Present London, UK
  • Applied research on text understanding to make recommender systems more relevant and interpretable.
  • Led a project on large-scale weakly supervised pretraining of NLP models on social media data.
  • Explored how retrieval-augmented models can provide richer text representations.
PhD Student
May 2018 – Aug 2021 Paris, France
  • Created a demo based on my research in Text Simplification that was presented to Mark Zuckerberg. It was also showcased by Facebook’s CTO at the company-wide All Hands conference, in front of 15k employees.
  • My research required challenging large-scale engineering: I trained the CamemBERT language model in parallel on 256 GPUs on 138 GB of training text, and I scaled my text-mining pipeline, built on neural semantic embeddings, to 1 billion sentences by optimizing disk I/O, CPU preprocessing, and GPU nearest-neighbor search, running on up to 1,000 GPUs with preemption-aware logic.
  • My latest text simplification method achieves state-of-the-art results on all simplification benchmarks while being unsupervised.
  • Integrated my code and models into Facebook’s codebase and infrastructure; my code is currently being ported to production.
  • Led the successful release of our open-source CamemBERT model (>300k downloads/month, with industry applications).
  • Presented my research in person to the French Minister for People with Disabilities (my research is core to the Cap’FALC project, involving the French Government, the UNAPEI association, Inria, and FAIR).
  • My work was featured in 9 news articles and podcasts, and I gave 3 interviews (VentureBeat, Le Monde, France Culture, ActuIA, …).
NLP Research Intern
May 2017 – Nov 2017 Paris, France
  • Integrated a neural conversational model in Facebook Messenger (PHP and Python) as an internal chatbot demo.
  • Trained a neural model for question generation on 100k question/answer pairs as an additional module to the chatbot.
Computer Vision R&D Intern
Feb 2016 – Jul 2016 Paris, France
  • Developed and trained a compact CNN architecture (~3 MB) with state-of-the-art face recognition performance.
  • Implemented on-device neural inference for Android (Java and TensorFlow).
  • My work led to the submission and granting of a US patent (sole inventor) on compact neural models.

Other Publications

Efficient Large Scale Language Modeling with Mixtures of Experts
Mixture of Experts layers (MoEs) enable efficient scaling of language models through conditional computation. This paper presents a …
Rethinking Automatic Evaluation in Sentence Simplification
Reference-less Quality Estimation of Text Simplification Systems