Samuel Holt

PhD Researcher

University of Cambridge

Biography

Hi, I am currently a Research Scientist intern at Google DeepMind. I am also a fourth-year Ph.D. student in Machine Learning at the University of Cambridge, advised by Mihaela van der Schaar in the Machine Learning and Artificial Intelligence group. To date, I have published nine papers at top-tier ML conferences (NeurIPS [spotlight], ICML [long oral], ICLR [spotlight], and AISTATS).

Interests
  • Robotics
  • Large Language Models (LLMs)
  • LLM Code Generation
  • Transformer Architectures
  • Reinforcement Learning (model-free and model-based)
  • Generative Models
  • Control
  • Time Series Forecasting
  • Symbolic Regression (discovery)
  • Continuous-time models (ODEs and Laplace transforms)
  • Applications to healthcare and scientific discovery
Education
  • PhD in Machine Learning, 2021 – 2025

    University of Cambridge

  • MEng in Engineering Science, 2013 – 2017

    University of Oxford

Academic Service & Volunteering

Reviewer
October 2021 – Present
  • Conference & Journal Reviewer: ICML 2022, AISTATS 2023, NeurIPS 2023, ICLR 2024, ICML 2024, Nature Machine Intelligence 2024, NeurIPS 2024.
  • Workshop Reviewer: ICLR 2023 AI4ABM, NeurIPS 2022 & 2023 SyntheticData4ML.

Machine Learning Course Teacher
Online Course | Packt Publishing
October 2019 – June 2020

Sole author and teacher of a Machine Learning and Deep Learning video course.

  • Authored a nine-chapter, 10.5-hour ML video course covering the theory of, and code examples for, Machine Learning and Deep Learning across Supervised Learning, Unsupervised Learning, and Reinforcement Learning.
  • Covered Deep Learning for Computer Vision (GANs, VAEs, Style Transfer, Semantic Segmentation, CNNs), NLP, RNNs, Sequence-to-Sequence models, Transformers, Time Series Forecasting, and Deep Reinforcement Learning (MDPs, Q-Learning, value- and policy-based methods, Multi-Armed Bandits, Inverse RL, and Model-Based RL, e.g. AlphaZero); a minimal Q-learning sketch in this spirit follows the list below.
  • All teaching Jupyter notebooks are available online; also contributed to open-source ML projects, including TensorFlow Core and OpenAI libraries, as well as ML Wikipedia pages.
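
Below is a minimal, hypothetical sketch of the kind of reinforcement learning material covered in the course: tabular Q-learning on a toy five-state corridor MDP. It is illustrative only, not code from the course; the environment, hyperparameters, and the `step` helper are invented for this example.

```python
import numpy as np

# Hypothetical toy corridor MDP: states 0..4, actions 0 = left, 1 = right.
# Taking 'right' in the last state yields reward 1 and resets to state 0.
n_states, n_actions = 5, 2
alpha, gamma, epsilon = 0.1, 0.9, 0.1  # learning rate, discount, exploration
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def step(state, action):
    """Deterministic transition; reward is given only at the right-hand edge."""
    if action == 1 and state == n_states - 1:
        return 0, 1.0  # goal reached: reset to the start with reward 1
    if action == 1:
        return min(state + 1, n_states - 1), 0.0
    return max(state - 1, 0), 0.0

state = 0
for _ in range(5000):
    # Epsilon-greedy action selection.
    if rng.random() < epsilon:
        action = int(rng.integers(n_actions))
    else:
        action = int(Q[state].argmax())
    next_state, reward = step(state, action)
    # Q-learning update: bootstrap from the greedy value of the next state.
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print(Q.round(2))  # the 'right' column should dominate in every state
```

After enough steps the learned Q-values prefer moving right in every state, the standard textbook behaviour this kind of course example is meant to demonstrate.
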
Data Science Teacher
Founders Academy
February 2020
Week-long course introducing data science and machine learning to a classroom of 20 students.

Skills

Programming

Python, JavaScript, TypeScript, Rust, MATLAB, Bash, SQL, C, C++

Libraries

JAX, TensorFlow, PyTorch, Keras, NumPy, SciPy, pandas, asyncio, NLTK, Jupyter, pytest

Software

git, Linux, LaTeX, Google Cloud Platform, Amazon Web Services, Docker, GitLab CI

Invited Talks

Deep Learning in the Laplace Domain
AZ: Scientific Discovery through Scaling Symbolic Regression
Inspiration Exchange: Neural Differential Equations

Recent Publications

(2024). Automatically Learning Hybrid Digital Twins of Dynamical Systems. (NeurIPS 2024).

(2024). Discovering Preference Optimization Algorithms with and for Large Language Models. (NeurIPS 2024).

(2024). L2MAC: Large Language Model Automatic Computer for Extensive Code Generation. (ICLR 2024).

(2024). ODE Discovery for Longitudinal Heterogeneous Treatment Effects Inference. (ICLR 2024) [Spotlight, top 5% of papers].

(2023). Active Observing in Continuous-time Control. (NeurIPS 2023).

(2023). Deep Generative Symbolic Regression. (ICLR 2023).

(2023). Neural Laplace Control for Continuous-time Delayed Systems. (AISTATS 2023).

(2022). Neural Laplace: Learning diverse classes of differential equations in the Laplace domain. (ICML 2022) [Long Oral, top 2% of papers].

Contact