Noel Loo

PhD Student
Machine Learning
Massachusetts Institute of Technology

Email  /  Google Scholar  /  GitHub

About Me

I am a PhD student at the MIT Computer Science & Artificial Intelligence Lab (CSAIL), advised by Prof. Daniela Rus and Dr. Ramin Hasani. I am interested in making deep learning more efficient, particularly by reducing the data requirements of large models. Recently, I have been working on the dataset distillation problem and on how recent insights into the training dynamics of deep networks can be used to tackle it.

Previously, I obtained an MEng and a BA in Engineering from the University of Cambridge, where I worked with Prof. Richard E. Turner on probabilistic models for continual learning.

Publications
Generalized Variational Continual Learning
Noel Loo, Siddharth Swaroop, Richard E. Turner
ICLR, 2021
arXiv

We show that two popular continual learning algorithms, Variational Continual Learning (VCL) and Online Elastic Weight Consolidation (Online EWC), can be generalized into a single unified algorithm, Generalized Variational Continual Learning (GVCL). Additionally, we introduce FiLM layers as an architectural change to combat the over-pruning effect in VCL.
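To illustrate the architectural idea: a FiLM layer applies a task-specific, feature-wise affine transform to the activations of a shared layer. The sketch below is a minimal NumPy illustration of that transform only; the class name, shapes, and initialization are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

class FiLMLayer:
    """Feature-wise Linear Modulation: a per-task affine transform
    applied channel-wise to shared activations. Each task gets its
    own (gamma, beta) pair, so shared weights need not be pruned to
    accommodate new tasks. Hypothetical minimal sketch."""

    def __init__(self, num_features, num_tasks):
        # gamma initialized to 1 and beta to 0, so each task starts
        # as the identity transform on the shared activations.
        self.gamma = np.ones((num_tasks, num_features))
        self.beta = np.zeros((num_tasks, num_features))

    def __call__(self, x, task_id):
        # x: (batch, num_features) activations from a shared layer;
        # broadcasting applies the task's scale and shift per feature.
        return self.gamma[task_id] * x + self.beta[task_id]
```

In a continual learning setting, only the (gamma, beta) pair for the current task would be trained alongside the shared weights, which is what lets the modulation absorb task-specific changes.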
