I am a final-year PhD candidate in the Computational and Biological Learning lab at the University of Cambridge, supervised by Prof Richard E Turner and advised by Prof Carl Rasmussen. I am interested in designing algorithms for large-scale machine learning systems that learn sequentially without revisiting past data, preserve the privacy of user data, and are uncertainty-aware.
I am funded by an EPSRC DTP award. I also hold a Microsoft Research EMEA PhD Award and an Honorary Vice-Chancellor’s Award from the Cambridge Trust.
I am interested in applying Bayesian methods to large neural network models, with the aim of understanding and using them in challenging scenarios. Recently, this has included two machine learning fields that go beyond the common assumption of having all data available at once: continual / lifelong learning (data arrives sequentially and past data cannot all be revisited), and federated learning (data is split among many different clients). Both are examples of constraints that deployed algorithms may have to face in the real world. An example application of federated learning is training a global healthcare model on patient data from different hospitals, where sensitive patient data must not leave local hospital databases.
I have spent time characterising and improving approximate Bayesian inference techniques, usually variational inference (Swaroop et al., 2018; Tomczak et al., 2018; Osawa et al., 2019; Tomczak et al., 2020). I have applied these techniques to continual learning (Swaroop et al., 2018; Bui et al., 2018; Osawa et al., 2019; Pan et al., 2020; Loo et al., 2021) and federated learning (Bui et al., 2018; Sharma et al., 2019). I have been exploring the links and commonalities between these two fields, and continue to actively work in this area. Recently I have also been studying Bayesian neural networks in function space (Pan et al., 2020).
|Apr 2021||I am writing a 2-part blog on natural-gradient variational inference, due to be published in early April.|
|Jan 2021||Paper at ICLR 2021, Generalized Variational Continual Learning.|
|Dec 2020||Oral presentation at NeurIPS 2020, Continual Deep Learning by Functional Regularisation of Memorable Past (top 1% of submissions, 105/10K). Paper at NeurIPS 2020, Efficient Low Rank Gaussian Variational Inference for Neural Networks.|
|Jul 2020||Oral at LifeLongML Workshop (ICML 2020), Combining Variational Continual Learning with FiLM Layers. Second oral at LifeLongML Workshop (ICML 2020), Continual Deep Learning by Functional Regularisation of Memorable Past.|
|Jun 2020||I have been awarded a Microsoft Research EMEA PhD Award to fund my research on function-space Bayesian neural networks for continual learning and federated learning. Also see news article.|
|Dec 2019||Paper at NeurIPS 2019, Practical Deep Learning with Bayesian Principles.|
|Dec 2018||Oral at Continual Learning Workshop (NeurIPS 2018), Improving and Understanding Variational Continual Learning. Spotlight at Bayesian Deep Learning Workshop (NeurIPS 2018), Partitioned Variational Inference: A unified framework encompassing federated and continual learning. Paper at Advances in Approximate Bayesian Inference Symposium (2018), Neural network ensembles and variational inference revisited.|
|Jun 2018||Internship at Microsoft Research, Cambridge, supervised by John Winn and Martin Kukla, on knowledge-base construction.|
|Oct 2017||Started my PhD with Professor Richard Turner at the Machine Learning Group.|