About Me
I am currently a Research Fellow working with Professor Arthur Gretton in the Gatsby Unit at University College London. Before joining UCL, I obtained my PhD in the Department of Statistics at the University of Oxford, where I worked with Professor Dino Sejdinovic.
Research Interests
My research interests include:
- Kernel Methods
- Learning Theory
- Fast Kernel Approximation
- Meta Learning
- Nonparametric Statistics
Publications
(* indicates equal contribution)
Optimal rates for regularized conditional mean embedding learning. Zhu Li*, Dimitri Meunier*, Mattes Mollenhauer, and Arthur Gretton. Advances in Neural Information Processing Systems 35 (2022): 4433-4445. (Oral Presentation)
Benign overfitting and noisy features. Zhu Li, Weijie J. Su, and Dino Sejdinovic. Journal of the American Statistical Association (2022): 1-13.
Sharp Analysis of Random Fourier Features in Classification. Zhu Li. In Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, no. 7, pp. 7444-7452. 2022.
Kernel dependence regularizers and Gaussian processes with applications to algorithmic fairness. Zhu Li, Adrián Pérez-Suay, Gustau Camps-Valls, and Dino Sejdinovic. Pattern Recognition 132 (2022): 108922.
Towards an understanding of benign overfitting in neural networks. Zhu Li, Zhi-Hua Zhou, and Arthur Gretton. arXiv preprint arXiv:2106.03212 (2021).
Towards a unified analysis of random Fourier features. Zhu Li, Jean-Francois Ton, Dino Oglic, and Dino Sejdinovic. The Journal of Machine Learning Research 22, no. 1 (2021): 4887-4937.
Towards a unified analysis of random Fourier features. Zhu Li, Jean-Francois Ton, Dino Oglic, and Dino Sejdinovic. In International Conference on Machine Learning, pp. 3905-3914. PMLR, 2019. (Best Paper Honourable Mention)