Song Liu (柳松)
Assoc. Professor in Data Science and AI,
School of Mathematics,
University of Bristol.
Office GA.18
Fry Building,
Woodland Road, BS8 1UG.
Hobby Project:
Juzhen, a C++ library for fast numerical computation and neural net applications.
TL;DR
I am a statistical machine learning researcher and have mostly been working on Density Ratio Estimation. My vision is that, by comparing two density functions, many machine learning problems can be solved more elegantly and efficiently. You can read more about this vision in Machine Learning via Statistical Discrepancies.
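The idea of working with a ratio of two densities can be illustrated with the classic "density ratio via classification" trick: a probabilistic classifier trained to separate samples drawn from p and from q recovers log p(x)/q(x) as its logit (for equal sample sizes). A minimal NumPy sketch, where the Gaussian toy data, learning rate, and iteration count are purely illustrative choices of mine, not anything from this page:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
xp = rng.normal(0.0, 1.0, n)   # samples from p = N(0, 1)
xq = rng.normal(0.5, 1.0, n)   # samples from q = N(0.5, 1)

# Features [x, 1]: a linear logit is exact for two equal-variance Gaussians.
X = np.stack([np.concatenate([xp, xq]), np.ones(2 * n)], axis=1)
y = np.concatenate([np.ones(n), np.zeros(n)])  # label 1 = drawn from p

# Plain gradient descent on the logistic loss.
w = np.zeros(2)
for _ in range(2000):
    s = 1.0 / (1.0 + np.exp(-X @ w))   # P(label = 1 | x)
    w -= 0.1 * X.T @ (s - y) / len(y)

def log_ratio(x):
    """Estimated log p(x)/q(x): the classifier's logit at x."""
    return w[0] * x + w[1]

# For these two Gaussians the true log-ratio is -0.5*x + 0.125,
# so log_ratio(0.0) should be close to 0.125.
```

The same estimated ratio can then be plugged into downstream tasks (importance weighting, two-sample comparison, change detection) without ever modelling p or q separately.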
I am happy to accept new PhD students. If you are interested in working on Machine Learning at Bristol, you may find several funding opportunities, such as Prob_AI, Informed AI and the CSC scholarship.
Recent Publications (* denotes postgraduate/undergraduate students I supervise)
- (NeurIPS2025), Yu, J.*, Ying, Q., Wang, L., Jiang, Z., Liu, S., Missing Data Imputation by Reducing Mutual Information with Rectified Flows, NeurIPS2025, arxiv
- (NeurIPS2025), Khoo, S.*, Wang, Y.*, Liu, S., Beaumont, M., Direct Fisher Score Estimation for Likelihood Maximization, NeurIPS2025, arxiv
- Spotlight, Top 3%
- (UAI2025), Liu, S., Wang, L.*, Wang, Y.*, Guiding Time-Varying Generative Models by Natural Gradients on Exponential Family Manifold, UAI2025, arxiv, 2025.
- Outstanding long paper, at Deep Generative Model in Machine Learning: Theory, Principle and Efficacy workshop at ICLR2025.
- (ICML2025), Givens, J. *, Liu, S., Reeve, H., Score Matching with Missing Data, ICML2025.
- (TPEL2025), Wang, Y.*, Liu, S., Wang, J., Cui, B., Yang, J., Machine Learning Based Probe Skew Correction for High-frequency BH Loop Measurements, IEEE Transactions on Power Electronics, arxiv, 2025.
- (AISTATS2025), Williams, D. J.*#, Wang, L.*#, Ying, Q.*, Liu, S., Kolar, M., High-Dimensional Differential Parameter Inference in Exponential Family using Time Score Matching, AISTATS2025, arxiv. #Equal Contribution.
- (BDU workshop at NeurIPS2024), Wang, Y.*, Khoo, S.*, Liu, S., Lightspeed Black-box Bayesian Optimization via Local Score Matching, NeurIPS 2024 Workshop on Bayesian Decision-making and Uncertainty, openreview.
- (NeurIPS2024), Givens, J.*, Reeve, H., Liu, S., Reluga, K., Conditional Outcome Equivalence: A Quantile Alternative to CATE, NeurIPS2024, arxiv.
- (ICML2024), Liu, S., Yu, J., Simons, J.*, Yi, M.*, Beaumont, M., Minimizing $f$-Divergences by Interpolating Velocity Fields, ICML2024, arxiv.
- (ICML2024), Sharrock, L.#, Simons, J.#*, Liu, S., Beaumont, M., Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models, ICML2024, arxiv. #Equal contribution.
- Spotlight paper, Top 3.5%
- (OE2024), Zhang, Y.*, Pan, Z., Macdonald, J.H.G., Liu, S., Harper, P., Structural damage detection based on multivariate probability density functions of vibration data of offshore wind foundations with comparison studies, Ocean Engineering, link.
- (AABI2023), Sharrock, L., Simons, J.*, Liu, S., Beaumont, M., Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models, AABI2023, arxiv.
- (ICML2023), Williams, D. J.*, Liu, S., Approximate Stein Classes for Truncated Density Estimation, ICML2023, arxiv.
- (ICML2023), Yi, M.*, Zhu, Z., Liu, S., MonoFlow: Rethinking Divergence GANs via the Perspective of Differential Equations, ICML2023, arxiv.
- (AISTATS2023), Givens, J.*, Reeve, H., Liu, S., Density Ratio Estimation and Neyman Pearson Classification with Missing Data, AISTATS2023, arxiv.
- (ACML2022), Yi, M.*, Liu, S., Sliced Wasserstein Variational Inference (Best Student Paper), ACML 2022, arxiv.
- (NeurIPS2022), Liu, S., Estimating the Arc Length of the Optimal ROC Curve and Lower Bounding the Maximal AUC, NeurIPS 2022, arxiv.
- (JMLR2022), Liu, S., Kanamori, T., Williams, D.J.*, Estimating Density Models with Truncation Boundaries using Score Matching, Journal of Machine Learning Research, 23(186):1-38, arxiv, 2022.
- (AABI2022), Simons, J.*, Liu, S., Beaumont, M., Variational Likelihood-Free Gradient Descent, 4th Symposium on Advances in Approximate Bayesian Inference, 2022.
- (AABI2022, Contributed Talk), Yi, M.*, Liu, S., Sliced Wasserstein Variational Inference, 4th Symposium on Advances in Approximate Bayesian Inference, 2022.
- (CACAIE 2021), Zhang, Y.*, Macdonald, J., Liu, S., and Harper, P., Damage Detection of Nonlinear Structures Using Probability Density Ratio Estimation, Computer-Aided Civil and Infrastructure Engineering, online version, 2021.
- (TKDE 2021), Wu, X.Z., Xu, W., Liu, S., Zhou, Z.H., Model Reuse with Reduced Kernel Mean Embedding Specification, IEEE Transactions on Knowledge and Data Engineering, doi: 10.1109/TKDE.2021.3086619, arxiv, 2021.
- (JRSSB 2021), Kim, B., Liu, S., Kolar, M., Two-sample inference for high-dimensional Markov networks, Journal of the Royal Statistical Society Series B (JRSSB), Volume 83, Issue 5, pages 939-962, arxiv, 2021.
- (AAAI 2021), Minami, S., Liu, S., Wu, S., Fukumizu, K., Yoshida, R., A General Class of Transfer Learning Regression without Implementation Cost, AAAI Conference on Artificial Intelligence, 2021.
- (NeurIPS2019), Liu, S., Kanamori, T., Jitkrittum, W., Chen, Y., Fisher Efficient Inference of Intractable Models, Advances in Neural Information Processing Systems 32, 2019.
- (ICML2019), Wu, X.Z., Liu, S., Zhou, Z.H., Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin, Proceedings of the 36th International Conference on Machine Learning, PMLR 97, 2019.
- (NC2018), Noh, Y-K., Sugiyama, M., Liu, S., du Plessis, M.C., Park, F.C., and Lee, D.D., Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence, Neural Computation, Vol. 30(7), 2018.
- (NeurIPS2017), Liu, S., Takeda, A., Suzuki, T., Fukumizu, K., Trimmed Density Ratio Estimation, Advances in Neural Information Processing Systems 30, 2017.
- (AOS2017), Liu, S., Suzuki, T., Relator, R., Sese, J., Sugiyama, M., Fukumizu, K., Support consistency of direct sparse-change learning in Markov networks, Annals of Statistics, Volume 45, Number 3, 2017.
- (ICML2016), Liu, S., Suzuki, T., Sugiyama, M., Fukumizu, K., Structure Learning of Partitioned Markov Networks, Proceedings of the 33rd International Conference on Machine Learning, 2016.
- (SDM 2016), Liu, S., Fukumizu, K., Estimating Posterior Ratio for Classification: Transfer Learning from Probabilistic Perspective, Proceedings of 2016 SIAM International Conference on Data Mining, 2016.
PhD students
- Josh Givens, COMPASS, co-supervising with Dr. Henry Reeve.
- Yakun Wang, Math PhD student.
- Sherman Khoo, COMPASS, co-supervising with Prof. Mark Beaumont.
- Luke Shannon, ProbAI, co-supervising with Dr. Katarzyna Reluga.
Graduated PhD students
- Yulong Zhang, Civil Engineering PhD student, co-supervised with Prof. John Macdonald and Dr. Paul Harper
- Mingxuan Yi, Math PhD student.
- Daniel Williams, COMPASS.
- Jack Simons, COMPASS, co-supervised with Prof. Mark Beaumont.
Teaching
- Algorithms and Programming in C(++) and R (2021-2024), School of Mathematics.
- Statistical Methods 1 (2019-2023), School of Mathematics.
- Symbols, Patterns and Signals, (2018-2019) Department of Computer Science.
Employment History
- 08/2022 – 07/2024, Senior Lecturer in Statistical Science, School of Mathematics, Uni. of Bristol
- 08/2019 – 07/2022, Lecturer in Statistical Science, School of Mathematics, Uni. of Bristol
- 09/2017 – 08/2019, Lecturer in Data Science and A.I., Department of Computer Science, Uni. of Bristol
- 04/2015 - 09/2017, Project Assistant Professor, The Institute of Statistical Mathematics, Japan.
- 04/2014 - 03/2015, Postdoctoral (JSPS Fellow), Tokyo Institute of Technology, Japan.
Education
- 03/2014, Doctor of Engineering, Tokyo Institute of Technology, Japan. Thesis: Statistical Machine Learning Approaches on Change Detection. Supervisor: Prof. Masashi Sugiyama.
- 10/2010, Master of Science with Distinction, University of Bristol, UK.
- 06/2009, Bachelor of Engineering, Soochow University, China.