About Me

Welcome! I am a final-year Ph.D. student in the Department of Computer Science at Emory University, where I am fortunate to be advised by Dr. Liang Zhao. Before that, I received my master's degree in Statistics from George Washington University in 2020 and my bachelor's degree in Mathematics from the School of Mathematical Sciences at Fudan University in Shanghai, China, in 2018. I have also worked as a research intern at Argonne National Laboratory and NEC Labs America.

News: I am actively looking for Machine Learning Engineer / Applied Scientist / Research Scientist roles starting anytime in 2025. Feel free to reach out or DM me!

Research Interests

I am passionate about designing efficient and generalizable learning algorithms with applications across diverse domains. My research focuses on three primary areas:

  1. Efficient Large-Scale Machine Learning: Developing methods to optimize large-scale models, particularly focusing on model compression, inference optimization, and distributed training for deep learning systems, including LLMs.
  2. Domain and Knowledge Transfer: Enhancing the adaptability of machine learning models across domains and tasks, including multi-task learning, domain adaptation, and domain generalization, with a particular focus on data exhibiting temporal concept drift.
  3. Neuro-Inspired Continual Learning: Designing lifelong learning algorithms inspired by neuroscience, incorporating memory-replay mechanisms for efficiency and robustness.

Selected Projects

1. Efficient Large-Scale Machine Learning

Exploring scalable and efficient solutions for machine learning systems, with a focus on LLM inference optimization.

a) Inference Optimization of LLMs

b) Distributed Training for Graph Neural Networks


2. Domain and Knowledge Transfer

Enhancing the adaptability and effectiveness of machine learning models across domains and tasks, with a focus on temporal domain generalization and multi-task learning.

a) Domain Adaptation/Generalization

b) Multi-Task Learning


3. Neuro-Inspired Continual Learning

Developing lifelong learning approaches that combine efficient memory-replay mechanisms with neuroscience-inspired design.

Services and Awards

  • PC member for KDD, ICML, ICLR, AISTATS, NeurIPS, AAAI, ICDM, etc.
  • Primary writer for the NSF NAIRR 240189 grant ($15k) on parallel and distributed training of LLMs on graphs.
  • Student travel awards: KDD ’22, ICLR ’23, SDM ’23, CIKM ’23, NeurIPS ’24