About Me
Welcome! I am a final-year Ph.D. student in the Department of Computer Science at Emory University, where I am fortunate to be advised by Dr. Liang Zhao. Previously, I received my master’s degree in Statistics from George Washington University in 2020 and my bachelor’s degree in Mathematics from the School of Mathematical Sciences, Fudan University, Shanghai, China, in 2018. I have also worked as a research intern at Argonne National Laboratory and NEC Laboratories America.
Research Interests
I am interested in designing efficient and generalizable learning algorithms. Specifically, my current research topics include, but are not limited to:
1. Learning algorithms for knowledge/domain transfer, such as multi-task learning (MTL), domain adaptation (DA), and domain generalization (DG).
2. Large-scale machine learning with enhanced efficiency, such as model compression & acceleration of LLMs and distributed training for deep neural networks.
3. Online learning, such as continual/lifelong learning with efficient memory replay and neuro-inspired mechanisms.
Selected Projects
1. Domain and Knowledge Transfer
Enhancing the adaptability and effectiveness of machine learning models across domains and tasks.
a) Multi-Task Learning
b) Domain Adaptation/Generalization
- Prompt-Based Domain Discrimination for Multi-source Time Series Domain Adaptation. KDD 2024
- Temporal Domain Generalization with Drift-Aware Dynamic Neural Networks. ICLR 2023 (Oral)
2. Efficient Large-Scale Machine Learning
Exploring scalable and resource-efficient solutions for training and deploying large machine learning models.
a) Model Compression & Acceleration of LLMs
- Beyond Efficiency: A Systematic Survey of Resource-Efficient Large Language Models. Preprint
- SparseLLM: Towards Global Pruning for Pre-trained Language Models. Preprint
b) Distributed Training for Deep Neural Networks
3. Neuro-Inspired Continual Learning
Focusing on memory-replay and neuro-inspired approaches for continual learning.
- Saliency-Guided Hidden Associative Replay for Continual Learning. AMHN Workshop @ NeurIPS 2023
- Saliency-Augmented Memory Completion for Continual Learning. SDM 2023
Services and Awards
- PC member for AISTATS (’23, ’24), NeurIPS (’22, ’23), ICLR (’24), AAAI (’24)
- Reviewer for KDD, ICML, ICLR, ICDM
- 2023 SDM student travel award
- 2022 CIKM student travel award
- 2022 KDD student travel award