Mingjia (Samuel Jayden) Shi’s Homepage.
This is the homepage (academic CV) of Mingjia (Samuel Jayden) Shi, 石明佳 in Chinese: an ENFP who shouldn’t say too much, but especially wants to. If the photo in the sidebar seems too casual, I also have a formal version. Please focus on the content below rather than this blurb! Feel free to contact me for collaboration, discussion, or just to say hi!
👨🎓 Biography
- This is my 2nd year (as of 2024) as an intern student in the NUS HPC-Lab, and I enjoy the challenges and interesting topics here (e.g., Resource Preserving and Efficiency).
- From Sep. 2021 to Jun. 2024, I completed my master’s degree in Artificial Intelligence (Data Science) at Sichuan University, where I had also completed my four-year bachelor’s degree, supervised by Prof. Jiancheng Lv. My main research focus at Sichuan University was federated learning (i.e., decentralized data analyses and large systems).
🎉 Latest News
- [Jan. 25] My Fall 2025 PhD placement and gap-year internship are decided. Interesting collaborations are still welcome.
Old ones before 2025
[Dec. 24] Waiting for Fall 2025 PhD results and arranging projects for my gap year.
[Aug. 24] Actively applying for a Fall 2025 PhD! If you are interested in a student familiar with theoretical analysis, feel free to email me!
👣 Current Career
During my research career, as an author and a reviewer for top conferences and journals, I have come to appreciate what fascinates me and what I want to do, so I am pursuing a PhD. I am currently busy with my visa.
🔍 Research Interests
My research focuses on Resource Preserving and …
- 🖐️ Works in hand. My work covers both theoretical analyses and corresponding methods for Resource Preserving and Efficiency on Trends.
- 🎓 Overall background. A knowledge and research background in math + CS, system/control/information theories, deep learning theories, and optimization and generalization analyses.
- 🌟 Broader interests. I have a continuing interest in technical research as well as basic science. Physics and the other sciences are always beautiful.
📄 Selected Publications
📅 2025:
CVPR 2025 A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training. K. Wang*, M. Shi*, Y. Zhou, Z. Li, Z. Yuan, Y. Shang, X. Peng, H. Zhang, Y. You (paper, code)
CVPR 2025 Ferret: An Efficient Online Continual Learning Framework under Varying Memory Constraints. Y. Zhou, Y. Tian, J. Lv, M. Shi, Y. Li, Q. Ye, S. Zhang, J. Lv (paper and code to be released)
TNNLS E-3SFC: Communication-Efficient Federated Learning With Double-Way Features Synthesizing. Y. Zhou, Y. Tian, M. Shi, Y. Li, Y. Sun, Q. Ye, J. Lv (paper, code to be released)
📅 Early Selected:
Released Pre-Print
- Arxiv Faster Vision Mamba is Rebuilt in Minutes via Merged Token Re-training. M. Shi*, Y. Zhou*, R. Yu, Z. Li, Z. Liang, X. Zhao, X. Peng, T. Rajpurohit, R. Vedantam, W. Zhao, K. Wang, Y. You (paper, code, page)
- Arxiv Tackling Feature-Classifier Mismatch in Federated Learning via Prompt-Driven Feature Transformation. X. Wu, J. Niu, X. Liu, M. Shi, G. Zhu, S. Tang (paper)
- Arxiv GSQ-Tuning: Group-Shared Exponents Integer in Fully Quantized Training for LLMs On-Device Fine-tuning. S. Zhou, S. Wang, Z. Yuan*, M. Shi, Y. Shang, D. Yang (paper)
Conference
- ICASSP 2024 Federated CINN Clustering for Accurate Clustered Federated Learning. Y. Zhou, M. Shi, Y. Tian, Y. Li, Q. Ye, J. Lv (paper)
- NeurIPS 2023 PRIOR: Personalized Prior for Reactivating the Information Overlooked in Federated Learning. M. Shi, Y. Zhou, K. Wang, H. Zhang, S. Huang, Q. Ye, J. Lv (paper, code)
- ICONIP 2023 Unconstrained Feature Model and Its General Geometric Patterns in Federated Learning: Local Subspace Minority Collapse. M. Shi, Y. Zhou, Q. Ye, J. Lv (paper)
- ICCV 2023 Communication-efficient Federated Learning with Single-Step Synthetic Features Compressor for Faster Convergence. Y. Zhou, M. Shi, Y. Li, Y. Sun, Q. Ye, J. Lv (paper)
Journal
- InfoSci DeFTA: A Plug-and-Play Peer-to-Peer Decentralized Federated Learning Framework. Y. Zhou, M. Shi, Y. Tian, Q. Ye, J. Lv (paper)
- Trans.ETCI DLB: a dynamic load balance strategy for distributed training of deep neural networks. Q. Ye, Y. Zhou, M. Shi, Y. Sun, J. Lv (paper)
- JoSc FLSGD: free local SGD with parallel synchronization. Q. Ye, Y. Zhou, M. Shi, J. Lv (paper)
Please see the full list on Google Scholar.
🛎 Services
Reviewer: NeurIPS, ICLR, ICML, CVPR, ICCV, MM, ICPP, ICASSP, ICONIP; TNNLS, TCSVT, TPDS, TKDE, Neurocomputing, and others.
🎈 Hobbies
📷 Photography, ⛺ Travel, 🎵 Music, 🏸 Badminton, ⚽ Football (Soccer), ♟️ Chess and other tabletop games, 🎱 Billiards, 🎮 Games, 🔬 Studying interesting topics.