About

I am a research scientist at Tongyi Lab, Alibaba Group. I obtained my Ph.D. in Artificial Intelligence from the College of Computer Science and Technology, Zhejiang University (ZJU), China, supervised by Prof. Chao Wu. I was a visiting researcher at the University of Cambridge under the supervision of Prof. Nicholas Lane. I am honored to work with Prof. Tao Lin at Westlake University/EPFL.

My main research interests lie in the following areas.

Large language models and agentic intelligence:

  • LLM agents, reasoning, and multi-agent systems.
  • Model editing and memory management for large language models (LLMs).
  • Parametric understanding of LLMs (localization, merging, scaling, pruning, stitching, unlearning, editing, etc.).
  • Text-to-model generation.

Trustworthy deep learning:

  • Privacy-preserving federated learning (FL): efficient & robust algorithm design, and understanding of generalization, personalization & training dynamics.
  • Mechanistic interpretability of neural networks: weight decay, loss landscape, permutation invariance, linear mode connectivity, etc.
  • Socio-technical issues brought by collaborative learning.
  • Responsible and trustworthy AI.

Contact

I am recruiting Research Interns at Tongyi Lab, Alibaba Group, to work on LLM agents, LLM multi-agent systems, and reinforcement learning for agentic LLMs. If you are interested, don't hesitate to contact me!

Email: zexi.li[at]zju.edu.cn / tomleeze[at]gmail.com

WeChat and Phone: (+86) 18868104540

Recent News

  • [2025.07] I am happy to share that we recently had one paper accepted by ACM MM 2025 on text-to-weight generation and one paper accepted by Machine Learning (journal) on personalized federated learning!
  • [2025.07] I joined Tongyi Lab, Alibaba Group, as a research scientist.
  • [2025.07] Our paper “Resource-Efficient Knowledge Editing for Mobile LLMs” has won the Best Poster Award at MobiUK 2025 in Edinburgh!
  • [2025.07] I was invited to serve as a Session Chair of KDD 2025.
  • [2025.06] I passed my Ph.D. defense and graduated from the College of Computer Science and Technology, Zhejiang University.
  • [2025.06] Our paper “You Are Your Own Best Teacher: Achieving Centralized-level Performance in Federated Learning under Heterogeneous and Long-tailed Data” is accepted by ICCV 2025!
  • [2025.05] I am happy to share our new preprint “Editing as Unlearning: Are Knowledge Editing Methods Strong Baselines for Large Language Model Unlearning?”! We try to build a bridge between the LLM editing and unlearning communities and find that editing methods are, to some extent, strong baselines for LLM unlearning.
  • [2025.05] Our paper titled “FedGuCci: Making Local Models More Connected in Landscape for Federated Learning” is accepted by KDD 2025!
  • [2025.04] Our paper titled “Towards Universal Personalization in Federated Learning via Collaborative Foundation Generative Models” is accepted by IEEE Transactions on Mobile Computing! The paper will be released soon.
  • [2025.03] I will serve as an Associate Chair of FedKDD 2025, the International Joint Workshop on Federated Learning for Data Mining and Graph Analytics, co-located with the 31st ACM SIGKDD Conference (KDD 2025). Submissions are welcome!
  • [2024.09] I am happy to share that WISE, our model editor for lifelong editing of large language models, is accepted to NeurIPS 2024!
  • [2024.05] Two papers are accepted by KDD 2024, and one paper is accepted by ICML 2024.

Academic Service

  • Associate Chair of FedKDD 2025, the International Joint Workshop on Federated Learning for Data Mining and Graph Analytics, co-located with the 31st ACM SIGKDD Conference (KDD 2025).
  • Session Chair of KDD 2025.
  • Invited Reviewer: TKDE, IJCV, TMM, Machine Learning, AISTATS 2024, CVPR 2024, ICML 2024/2025, NeurIPS 2024/2025, ICLR 2024/2025, KDD 2025, ICCV 2025.

Talks

  • [2025.05.30] “Science+” Platform, Zhejiang Provincial Association for Science and Technology; In the Era of Large Models, How Does AI Remember and Think?
  • [2025.04.28] Xtra Lab, National University of Singapore; Foundation Models under Model Parameter Perspective: Model Editing, Fusion, and Generation.
  • [2024.11.14] Department of Computer Science and Technology, University of Cambridge; Physics in (Federated) Deep Neural Networks and Beyond: A Parametric Perspective.