Publications

You can also find my articles on my Google Scholar profile.

Photon: Federated LLM Pre-Training

Published in arXiv, 2024

We present Photon, the first system for fully federated pre-training of large language models.

Recommended citation: Lorenzo Sani, Alex Iacob, Zeyu Cao, Royson Lee, Bill Marino, Yan Gao, Dongqi Cai, Zexi Li, Wanru Zhao, Xinchi Qiu, & Nicholas D. Lane. (2024). Photon: Federated LLM Pre-Training.
Download Paper

DEPT: Decoupled Embeddings for Pre-training Language Models

Published in arXiv, 2024

This work proposes a novel pre-training method for language models that decouples the embeddings from the rest of the model.

Recommended citation: Alex Iacob, Lorenzo Sani, Meghdad Kurmanji, William F. Shen, Xinchi Qiu, Dongqi Cai, Yan Gao, & Nicholas D. Lane. (2024). DEPT: Decoupled Embeddings for Pre-training Language Models.
Download Paper

Sheaf HyperNetworks for Personalized Federated Learning

Published in arXiv, 2024

Sheaf HyperNetworks is a novel hypernetwork-based methodology that improves performance in personalized federated learning settings.

Recommended citation: Bao Nguyen, Lorenzo Sani, Xinchi Qiu, Pietro Liò, & Nicholas D. Lane. (2024). Sheaf HyperNetworks for Personalized Federated Learning.
Download Paper

Flower: A Friendly Federated Learning Research Framework

Published in arXiv, 2022

This paper presents Flower, the federated learning research framework developed by the MLSys group at the University of Cambridge.

Recommended citation: Daniel J. Beutel, Taner Topal, Akhil Mathur, Xinchi Qiu, Javier Fernandez-Marques, Yan Gao, Lorenzo Sani, Kwing Hei Li, Titouan Parcollet, Pedro Porto Buarque de Gusmão, & Nicholas D. Lane. (2020). Flower: A Friendly Federated Learning Research Framework. arXiv preprint arXiv:2007.14390.
Download Paper