Volume 6, Issue 2
Review of Mathematical Optimization in Federated Learning

Shusen Yang, Fangyuan Zhao, Zihao Zhou, Liang Shi, Xuebin Ren & Zongben Xu

CSIAM Trans. Appl. Math., 6 (2025), pp. 207-249.

Published online: 2025-05

  • Abstract

Federated learning (FL) has become a popular interdisciplinary research area spanning applied mathematics and information sciences. Mathematically, FL aims to collaboratively optimize aggregate objective functions over distributed datasets while satisfying a variety of privacy and system constraints. Unlike conventional distributed optimization, FL must address several specific issues (e.g. non-i.i.d. data and differentially private noise), which pose new challenges in problem formulation, algorithm design, and convergence analysis. In this paper, we systematically review existing FL optimization research, including assumptions, formulations, methods, and theoretical results. Potential future directions are also discussed.
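
For reference, a minimal sketch (not taken verbatim from the paper) of the aggregate objective the abstract alludes to: with K clients, where client k holds n_k local samples and F_k denotes its local empirical risk, the global problem is typically written as

\min_{w \in \mathbb{R}^d} F(w) \;=\; \sum_{k=1}^{K} \frac{n_k}{n}\, F_k(w), \qquad F_k(w) \;=\; \frac{1}{n_k} \sum_{i=1}^{n_k} \ell(w; x_{k,i}), \qquad n = \sum_{k=1}^{K} n_k.

Here the symbols K, n_k, F_k, \ell, and x_{k,i} are illustrative and need not match the paper's notation; non-i.i.d. data means the local distributions underlying the F_k differ across clients, and differentially private noise enters as perturbations of the locally computed updates.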

  • AMS Subject Headings

90C26, 90C31, 68W15, 68T05

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
@Article{CSIAM-AM-6-207,
  author  = {Yang, Shusen and Zhao, Fangyuan and Zhou, Zihao and Shi, Liang and Ren, Xuebin and Xu, Zongben},
  title   = {Review of Mathematical Optimization in Federated Learning},
  journal = {CSIAM Transactions on Applied Mathematics},
  year    = {2025},
  volume  = {6},
  number  = {2},
  pages   = {207--249},
  issn    = {2708-0579},
  doi     = {10.4208/csiam-am.SO-2024-0023},
  url     = {http://global-sci.org/intro/article_detail/csiam-am/24085.html}
}

  • RIS

TY  - JOUR
T1  - Review of Mathematical Optimization in Federated Learning
AU  - Yang, Shusen
AU  - Zhao, Fangyuan
AU  - Zhou, Zihao
AU  - Shi, Liang
AU  - Ren, Xuebin
AU  - Xu, Zongben
JO  - CSIAM Transactions on Applied Mathematics
VL  - 6
IS  - 2
SP  - 207
EP  - 249
PY  - 2025
DA  - 2025/05
SN  - 2708-0579
DO  - 10.4208/csiam-am.SO-2024-0023
UR  - https://global-sci.org/intro/article_detail/csiam-am/24085.html
KW  - Federated learning, distributed optimization, convergence analysis, error bounds
ER  -

  • TXT

Yang, Shusen, Zhao, Fangyuan, Zhou, Zihao, Shi, Liang, Ren, Xuebin and Xu, Zongben. (2025). Review of Mathematical Optimization in Federated Learning. CSIAM Transactions on Applied Mathematics. 6 (2). 207-249. doi:10.4208/csiam-am.SO-2024-0023