Paper Title


Depersonalized Federated Learning: Tackling Statistical Heterogeneity by Alternating Stochastic Gradient Descent

Authors

Yujie Zhou, Zhidu Li, Tong Tang, Ruyan Wang

Abstract

Federated learning (FL), which has gained increasing attention recently, enables distributed devices to cooperatively train a common machine learning (ML) model for intelligent inference without data sharing. However, problems in practical networks, such as non-independent-and-identically-distributed (non-IID) raw data and limited bandwidth, give rise to slow and unstable convergence of the FL training process. To address these issues, we propose a new FL method that can significantly mitigate statistical heterogeneity through a depersonalization mechanism. In particular, we decouple the global and local optimization objectives by alternating stochastic gradient descent, thereby reducing the variance accumulated during local update phases and accelerating FL convergence. We then analyze the proposed method in detail and show that it converges at a sublinear rate in the general non-convex setting. Finally, numerical experiments on public datasets verify the effectiveness of the proposed method.
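To make the idea concrete, the sketch below simulates an FL round in which each client's local update alternates a step on its own (non-IID) objective with a step that pulls the iterate back toward the global model, before the server averages the results. This is a generic, assumption-laden illustration of alternating global/local steps (the proximal-style correction, learning rates, and client data model are all placeholders, not the paper's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-IID linear-regression clients: each client draws features
# from a shifted distribution and uses a perturbed ground-truth weight,
# emulating statistical heterogeneity.
w_true = np.array([2.0, -1.0])
clients = []
for k in range(4):
    X = rng.normal(size=(50, 2)) + k          # feature shift per client
    y = X @ (w_true + 0.3 * rng.normal(size=2))
    clients.append((X, y))

def grad(w, X, y):
    """Gradient of the mean-squared-error loss for one client."""
    return 2 * X.T @ (X @ w - y) / len(y)

w_global = np.zeros(2)
lr, mu = 0.01, 0.1                            # hypothetical step sizes
for _ in range(50):                           # communication rounds
    updates = []
    for X, y in clients:
        w = w_global.copy()
        for _ in range(5):                    # local epochs
            # Alternating steps: one on the local objective, one that
            # counteracts client drift by pulling toward the global
            # model (a proximal-style stand-in for "depersonalization").
            w -= lr * grad(w, X, y)
            w -= lr * mu * (w - w_global)
        updates.append(w)
    w_global = np.mean(updates, axis=0)       # FedAvg-style aggregation

print(w_global)
```

The second local step damps the variance that accumulates when heterogeneous clients run many local epochs, which is the failure mode the abstract attributes to non-IID data.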
