Paper Title
FedTrees: A Novel Computation-Communication Efficient Federated Learning Framework Investigated in Smart Grids
Paper Authors
Paper Abstract
Smart energy performance monitoring and optimisation at the supplier and consumer levels are essential to realising smart cities. To implement a more sustainable energy management plan, it is crucial to produce better energy forecasts. Next-generation smart meters can also measure, record, and report energy consumption data, which can be used to train machine learning (ML) models for predicting energy needs. However, sharing fine-grained energy data and performing centralised learning may compromise users' privacy and leave them vulnerable to several attacks. This study addresses this issue by utilising federated learning (FL), an emerging technique that performs ML model training at the user level, where the data resides. We introduce FedTrees, a new, lightweight FL framework that benefits from the outstanding features of ensemble learning. Furthermore, we develop a delta-based early stopping algorithm to monitor FL training and stop it when further rounds bring no meaningful improvement. The simulation results demonstrate that FedTrees outperforms the most popular federated averaging (FedAvg) framework and the baseline persistence model in providing accurate energy forecasting patterns, while taking only 2% of the computation time and 13% of the communication rounds compared to FedAvg, saving considerable computation and communication resources.
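The abstract's delta-based early stopping can be illustrated with a minimal sketch: track a validation metric (lower is better, e.g. a forecasting error) across FL rounds and halt training once the round-over-round improvement (the delta) stays below a threshold for several consecutive rounds. The threshold, patience window, and class name below are illustrative assumptions, not details taken from the paper.

```python
class DeltaEarlyStopper:
    """Stop FL training when the improvement (delta) in a validation
    metric stays below `delta_threshold` for `patience` rounds.

    Hypothetical sketch; the paper's actual algorithm and parameter
    values may differ.
    """

    def __init__(self, delta_threshold=1e-3, patience=3):
        self.delta_threshold = delta_threshold
        self.patience = patience
        self.best = None          # best (lowest) metric seen so far
        self.stale_rounds = 0     # consecutive rounds without enough improvement

    def update(self, metric):
        """Report the metric after a communication round.

        Returns True when training should stop.
        """
        if self.best is None or self.best - metric > self.delta_threshold:
            # Meaningful improvement: record it and reset the counter.
            self.best = metric
            self.stale_rounds = 0
        else:
            # Improvement below the delta threshold: count as stale.
            self.stale_rounds += 1
        return self.stale_rounds >= self.patience
```

In use, the FL server would call `update()` once per communication round and break out of the training loop when it returns `True`, which is how such a monitor would save the communication rounds the abstract describes.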