
FedRDS: Federated Learning on Non-IID Data via Regularization and Data Sharing

Yankai Lv, Haiyan Ding, Hao Wu, Yiji Zhao and Lei Zhang

Abstract

Federated learning (FL) is an emerging decentralized machine learning framework that trains a global model collaboratively on clients' private local data without transferring that data to a central server. Unlike traditional distributed optimization, FL trains the model on each client and then aggregates the results at the server. While this approach reduces communication costs, the local datasets of different clients are non-independent and identically distributed (non-IID), which can make the local models inconsistent with one another. This study proposes an FL algorithm that leverages regularization and data sharing (FedRDS). In each round of local training, the local loss function is augmented with a regularization term so that the local model gradually moves closer to the global model. However, when the gap between client data distributions is large, adding a regularization term can actually increase the degree of client drift. To address this, we combine the regularizer with a data-sharing scheme: during initialization, the server sets aside a portion of its data as a shared dataset and distributes it evenly to each client, reducing the differences among client data distributions and thereby mitigating client drift. Experimental results show that FedRDS surpasses several well-known FL methods on various image classification tasks, improving both communication efficiency and accuracy.
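To make the two mechanisms concrete, below is a minimal PyTorch sketch of a client-side round. The FedProx-style proximal term (mu/2)·||w − w_global||² is an assumption, as the abstract does not specify the exact form of FedRDS's regularizer, and the names local_update, build_client_dataset, and the coefficient mu are illustrative rather than taken from the paper.

```python
import torch
import torch.nn.functional as F
from torch.utils.data import ConcatDataset

def build_client_dataset(local_ds, shared_ds):
    """Data sharing: at initialization the server distributes a shared
    dataset to every client; each client trains on the union, which
    narrows the gap between client data distributions."""
    return ConcatDataset([local_ds, shared_ds])

def local_update(model, global_model, loader, mu=0.1, lr=0.01, epochs=1):
    """One client's local round with a proximal regularizer that pulls
    the local weights toward the current global weights (a FedProx-style
    term; the exact FedRDS regularizer is an assumption here)."""
    global_params = [p.detach().clone() for p in global_model.parameters()]
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = F.cross_entropy(model(x), y)
            # (mu / 2) * ||w - w_global||^2 keeps the local model close
            # to the global model, limiting client drift.
            prox = sum((p - g).pow(2).sum()
                       for p, g in zip(model.parameters(), global_params))
            (loss + 0.5 * mu * prox).backward()
            opt.step()
    return {k: v.detach().clone() for k, v in model.state_dict().items()}
```

On the server side, the state dicts returned by each client would then be averaged (FedAvg-style) into the next global model before the following round.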