Comput Commun ; 206: 101-109, 2023 Jun 01.
Artículo en Inglés | MEDLINE | ID: mdl-37197298

ABSTRACT

Federated learning is a machine learning method that can break down data islands. Its inherent privacy-preserving property plays an important role in training medical image models. However, federated learning requires frequent communication, which incurs high communication costs. Moreover, data is heterogeneous across clients because of differing user preferences, which can degrade model performance. To address this statistical heterogeneity, we propose FedUC, an algorithm that controls the updates uploaded in federated learning: clients are scheduled on the basis of weight divergence, update increment, and loss. We also balance the clients' local data through image augmentation to mitigate the impact of non-independent and identically distributed (non-IID) data. The server assigns compression thresholds to the clients, based on the weight divergence and update increment of their models, for gradient compression that reduces wireless communication costs. Finally, based on weight divergence, update increment, and accuracy, the server dynamically assigns weights to the model parameters during aggregation. Simulations on a publicly available chest disease dataset containing COVID-19 cases compare the proposed approach with existing federated learning methods. Experimental results show that our strategy achieves better training performance, improving model accuracy while reducing wireless communication costs.
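The abstract's server-side steps can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the magnitude-threshold compression rule, and the inverse-divergence aggregation weights are all assumptions for illustration (the paper's actual rules also use update increment, loss, and accuracy, which the abstract does not specify).

```python
import math

def compress_update(update, threshold):
    """Zero out update entries whose magnitude falls below the
    server-assigned threshold (a simple sparsification-style
    compression; assumed, not the paper's exact scheme)."""
    return [u if abs(u) >= threshold else 0.0 for u in update]

def weight_divergence(client_weights, global_weights):
    """L2 distance between a client's model and the global model."""
    return math.sqrt(sum((c - g) ** 2
                         for c, g in zip(client_weights, global_weights)))

def aggregate(global_weights, client_updates, divergences):
    """Average client updates, down-weighting clients whose models have
    drifted further from the global model -- one plausible reading of
    'dynamically assigns weights ... based on weight divergence'."""
    inv = [1.0 / (d + 1e-8) for d in divergences]
    total = sum(inv)
    coeffs = [i / total for i in inv]
    return [g + sum(c * u[k] for c, u in zip(coeffs, client_updates))
            for k, g in enumerate(global_weights)]

# Toy round: 3 clients, a 4-parameter model, global model at the origin.
g = [0.0] * 4
updates = [[0.5, 0.01, -0.3, 0.002],
           [0.1, 0.2, -0.1, 0.05],
           [0.4, -0.02, 0.3, 0.01]]
compressed = [compress_update(u, threshold=0.05) for u in updates]
divs = [weight_divergence([gi + ui for gi, ui in zip(g, u)], g)
        for u in updates]
new_global = aggregate(g, compressed, divs)
print(new_global)
```

With the 0.05 threshold, the near-zero entries of each client's update are dropped before upload, so only the larger coordinates cost bandwidth; aggregation then favors the client closest to the global model.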
