Federated learning (FL) trains models across multiple devices while keeping data localized, addressing privacy and efficiency concerns. It faces challenges such as data heterogeneity, where diverse local data can hinder the performance of a global model, and high computational costs on resource-limited devices such as smartphones or IoT sensors. These constraints can prolong training times, increase energy use, and hasten device wear. To address these issues, this paper introduces FedDyS, a dynamic sample selection technique that reduces computational demands and mitigates data heterogeneity by eliminating non-essential training samples. This not only shrinks the training set size on local devices but also enhances data diversity, enabling more efficient training while preserving data privacy. Additionally, FedDyS prevents catastrophic forgetting, a common challenge in FL. Our experiments show that FedDyS surpasses traditional FL methods in accuracy and convergence speed while using less than 15% of the usual number of samples, making it well suited to low-resource settings. The code of FedDyS is publicly available: https://github.com/ensiyeKiya/FedDyS
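The abstract describes per-client dynamic sample selection but does not spell out the selection criterion here. The sketch below is illustrative only and is not the FedDyS algorithm: it assumes, hypothetically, that "non-essential" samples are those the current local model already fits with low loss, and keeps only the highest-loss fraction of each client's data (the 15% keep fraction is borrowed from the abstract's headline figure purely for illustration).

```python
# Illustrative sketch of dynamic sample selection inside one FL client round.
# The selection rule (per-sample loss ranking) and all names here are assumptions,
# not the published FedDyS method.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset, TensorDataset


def select_dynamic_subset(model, dataset, keep_fraction=0.15, device="cpu"):
    """Rank samples by per-sample loss and keep the hardest `keep_fraction`."""
    model.eval()
    loss_fn = nn.CrossEntropyLoss(reduction="none")
    losses = []
    loader = DataLoader(dataset, batch_size=256, shuffle=False)
    with torch.no_grad():
        for x, y in loader:
            out = model(x.to(device))
            losses.append(loss_fn(out, y.to(device)).cpu())
    losses = torch.cat(losses)
    k = max(1, int(keep_fraction * len(dataset)))
    keep_idx = torch.topk(losses, k).indices.tolist()
    return Subset(dataset, keep_idx)


def local_update(model, dataset, epochs=1, lr=0.01, device="cpu"):
    """One client round: select a reduced training set, then train only on it."""
    subset = select_dynamic_subset(model, dataset, device=device)
    loader = DataLoader(subset, batch_size=32, shuffle=True)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x.to(device)), y.to(device))
            loss.backward()
            opt.step()
    return model.state_dict()


# Toy usage: a linear model on random data standing in for one client's local data.
if __name__ == "__main__":
    torch.manual_seed(0)
    data = TensorDataset(torch.randn(1000, 20), torch.randint(0, 5, (1000,)))
    client_model = nn.Linear(20, 5)
    updated_weights = local_update(client_model, data)
```

In a full FL setup, the server would aggregate the returned state dicts (e.g., FedAvg-style) across clients each round; only the selection step is sketched here.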

Original publication

DOI: 10.1109/ISCC61673.2024.10733603

Type:

Publication Date: 01/01/2024