Federated Learning coordinates multiple clients to collaboratively train a shared model while preserving data privacy. However, noisy labels in the training data held by participating clients severely harm model performance. In this paper, we propose FedCoop, a cooperative Federated Learning framework for handling noisy labels. FedCoop comprises three components and conducts robust training in two phases: data selection and model training. In the data selection phase, to mitigate the confirmation bias of a single client, the Loss Transformer estimates the probability that each sample's label is clean by cooperating with helper clients that have high data trustability and similarity. The Feature Comparator then evaluates label quality for each sample in the latent feature space, further improving the robustness of noisy label detection. In the model training phase, the Feature Matcher trains the model on both noisy and clean data in a semi-supervised manner to fully utilize the training data, and exploits global class features to increase the consistency of pseudo-labeling across clients. Experimental results show that FedCoop outperforms the baselines on various datasets under different noise settings, improving model accuracy by up to 62%, and by 27% on average, compared with the baselines.
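A common building block for the kind of per-sample clean-probability estimation the abstract describes is to fit a two-component Gaussian mixture to each sample's training loss and read off the posterior of the low-loss component. The sketch below illustrates that generic heuristic with a small EM loop; it is an assumption for illustration only and does not reproduce the actual FedCoop Loss Transformer or its helper-client cooperation.

```python
import numpy as np

def clean_probabilities(losses, n_iter=50):
    """Estimate the probability that each sample's label is clean by
    fitting a two-component 1-D Gaussian mixture to per-sample losses
    with EM. Samples drawn to the low-mean (low-loss) component are
    presumed clean. Generic noisy-label heuristic, not FedCoop itself."""
    x = np.asarray(losses, dtype=float)
    # Initialize the two components at the loss extremes.
    mu = np.array([x.min(), x.max()])
    sigma = np.array([x.std() + 1e-6] * 2)
    weight = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        dens = weight * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
        weight = nk / len(x)
    clean = int(np.argmin(mu))  # low-loss component = presumed clean
    return resp[:, clean]
```

In a federated setting, each client could run this locally on its own losses, with the resulting probabilities then refined by cooperation across clients as the abstract outlines.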