Federated learning (FL) is crucial for ensuring data privacy, a major concern in many applications. However, FL faces significant challenges due to data and model heterogeneity arising from diverse learning environments and the varying capabilities of participating entities. Most existing methods concentrate on aggregating knowledge represented by models, logits, or features, which rely on specific assumptions that may not hold in real-world scenarios and thus fail to address data and model heterogeneity simultaneously. In this work, we tackle heterogeneity from both the model and the data perspective while maintaining efficiency. To this end, we leverage locally encoded latent prototypes, produced from a local knowledge memory bank, to represent per-client knowledge updates; these prototypes are aggregated on the server and transferred back to the clients, where they are decoded and integrated as global constraints for further local training. To accommodate heterogeneous model architectures, we design the knowledge encoder and decoder to be architecture-agnostic, and we ensure robust prototype aggregation by aligning latent spaces to a common prior distribution, which improves compatibility under diverse data distributions. We evaluate our method on multiple benchmarks and demonstrate its superior accuracy and effectiveness under various heterogeneous settings.
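The client-encode / server-aggregate / client-decode flow described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear encoders, the per-class prototype averaging, and the unit-norm projection (standing in for alignment to a common prior distribution) are all simplifying assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(features_by_class, W):
    """Hypothetical client-side knowledge encoder: project per-class features
    into a shared latent space and emit one prototype per class.
    Unit-norm projection is a crude stand-in for aligning latents to a
    common prior distribution."""
    protos = []
    for f in features_by_class:          # f: (n_samples, d_client)
        z = f @ W                        # map into the common latent space
        p = z.mean(axis=0)               # latent prototype for this class
        protos.append(p / (np.linalg.norm(p) + 1e-8))
    return np.stack(protos)              # (n_classes, d_latent)

def aggregate(client_protos):
    """Server-side aggregation: average client prototypes per class
    in the shared latent space."""
    return np.mean(client_protos, axis=0)

# Two clients with different feature dimensions (model heterogeneity);
# each uses its own encoder mapping into a common 4-d latent space.
d_latent, n_classes = 4, 3
clients = []
for d_client in (8, 12):
    feats = [rng.normal(size=(16, d_client)) for _ in range(n_classes)]
    W = rng.normal(size=(d_client, d_latent))
    clients.append((feats, W))

client_protos = np.stack([encode(f, W) for f, W in clients])
global_protos = aggregate(client_protos)   # (n_classes, d_latent)
# The global prototypes would then be sent back to each client and used
# as constraints (e.g. an L2 pull on local latents) during local training.
```

In this sketch the constraint step is only indicated in the final comment; in practice each client would add a regularization term penalizing the distance between its local latent prototypes and the broadcast global ones.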
