Toward Better Model Heterogeneous Federated Learning: A Benchmark and Evaluation Work

This is an evaluation work for Model Heterogeneous Federated Learning (MHFL). It also aims to provide an easy-to-use MHFL benchmark.


The code of this project is under construction.


Acknowledgment

All code implementations are based on FederatedScope v0.3.0: https://github.com/alibaba/FederatedScope

We are grateful for their outstanding work.

Currently Supported Algorithms

Basic baseline

  • LOCAL: each client performs local training only, without any federated communication.
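The LOCAL baseline can be illustrated with a minimal sketch (this is illustrative pure Python, not the repository's code; the client data and `local_train` helper are hypothetical): each client runs its own optimization loop and no model or statistic is ever exchanged.

```python
# Hypothetical sketch of the LOCAL baseline: every client trains on its own
# data and never communicates. Names below are illustrative, not the repo's API.

def local_train(data, lr=0.1, steps=100):
    """Fit a scalar model w to its client's data by gradient descent on MSE."""
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w - x) for x in data) / len(data)  # d/dw of mean squared error
        w -= lr * grad
    return w

# Each client ends up with its own model; there is no server and no aggregation.
clients = {"client_1": [1.0, 2.0, 3.0], "client_2": [10.0, 12.0]}
local_models = {cid: local_train(data) for cid, data in clients.items()}
# Each w converges to the mean of that client's own data.
```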

Methods based on knowledge distillation (with public dataset)

| Abbreviation | Title | Venue | Materials |
| --- | --- | --- | --- |
| FedMD | FedMD: Heterogenous Federated Learning via Model Distillation | NeurIPS 2019 Workshop | [pub] [repository] |
| FSFL | Few-Shot Model Agnostic Federated Learning | ACM MM 2022 | [pub] [repository] |
| FCCL | Learn from Others and Be Yourself in Heterogeneous Federated Learning | CVPR 2022 | [pub] [repository] |
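The common communication step in this family of methods can be sketched as follows (an illustrative pure-Python sketch, not the repository's implementation): each client scores a shared public dataset with its own heterogeneous model, and the server averages the per-sample logits into a consensus that every client then distills from, as in FedMD.

```python
# Illustrative sketch of public-dataset knowledge distillation (FedMD-style):
# the server averages each client's class scores on the shared public set.

def average_logits(client_logits):
    """client_logits: list of [num_samples][num_classes] score matrices,
    one matrix per client, all computed on the same public dataset."""
    n_clients = len(client_logits)
    n_samples = len(client_logits[0])
    n_classes = len(client_logits[0][0])
    return [
        [sum(cl[i][c] for cl in client_logits) / n_clients for c in range(n_classes)]
        for i in range(n_samples)
    ]

# Two clients with different architectures score the same 2-sample public set;
# the consensus is what each client distills into its local model afterwards.
logits_a = [[2.0, 0.0], [0.0, 1.0]]
logits_b = [[4.0, 2.0], [2.0, 3.0]]
consensus = average_logits([logits_a, logits_b])  # [[3.0, 1.0], [1.0, 2.0]]
```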

Methods without public dataset

| Abbreviation | Title | Venue | Materials |
| --- | --- | --- | --- |
| FML | Federated Mutual Learning | arXiv 2020 | [pub] [repository] |
| FedHeNN | Architecture Agnostic Federated Learning for Neural Networks | ICML 2022 | [pub] |
| FedProto | FedProto: Federated Prototype Learning across Heterogeneous Clients | AAAI 2022 | [pub] [repository] |
| FedPCL | Federated Learning from Pre-Trained Models: A Contrastive Learning Approach | NeurIPS 2022 | [pub] [repository] |
| FedGH | FedGH: Heterogeneous Federated Learning with Generalized Global Header | ACM MM 2023 | [pub] [repository] |
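As a representative of this family, FedProto's server-side step can be sketched in a few lines (an illustrative pure-Python sketch under simplified assumptions, not the repository's implementation): each client uploads class-wise mean embeddings ("prototypes"), and the server averages prototypes per class across the clients that observed that class.

```python
# Illustrative sketch of FedProto-style prototype aggregation: average each
# class's embedding over the clients that actually hold samples of that class.

def aggregate_prototypes(client_protos):
    """client_protos: list of {class_id: embedding_vector} dicts, one per client."""
    sums, counts = {}, {}
    for protos in client_protos:
        for cls, vec in protos.items():
            if cls not in sums:
                sums[cls] = [0.0] * len(vec)
                counts[cls] = 0
            sums[cls] = [s + v for s, v in zip(sums[cls], vec)]
            counts[cls] += 1
    return {cls: [s / counts[cls] for s in sums[cls]] for cls in sums}

# Client 1 saw classes 0 and 1; client 2 only saw class 0 (non-IID data).
global_protos = aggregate_prototypes([
    {0: [1.0, 1.0], 1: [4.0, 0.0]},
    {0: [3.0, 3.0]},
])
# global_protos -> {0: [2.0, 2.0], 1: [4.0, 0.0]}
```

Because only low-dimensional prototypes are exchanged, this step works regardless of how different the client model architectures are.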

Models & Dataset

Model setting

We test whether the above methods remain effective under different degrees of model heterogeneity. To this end, we conduct experiments under the following two settings.

  1. Low model heterogeneity: five CNN models that differ slightly in the number of channels and layers.
  2. High model heterogeneity: five substantially different architectures, including an MLP, a CNN, a ResNet, and a Wide ResNet.

For details of the model architectures, please refer to the model setting files and the model definition files.
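Per-client model assignment is driven by a client configuration file passed via `--client_cfg`. A hypothetical fragment in FederatedScope's per-client override style might look like the following (the key names and model types here are illustrative only; consult the repo's `model_settings` files for the actual values):

```yaml
# Hypothetical client_cfg fragment: each client overrides the global model.
client_1:
  model:
    type: convnet2
    hidden: 256
client_2:
  model:
    type: convnet4
    hidden: 512
```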

Dataset

Currently, we conduct experiments on three benchmark datasets: CIFAR-10, SVHN, and Office-10.

Quick Start

Step 1. Install FederatedScope

Users need to clone the source code and install FederatedScope (we suggest Python >= 3.9).

  • Clone the source code:

```bash
git clone https://github.com/zza234s/MHFL
cd MHFL
```
  • Install the required packages:

```bash
conda create -n fs python=3.9
conda activate fs

# install pytorch
conda install -y pytorch=1.10.1 torchvision=0.11.2 torchaudio=0.10.1 torchtext=0.11.1 cudatoolkit=11.3 -c pytorch -c conda-forge

# install some extra dependencies
conda install -y pyg -c pyg
conda install -y nltk
pip install rdkit
pip install ipdb
pip install kornia
pip install timm
```
  • After the required packages are installed, you can install FederatedScope from source:

```bash
pip install -e .[dev]
```

Step 2. Run an Algorithm (Taking FedProto as an Example)

  • Enter the "federatedscope" folder:

```bash
cd federatedscope
```

  • Run the script:

```bash
python main.py --cfg model_heterogeneity/methods/FedProto/FedProto_on_cifar10.yaml --client_cfg model_heterogeneity/model_settings/model_setting_CV_low_heterogeneity.yaml
```

PS

Please feel free to contact me.

My email is: hanlinzhou@zjut.edu.cn

My WeChat is: poipoipoi8886
