This repository has been archived by the owner on Nov 28, 2024. It is now read-only.

Enhancing the Flower framework with isolated, client-side training and streamlined checkpoint management for secure and efficient federated learning.


🌸 Flower-FL-Local-Training-Isolation-Framework 🌸

Welcome to AIerLab's Federated Learning repository! We are a group of geeks committed to making AI safe and friendly for everyone. This setup integrates complex training methodologies into the Flower framework seamlessly, paving the way for the next generation of machine learning solutions without touching the native Flower framework logic.

License: Apache-2.0 PRs Welcome Python 3.6+ Flower Framework Join our Discord AIerLab

💻 Setting Up Your Development Environment

Before you dive into our repository, ensure your system is equipped with the necessary frameworks to facilitate a seamless experience:

1. 🌺 Flower Framework Installation

Set up the Flower framework as the foundational element for federated learning by executing the following command in your terminal:

python -m pip install flwr 

2. 🕯️ PyTorch Installation

Next, install PyTorch, the deep learning library this project builds on. Follow the official guidelines for a smooth installation:

python -m pip install torch torchvision
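Before moving on, it can help to confirm both packages are actually importable. The sketch below is our own convenience helper (the name `check_installed` is not part of the project, and `importlib.metadata` requires Python 3.8+):

```python
from importlib.metadata import version, PackageNotFoundError


def check_installed(packages):
    """Return a mapping of package name -> installed version (None if missing)."""
    found = {}
    for name in packages:
        try:
            found[name] = version(name)
        except PackageNotFoundError:
            found[name] = None
    return found


if __name__ == "__main__":
    for name, ver in check_installed(["flwr", "torch", "torchvision"]).items():
        print(f"{name}: {ver or 'NOT INSTALLED'}")
```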

🚀 Running the Simulation

Our approach separates the model training process from the Flower framework, so complex training methodologies can be integrated without disrupting it. Here's your guide to executing the simulation:

1. 🖥️ Server Startup

Navigate to the project directory and initialize the server with this command:

cd path/to/the/folder
python server_main.py

2. 📱 Client Main Activation

In separate terminals, start the client instances, assigning each its own rank:

cd path/to/the/folder
python client_main.py --rank 1
cd path/to/the/folder
python client_main.py --rank 2

3. 🏭 Model Main Activation

In two more terminals, launch the model training processes with matching ranks:

cd path/to/the/folder
python model_main.py --rank 1
cd path/to/the/folder
python model_main.py --rank 2

This standard simulation setup runs five concurrent terminals: one for server_main.py, two for client_main.py, and two for model_main.py.
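The five-terminal setup above can also be scripted. Here is a minimal sketch; the script names and `--rank` flag come from the commands above, while the launcher itself (and the helper name `build_commands`) is our own assumption, not part of the project:

```python
import subprocess
import sys


def build_commands(num_clients=2):
    """Build the command lines for one server plus paired client/model processes."""
    cmds = [[sys.executable, "server_main.py"]]
    for rank in range(1, num_clients + 1):
        cmds.append([sys.executable, "client_main.py", "--rank", str(rank)])
        cmds.append([sys.executable, "model_main.py", "--rank", str(rank)])
    return cmds


if __name__ == "__main__":
    # Launch all five processes concurrently, then wait for them to finish.
    procs = [subprocess.Popen(cmd) for cmd in build_commands()]
    for p in procs:
        p.wait()
```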

💡 Explore the wealth of configuration options with the -h or --help command-line arguments.

🌈 Features and Advancements

Dive into our enriched repository that offers:

  • 🎓 Detached Training Process: Training runs outside the Flower framework, so intricate training schemes can operate without interfering with the core federated-learning logic.

  • 📦 Localized State Dictionary Communication: Processes exchange model weights through local state dictionaries, making it straightforward to plug in sophisticated training techniques.

  • 💾 Checkpoint Management: Checkpoints are managed efficiently on both the server and client side, making the learning protocol resilient to interruptions.

  • 📈 Adaptable Aggregation Functionality: Aggregation functions can be customized for your split-learning requirements, tailoring the workflow to specific project needs.
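The state-dictionary hand-off behind the bullets above can be illustrated with a torch-free sketch. The class, file layout, and names below are our own assumptions for illustration; the real project presumably serializes PyTorch model state dicts instead of plain dictionaries:

```python
import pickle
from pathlib import Path


class CheckpointManager:
    """Persist and restore state dictionaries for a given rank.

    Writing the latest state dict to disk lets the training process and the
    Flower client exchange weights without sharing an in-process object.
    """

    def __init__(self, ckpt_dir, rank):
        self.dir = Path(ckpt_dir)
        self.dir.mkdir(parents=True, exist_ok=True)
        self.rank = rank

    def _path(self, round_num):
        return self.dir / f"rank{self.rank}_round{round_num}.pkl"

    def save(self, state_dict, round_num):
        """Serialize a state dict for this rank and round."""
        with open(self._path(round_num), "wb") as f:
            pickle.dump(state_dict, f)

    def load(self, round_num):
        """Restore the state dict saved for this rank and round."""
        with open(self._path(round_num), "rb") as f:
            return pickle.load(f)

    def latest_round(self):
        """Return the most recent round with a checkpoint, or None."""
        rounds = [
            int(p.stem.split("round")[-1])
            for p in self.dir.glob(f"rank{self.rank}_round*.pkl")
        ]
        return max(rounds) if rounds else None
```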

We're continually enhancing our repository, so stay tuned for new, dynamic features!


Join us in this endeavor as we aim to redefine federated learning protocols, fostering an environment where complexity meets simplicity. Become part of this journey and connect with fellow enthusiasts in our AIerLab Discord community. Thank you for being here!
