Welcome to Fedle, an open-source Python library for federated learning. With deep-learning models now widespread in industry and academia, developers are increasingly frustrated by long, unwanted training times. Traditional models such as CNNs, KNNs, and MLPs on large datasets like CIFAR-10 often require lengthy training and many rounds of tuning before accuracy is acceptable. Waiting for a model to become usable can be frustrating and inefficient.
We've built an easy-to-use Python library with a central server and 10 clients connected to the network. You choose which model to train from the available options, and which dataset the clients train it on. Each client runs a complete epoch over the training set and sends the resulting parameters to the central server. At the end of each training round, the server aggregates the parameters shared by the clients using a simple federated averaging algorithm and sends the updated model back to the clients for the next round of training. Accuracy and loss are computed at the end of every round to show accuracy rising and loss falling as training progresses.
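To make the aggregation step concrete, here is a minimal sketch of federated averaging. It is illustrative only: the function and variable names (`average_parameters`, `client_params`) are assumptions, not Fedle's actual API, and each client's model is simplified to a list of equally shaped weight vectors.

```python
# Minimal federated-averaging sketch (illustrative, not Fedle's API).
# Each client's parameters are a list of layers; each layer is a list
# of floats. The server averages corresponding weights element-wise.

def average_parameters(client_params):
    """Return the element-wise average of all clients' parameter lists."""
    num_clients = len(client_params)
    num_layers = len(client_params[0])
    averaged = []
    for layer in range(num_layers):
        layer_sum = [0.0] * len(client_params[0][layer])
        for params in client_params:
            for i, weight in enumerate(params[layer]):
                layer_sum[i] += weight
        averaged.append([total / num_clients for total in layer_sum])
    return averaged

# Two clients, one "layer" of three weights each:
clients = [[[1.0, 2.0, 3.0]], [[3.0, 4.0, 5.0]]]
print(average_parameters(clients))  # [[2.0, 3.0, 4.0]]
```

After each round, the server would broadcast the averaged parameters back to every client, which is what lets all 10 clients benefit from each other's local training without sharing raw data.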
After all rounds of training are completed and the server has aggregated the parameters shared by all the clients, the final, best-performing model is returned to the user.
Yes, that's it!
All you have to do is install the fedle library and the dependencies listed in the requirements.txt file.
Tracks Applied (6)
Polygon
Replit
Quine
Vineet