This repository provides code for communication-efficient decentralized ML training (both deep learning, compatible with PyTorch, and traditional convex machine-learning models).
We provide code for the main experiments in the papers listed below. Please refer to the folders `convex_code` and `dl_code` for more details.
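To illustrate the core idea behind compressed-communication gossip, here is a minimal NumPy sketch of a gossip-averaging loop in which each node transmits only a top-k-compressed correction to a publicly maintained copy of its state, in the spirit of the CHOCO-Gossip scheme from the first paper below. The function names, the top-k compressor, and the step size `gamma` are illustrative choices, not the repository's actual API.

```python
import numpy as np

def topk_compress(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    q = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    q[idx] = x[idx]
    return q

def choco_gossip(X, W, k, gamma=0.05, steps=3000):
    """Sketch of gossip averaging with compressed communication.

    X: (n_nodes, dim) array of local vectors.
    W: doubly stochastic mixing matrix matching the network topology.
    Each node only ever sends the compressed difference between its
    current state and its publicly known copy X_hat, so per-round
    communication is k entries instead of dim.
    """
    X = X.astype(float).copy()
    X_hat = np.zeros_like(X)  # copies of the states known to all neighbors
    n = len(X)
    for _ in range(steps):
        # each node compresses and broadcasts its correction
        Q = np.array([topk_compress(X[i] - X_hat[i], k) for i in range(n)])
        X_hat = X_hat + Q
        # local mixing step uses only the public (compressed) copies
        X = X + gamma * (W - np.eye(n)) @ X_hat
    return X
```

Because `W` is doubly stochastic, the mixing step preserves the global average exactly, while the iterates contract toward consensus at a rate governed by the compression ratio `k/dim` and the spectral gap of `W`.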
References
If you use this code, please cite the following papers:
@inproceedings{koloskova2019choco,
title = {Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication},
author = {Anastasia Koloskova and Sebastian U. Stich and Martin Jaggi},
booktitle = {ICML 2019 - Proceedings of the 36th International Conference on Machine Learning},
url = {https://proceedings.mlr.press/v97/koloskova19a.html},
publisher = {PMLR},
volume = {97},
pages = {3479--3487},
year = {2019}
}
and
@inproceedings{koloskova2020decentralized,
title={Decentralized Deep Learning with Arbitrary Communication Compression},
author={Anastasia Koloskova* and Tao Lin* and Sebastian U Stich and Martin Jaggi},
booktitle={ICLR 2020 - International Conference on Learning Representations},
year={2020},
url={https://openreview.net/forum?id=SkgGCkrKvH}
}