Data Compression in Distributed Learning
Speaker: Prof. Ming Yan (Michigan State University)
Time: Thursday, June 10, 2021, 13:30-14:30
Venue: Conference Room 200-9, Gongshang Building
Abstract: Large-scale machine learning models are trained with parallel stochastic gradient descent algorithms on distributed or decentralized systems. As the number of nodes and the model dimension scale up, the communication required for gradient aggregation and model synchronization becomes the major obstacle to efficient learning. In this talk, I will introduce several ways to compress the transferred data and reduce the overall communication, so that this obstacle can be substantially mitigated.
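The talk does not specify which compression schemes are covered, but a common example of the idea is top-k gradient sparsification with error feedback: each worker transmits only the k largest-magnitude gradient entries and carries the discarded remainder into the next round. A minimal sketch (function names and the NumPy setup are illustrative, not from the talk):

```python
import numpy as np

def topk_compress(grad, k):
    """Keep the k largest-magnitude entries of the gradient and zero the
    rest; return the transmitted (indices, values) plus the residual that
    error-feedback schemes add back into the next local gradient."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    values = flat[idx]
    compressed = np.zeros_like(flat)
    compressed[idx] = values
    residual = (flat - compressed).reshape(grad.shape)
    return idx, values, residual

def decompress(idx, values, shape):
    """Rebuild a dense gradient from the transmitted (indices, values)."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# Example: a worker compresses a 10-dimensional gradient to 3 entries,
# sending 3 index/value pairs instead of 10 dense coordinates.
rng = np.random.default_rng(0)
g = rng.standard_normal(10)
idx, vals, res = topk_compress(g, k=3)
g_hat = decompress(idx, vals, g.shape)
```

Note that `g_hat + res` equals the original gradient exactly, which is why the residual can be accumulated locally and reinjected later without losing information.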
All faculty members and students are welcome to attend.
Contact: Prof. Wangtao Lu (wangtaolu@zju.edu.cn)