
太阳成集团tyc411

Data Compression in Distributed Learning

Source: 太阳成集团tyc411    Published: 2021-06-10

Speaker: Prof. Ming Yan (Michigan State University)

Time: Thursday, June 10, 2021, 13:30-14:30

Venue: Conference Room 200-9, Gongshang Building

Abstract: Large-scale machine learning models are trained with parallel stochastic gradient descent algorithms on distributed or decentralized systems. As the number of nodes and the model's dimension scale up, the communication required for gradient aggregation and model synchronization becomes the major obstacle to efficient learning. In this talk, I will introduce several ways to compress the transferred data and reduce the overall communication, so that these obstacles can be greatly mitigated.
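The abstract describes compressing the gradients that workers transmit during distributed training. As a minimal illustrative sketch (not the speaker's specific method), one common compressor is top-k sparsification: each worker sends only the k largest-magnitude gradient entries and keeps the discarded remainder as a local residual for error feedback. The function names below are hypothetical, chosen for this example:

```python
import numpy as np

def topk_compress(grad, k):
    """Keep the k largest-magnitude entries of the gradient.

    Returns (indices, values) -- the sparse message actually sent over
    the network -- plus the residual retained locally for error feedback.
    """
    idx = np.argsort(np.abs(grad))[-k:]   # indices of the top-k entries
    values = grad[idx]
    residual = grad.copy()
    residual[idx] = 0.0                   # dropped mass, added back next round
    return idx, values, residual

def topk_decompress(idx, values, dim):
    """Rebuild a dense gradient from the sparse message."""
    out = np.zeros(dim)
    out[idx] = values
    return out

# Example: compress a 6-dimensional gradient down to 2 entries.
g = np.array([0.1, -2.0, 0.05, 3.0, -0.2, 0.5])
idx, vals, res = topk_compress(g, 2)
g_hat = topk_decompress(idx, vals, len(g))
```

Sending only `k` index-value pairs instead of the full dense vector cuts per-round communication from O(d) to O(k); the residual ensures that the dropped coordinates are not lost but accumulated and transmitted in later rounds.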


All faculty and students are welcome to attend.

Contact: Research Fellow Wangtao Lu (wangtaolu@zju.edu.cn)
