IEEE Journal on Selected Areas in Communications

Laplacian Matrix Sampling for Communication-Efficient Decentralized Learning


Abstract

We consider the problem of training a given machine learning model by decentralized parallel stochastic gradient descent over training data distributed across multiple nodes, a setting that arises in many application scenarios. Although extensive studies have improved communication efficiency by optimizing what to communicate between nodes (e.g., model compression) and how often to communicate, recent studies have shown that it is also important to customize the communication patterns between each pair of nodes, which is the focus of this work. To this end, we propose a framework and efficient algorithms that design the communication patterns through Laplacian matrix sampling (LMS), which governs not only which nodes communicate with each other but also what weights the communicated parameters carry during parameter aggregation. Our framework minimizes the total cost incurred until convergence under any given cost model that is additive over iterations, with a focus on minimizing the communication cost. Besides achieving theoretically guaranteed performance in the special case of additive homogeneous communication costs, our solution also achieves superior performance under a variety of network settings and cost models in experiments based on real datasets and topologies, saving 24%–50% of the cost compared to the state-of-the-art design without compromising the quality of the trained model.
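To make the role of the Laplacian concrete, here is a hedged sketch (not the paper's LMS algorithm) of the aggregation step it controls: each round, nodes average parameters with neighbors via a mixing matrix W = I − αL built from a graph Laplacian L, whose sparsity pattern fixes which node pairs communicate and whose entries fix the aggregation weights. The ring topology, step size α, and problem sizes below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's LMS algorithm): the Laplacian L of the
# communication graph determines both which node pairs exchange parameters
# (its sparsity pattern) and the weights used when aggregating them, via
# the mixing matrix W = I - alpha * L.

def ring_laplacian(n):
    """Laplacian of an n-node ring: degree matrix minus adjacency matrix."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

n_nodes, dim = 4, 3
L = ring_laplacian(n_nodes)
alpha = 0.3                      # illustrative mixing step
W = np.eye(n_nodes) - alpha * L  # doubly stochastic: rows/columns sum to 1

rng = np.random.default_rng(0)
x = rng.normal(size=(n_nodes, dim))   # one parameter vector per node
target = x.mean(axis=0)               # mixing preserves the global average

# Repeated mixing drives every node to consensus on the average; full
# decentralized SGD would interleave a local gradient step each round,
# and LMS would sample a (possibly different) Laplacian per iteration.
for _ in range(100):
    x = W @ x

print(np.allclose(x, target, atol=1e-8))  # every node now holds the average
```

Sampling sparser Laplacians trades slower mixing for fewer pairwise exchanges per iteration, which is the cost trade-off the framework optimizes.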
