IEEE International Conference on Cluster Computing

Optimizing Caching DSM for Distributed Software Speculation



Abstract

Clusters with caching DSMs deliver programmability and performance by supporting shared-memory programming and tolerating remote I/O latencies via caching. The input to a data-parallel program is partitioned across the cluster while the DSM transparently fetches and caches remote data as needed. Irregular applications, however, are challenging to parallelize because the input-related data dependences that manifest at runtime require the use of speculation for correct parallel execution. By speculating that there are no input-related cross-iteration dependences, private copies of the input can be processed by parallelizing the loop; the absence of dependences is then validated before the computed results are committed. We show that while caching helps tolerate long communication latencies in irregular data-parallel applications, using cached values in a computation can lead to misspeculation, and thus aggressive caching can degrade performance by increasing the misspeculation rate. We present optimizations for distributed speculation on caching-based DSMs that decrease the cost of the misspeculation check and speed up the re-execution of misspeculated computations. Optimized distributed speculation achieves speedups over unoptimized speculation of 2.24x for coloring, 1.71x for connected components, 1.88x for community detection, 1.32x for shortest path, and 1.74x for pagerank.
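The speculate-validate-commit cycle the abstract describes can be illustrated with a minimal single-process sketch. This is not the paper's DSM system: the versioned `store`, the `read`/`commit` helpers, and the neighbor-based update are hypothetical stand-ins for speculative loop iterations that read possibly-stale (cached) values, log what they read, and re-execute when validation detects a cross-iteration dependence.

```python
# Versioned shared store standing in for DSM-cached data: key -> (value, version).
store = {i: (0, 0) for i in range(6)}

def read(key, readset):
    """Read a shared value, logging the version seen (the speculation log)."""
    val, ver = store[key]
    readset[key] = ver
    return val

def run_iteration(i, neighbors):
    """One speculative loop iteration: compute into a private write buffer."""
    readset, writes = {}, {}
    writes[i] = 1 + max((read(j, readset) for j in neighbors[i]), default=0)
    return readset, writes

def commit(readset, writes):
    """Misspeculation check: every value read must still be at the same
    version; otherwise a cross-iteration dependence was violated."""
    if any(store[k][1] != v for k, v in readset.items()):
        return False  # misspeculation: caller re-executes the iteration
    for k, val in writes.items():
        store[k] = (val, store[k][1] + 1)  # bump version on commit
    return True

# A small dependence graph: iteration i reads the values of neighbors[i].
neighbors = {0: [], 1: [0], 2: [0], 3: [1, 2], 4: [3], 5: []}

# Speculative pass: run all iterations against the initial snapshot,
# then validate-and-commit in order, re-executing any that fail.
results = {i: run_iteration(i, neighbors) for i in neighbors}
for i in neighbors:
    while not commit(*results[i]):
        results[i] = run_iteration(i, neighbors)  # re-execute with fresh values
```

Iterations whose neighbors were updated by earlier commits fail validation and re-execute against current values, mirroring how stale cached reads in the paper's setting trigger misspeculation and recomputation.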

