Electronic Notes in Theoretical Computer Science

Formal Verification of Differential Privacy for Interactive Systems (Extended Abstract)



Abstract

Differential privacy is a promising approach to privacy-preserving data analysis with a well-developed theory for functions. Despite recent work on implementing systems that aim to provide differential privacy, the problem of formally verifying that these systems have differential privacy has not been adequately addressed. We develop a formal probabilistic automaton model of differential privacy for systems by adapting prior work on differential privacy for functions. We present the first sound verification technique for proving differential privacy of interactive systems. The technique is based on a form of probabilistic bisimulation relation. The novelty lies in the way we track quantitative privacy leakage bounds using a relation family instead of a single relation. We illustrate the proof technique on a representative automaton motivated by PINQ, an implemented system that is intended to provide differential privacy. Surprisingly, our analysis yields a privacy leakage bound of 2tε rather than tε when ε-differentially private functions are called t times. The extra leakage arises from accounting for bounded memory constraints of real computers.
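The leakage arithmetic in the abstract can be illustrated with a small sketch. This is not the paper's automaton model or PINQ's API; `PrivacyBudgetTracker`, `noisy_sum`, and both bound properties are hypothetical names for a standard Laplace-mechanism query with sequential-composition bookkeeping, contrasting the idealized t·ε bound with the 2t·ε bound the paper derives under bounded-memory assumptions.

```python
import random


def laplace_noise(scale: float) -> float:
    """Sample centered Laplace noise as the difference of two Exp(1) variates."""
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))


class PrivacyBudgetTracker:
    """Track cumulative privacy leakage over repeated epsilon-DP queries."""

    def __init__(self, epsilon_per_call: float):
        self.epsilon_per_call = epsilon_per_call
        self.calls = 0

    def noisy_sum(self, data, sensitivity: float = 1.0) -> float:
        """Answer a sum query with Laplace noise scaled to sensitivity/epsilon."""
        self.calls += 1
        return sum(data) + laplace_noise(sensitivity / self.epsilon_per_call)

    @property
    def standard_bound(self) -> float:
        """Sequential composition for idealized functions: t * epsilon."""
        return self.calls * self.epsilon_per_call

    @property
    def bounded_memory_bound(self) -> float:
        """The paper's bound once finite machine memory is modeled: 2t * epsilon."""
        return 2 * self.calls * self.epsilon_per_call
```

After t = 3 queries at ε = 0.5 per call, `standard_bound` reports 1.5 while `bounded_memory_bound` reports 3.0, mirroring the 2tε vs tε gap highlighted in the abstract.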
