This generalizes to group privacy, since the group size can be viewed as the Hamming distance <math>h</math> between <math>A</math> and <math>B</math> (where <math>A</math> contains the group and <math>B</math> does not). In this case <math>M\circ T</math> is <math>(\epsilon \times c \times h)</math>-differentially private.
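The linear scaling of the guarantee with group size can be sketched with a simple counting query under the Laplace mechanism. This is a minimal illustrative sketch, not a production implementation; the function names are hypothetical:

```python
import numpy as np

def laplace_count(data, predicate, epsilon, rng=None):
    """Release a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1, so Laplace noise with
    scale 1/epsilon suffices for epsilon-DP on a single record.
    """
    if rng is None:
        rng = np.random.default_rng()
    true_count = sum(1 for row in data if predicate(row))
    return true_count + rng.laplace(scale=1.0 / epsilon)

def group_private_count(data, predicate, epsilon, group_size):
    """Protect a whole group of size h at privacy level epsilon.

    By group privacy, an (epsilon/h)-DP mechanism is epsilon-DP with
    respect to changing all h records of the group at once.
    """
    return laplace_count(data, predicate, epsilon / group_size)
```

With `group_size = 1` this reduces to ordinary <math>\epsilon</math>-differential privacy; protecting a group of <math>h</math> members at the same <math>\epsilon</math> requires <math>h</math> times more noise, which illustrates the utility cost of group privacy.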
==Adoption of differential privacy in real-world applications==
{{see also|Implementations of differentially private analyses}}
Several uses of differential privacy in practice are known to date:
*2008: [[United States Census Bureau|U.S. Census Bureau]], for showing commuting patterns.<ref name="MachanavajjhalaKAGV08" />
*2014: [[Google]]'s RAPPOR, for telemetry such as learning statistics about unwanted software hijacking users' settings.<ref name="RAPPOR" /><ref>{{Citation|title=google/rappor|date=2021-07-15|url=https://github.com/google/rappor|publisher=GitHub}}</ref>
*2015: Google, for sharing historical traffic statistics.<ref name="Eland" />
*2016: [[Apple Inc.|Apple]] announced its intention to use differential privacy in [[iOS 10]] to improve its [[Intelligent personal assistant]] technology.<ref>{{cite web|title=Apple - Press Info - Apple Previews iOS 10, the Biggest iOS Release Ever|url=https://www.apple.com/pr/library/2016/06/13Apple-Previews-iOS-10-The-Biggest-iOS-Release-Ever.html|website=Apple|access-date=16 June 2016}}</ref>
*2017: Microsoft, for telemetry in Windows.<ref name="DpWinTelemetry" />
*2019: Privitar Lens is an API using differential privacy.<ref>{{cite web|title=Privitar Lens|url=https://www.privitar.com/privitar-lens|access-date=20 February 2018}}</ref>
*2020: LinkedIn, for advertiser queries.<ref name="DpLinkedIn" />
==Public purpose considerations==
There are several public purpose considerations regarding differential privacy that are important to consider, especially for policymakers and policy-focused audiences interested in the social opportunities and risks of the technology:<ref>{{Cite web|title=Technology Factsheet: Differential Privacy|url=https://www.belfercenter.org/publication/technology-factsheet-differential-privacy|access-date=2021-04-12|website=Belfer Center for Science and International Affairs|language=en}}</ref>
*'''Data Utility & Accuracy.''' The main concern with differential privacy is the tradeoff between data utility and individual privacy. If the privacy loss parameter is set to favor utility, the privacy benefits are lowered (less “noise” is injected into the system); if the privacy loss parameter is set to favor strong privacy, the accuracy and utility of the dataset are lowered (more “noise” is injected into the system). It is important for policymakers to consider the tradeoffs posed by differential privacy in order to help set appropriate best practices and standards around the use of this privacy-preserving technique, especially considering the diversity in organizational use cases. It is worth noting, though, that decreased accuracy and utility is a common issue among all statistical disclosure limitation methods and is not unique to differential privacy. What is unique, however, is how policymakers, researchers, and implementers can consider mitigating the risks presented through this tradeoff.
*'''Data Privacy & Security.''' Differential privacy provides a quantified measure of privacy loss with an upper bound, and allows curators to choose the explicit tradeoff between privacy and accuracy. It is robust to still-unknown privacy attacks. However, it encourages greater data sharing, which, if done poorly, increases privacy risk. Differential privacy implies that privacy is protected, but this depends very much on the privacy loss parameter chosen and may instead lead to a false sense of security. Finally, though it is designed to be robust against unforeseen future privacy attacks, a countermeasure that we cannot predict may still be devised.
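The utility side of this tradeoff can be made concrete with a small sketch, assuming a sensitivity-1 counting query released via the Laplace mechanism: the noise scale is 1/ε, so a stronger privacy guarantee (smaller ε) directly means noisier answers.

```python
import numpy as np

# For a sensitivity-1 query, the Laplace mechanism adds Laplace(0, 1/epsilon)
# noise. The standard deviation of Laplace(0, b) is b * sqrt(2), so the
# typical error of the released statistic grows as epsilon shrinks.
def expected_noise_std(epsilon):
    return np.sqrt(2) / epsilon

for eps in (0.1, 1.0, 10.0):
    print(f"epsilon={eps}: typical error ~ {expected_noise_std(eps):.2f}")
```

For a count over millions of records an error on the order of 14 (ε = 0.1) is negligible, while for a count over a small subgroup it can swamp the signal, which is why the appropriate ε depends heavily on the use case.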
==See also==
*[[Quasi-identifier]]
*[[Exponential mechanism (differential privacy)]] – a technique for designing differentially private algorithms
*[[k-anonymity]]
*[[Differentially private analysis of graphs]]
*[[Protected health information]]
==References==
{{Reflist|refs=
<ref name="DKMMN06">
Dwork, Cynthia, Krishnaram Kenthapadi, Frank McSherry, Ilya Mironov, and Moni Naor. "Our data, ourselves: Privacy via distributed noise generation." In Advances in Cryptology – EUROCRYPT 2006, pp. 486–503. Springer Berlin Heidelberg, 2006.</ref>
<!-- unused refs <ref name="CABP13">
Chatzikokolakis, Konstantinos, Miguel E. Andrés, Nicolás Emilio Bordenabe, and Catuscia Palamidessi. "Broadening the scope of Differential Privacy using metrics." In Privacy Enhancing Technologies, pp. 82–102. Springer Berlin Heidelberg, 2013.</ref>
<ref name="HRW11">
Hall, Rob, Alessandro Rinaldo, and Larry Wasserman. "Random differential privacy." arXiv preprint arXiv:1112.2680 (2011).</ref>-->
<ref name="MachanavajjhalaKAGV08">
Ashwin Machanavajjhala, Daniel Kifer, John M. Abowd, Johannes Gehrke, and Lars Vilhuber. "Privacy: Theory meets Practice on the Map". In Proceedings of the 24th International Conference on Data Engineering (ICDE), 2008.</ref>
<ref name="RAPPOR">
Úlfar Erlingsson, Vasyl Pihur, Aleksandra Korolova. [https://dl.acm.org/doi/10.1145/2660267.2660348 "RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response".] In Proceedings of the 21st ACM Conference on Computer and Communications Security (CCS), 2014. {{doi|10.1145/2660267.2660348}}</ref>
<ref name="DMNS06">
[https://link.springer.com/chapter/10.1007%2F11681878_14 Calibrating Noise to Sensitivity in Private Data Analysis] by Cynthia Dwork, Frank McSherry, Kobbi Nissim, Adam Smith. In Theory of Cryptography Conference (TCC), Springer, 2006. {{doi|10.1007/11681878_14}}. The [https://journalprivacyconfidentiality.org/index.php/jpc/article/view/405 full version] appears in Journal of Privacy and Confidentiality, 7 (3), 17–51. {{doi|10.29012/jpc.v7i3.405}}</ref>
<ref name="PINQ">
[http://research.microsoft.com/pubs/80218/sigmod115-mcsherry.pdf Privacy integrated queries: an extensible platform for privacy-preserving data analysis] by Frank D. McSherry. In Proceedings of the 35th SIGMOD International Conference on Management of Data (SIGMOD), 2009. {{doi|10.1145/1559845.1559850}}</ref>
<ref name="Dwork, ICALP 2006">
[http://research.microsoft.com/pubs/64346/dwork.pdf Differential Privacy] by Cynthia Dwork, International Colloquium on Automata, Languages and Programming (ICALP) 2006, p.&nbsp;1–12. {{doi|10.1007/11787006_1}}</ref>
<ref name="DPBook">
[http://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf The Algorithmic Foundations of Differential Privacy] by Cynthia Dwork and Aaron Roth. Foundations and Trends in Theoretical Computer Science. Vol. 9, no. 3–4, pp.&nbsp;211–407, Aug. 2014. {{doi|10.1561/0400000042}}</ref>
<ref name="Eland">[https://europe.googleblog.com/2015/11/tackling-urban-mobility-with-technology.html Tackling Urban Mobility with Technology] by Andrew Eland. Google Policy Europe Blog, Nov 18, 2015.</ref>
<ref name="DpWinTelemetry">[https://www.microsoft.com/en-us/research/publication/collecting-telemetry-data-privately/ Collecting telemetry data privately] by Bolin Ding, Janardhan Kulkarni, Sergey Yekhanin. NIPS 2017.</ref>
<ref name="DpLinkedIn">[https://arxiv.org/abs/2002.05839 LinkedIn's Audience Engagements API: A Privacy Preserving Data Analytics System at Scale] by Ryan Rogers, Subbu Subramaniam, Sean Peng, David Durfee, Seunghyun Lee, Santosh Kumar Kancha, Shraddha Sahay, Parvez Ahammad. arXiv:2002.05839.</ref>
<ref name="DP19">[https://arxiv.org/abs/1906.01337 SoK: Differential Privacies] by Damien Desfontaines, Balázs Pejó. 2019.</ref>
}}
==Further reading==
*[https://desfontain.es/privacy/index.html A reading list on differential privacy]
*[https://journalprivacyconfidentiality.org/index.php/jpc/article/view/404 Abowd, John. 2017. “How Will Statistical Agencies Operate When All Data Are Private?”. Journal of Privacy and Confidentiality 7 (3).] {{doi|10.29012/jpc.v7i3.404}} ([https://www2.census.gov/cac/sac/meetings/2017-09/role-statistical-agency.pdf slides])
*[http://www.jetlaw.org/wp-content/uploads/2018/12/4_Wood_Final.pdf "Differential Privacy: A Primer for a Non-technical Audience"], Kobbi Nissim, Thomas Steinke, Alexandra Wood, [[Micah Altman]], Aaron Bembenek, Mark Bun, Marco Gaboardi, David R. O’Brien, and Salil Vadhan, Harvard Privacy Tools Project, February 14, 2018
*Dinur, Irit and Kobbi Nissim. 2003. Revealing information while preserving privacy. In Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems (PODS '03). ACM, New York, NY, USA, 202–210. {{doi|10.1145/773153.773173}}.
*Dwork, Cynthia, Frank McSherry, Kobbi Nissim, and Adam Smith. 2006. Calibrating Noise to Sensitivity in Private Data Analysis. In Halevi, S. & Rabin, T. (Eds.), Theory of Cryptography: Third Theory of Cryptography Conference, TCC 2006, New York, NY, USA, March 4–7, 2006. Proceedings, Springer Berlin Heidelberg, 265–284, {{doi|10.1007/11681878_14}}.
*Dwork, Cynthia. 2006. Differential Privacy, 33rd International Colloquium on Automata, Languages and Programming, part II (ICALP 2006), Springer Verlag, 4052, 1–12, {{ISBN|3-540-35907-9}}.
*Dwork, Cynthia and Aaron Roth. 2014. The Algorithmic Foundations of Differential Privacy. Foundations and Trends in Theoretical Computer Science. Vol. 9, Nos. 3–4, 211–407, {{doi|10.1561/0400000042}}.
*Machanavajjhala, Ashwin, Daniel Kifer, John M. Abowd, Johannes Gehrke, and Lars Vilhuber. 2008. Privacy: Theory Meets Practice on the Map, International Conference on Data Engineering (ICDE) 2008: 277–286, {{doi|10.1109/ICDE.2008.4497436}}.
*Dwork, Cynthia and Moni Naor. 2010. On the Difficulties of Disclosure Prevention in Statistical Databases or The Case for Differential Privacy, Journal of Privacy and Confidentiality: Vol. 2: Iss. 1, Article 8. Available at: http://repository.cmu.edu/jpc/vol2/iss1/8.
*Kifer, Daniel and Ashwin Machanavajjhala. 2011. No free lunch in data privacy. In Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data (SIGMOD '11). ACM, New York, NY, USA, 193–204. {{doi|10.1145/1989323.1989345}}.
*Erlingsson, Úlfar, Vasyl Pihur and Aleksandra Korolova. 2014. RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response. In Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security (CCS '14). ACM, New York, NY, USA, 1054–1067. {{doi|10.1145/2660267.2660348}}.
*Abowd, John M. and Ian M. Schmutte. 2017. Revisiting the economics of privacy: Population statistics and confidentiality protection as public goods. Labor Dynamics Institute, Cornell University, at https://digitalcommons.ilr.cornell.edu/ldi/37/
*Abowd, John M. and Ian M. Schmutte. Forthcoming. An Economic Analysis of Privacy Protection and Statistical Accuracy as Social Choices. American Economic Review, {{arxiv|1808.06303}}
*Apple, Inc. 2016. Apple previews iOS 10, the biggest iOS release ever. Press Release (June 13). https://www.apple.com/newsroom/2016/06/apple-previews-ios-10-biggest-ios-release-ever.html.
*Ding, Bolin, Janardhan Kulkarni, and Sergey Yekhanin. 2017. Collecting Telemetry Data Privately, NIPS 2017.
*http://www.win-vector.com/blog/2015/10/a-simpler-explanation-of-differential-privacy/
*Ryffel, Theo, Andrew Trask, et al. [[arxiv:1811.04017|"A generic framework for privacy preserving deep learning"]]
==External links==
*[https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/dwork.pdf Differential Privacy] by Cynthia Dwork, ICALP July 2006.
*[http://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf The Algorithmic Foundations of Differential Privacy] by Cynthia Dwork and Aaron Roth, 2014.
*[http://research.microsoft.com/apps/pubs/default.aspx?id=74339 Differential Privacy: A Survey of Results] by Cynthia Dwork, Microsoft Research, April 2008
*[http://video.ias.edu/csdm/dynamicdata Privacy of Dynamic Data: Continual Observation and Pan Privacy] by Moni Naor, Institute for Advanced Study, November 2009
*[http://simons.berkeley.edu/talks/katrina-ligett-2013-12-11 Tutorial on Differential Privacy] by [[Katrina Ligett]], California Institute of Technology, December 2013
*[http://www.cerias.purdue.edu/news_and_events/events/security_seminar/details/index/j9cvs3as2h1qds1jrdqfdc3hu8 A Practical Beginner's Guide To Differential Privacy] by Christine Task, Purdue University, April 2012
*[https://commondataproject.org/blog/2011/04/27/the-cdp-private-map-maker-v0-2/ Private Map Maker v0.2] on the Common Data Project blog
*[https://research.googleblog.com/2014/10/learning-statistics-with-privacy-aided.html Learning Statistics with Privacy, aided by the Flip of a Coin] by Úlfar Erlingsson, Google Research Blog, October 2014
*[https://www.belfercenter.org/publication/technology-factsheet-differential-privacy Technology Factsheet: Differential Privacy] by Raina Gandhi and Amritha Jayanti, Belfer Center for Science and International Affairs, Fall 2020
[[Category:Theory of cryptography]]
[[Category:Information privacy]]
<noinclude>
<small>This page was moved from [[wikipedia:en:Differential privacy]]. Its edit history can be viewed at [[差分隐私/edithistory]]</small></noinclude>
 