</center>
where the probability is taken over the randomness used by the algorithm, then the algorithm <math>\mathcal{A}</math> is said to provide <math>\epsilon</math>-differential privacy.<ref name="DPBook">[http://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf The Algorithmic Foundations of Differential Privacy] by Cynthia Dwork and Aaron Roth. Foundations and Trends in Theoretical Computer Science. Vol. 9, no. 3–4, pp.&nbsp;211–407, Aug. 2014.</ref>
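To make the definition concrete, the classic randomized-response mechanism is a minimal illustration (this sketch and its function names are our own, not code from the cited text): each respondent answers truthfully with probability <math>e^{\epsilon}/(e^{\epsilon}+1)</math>, so the probability of any given output changes by a factor of at most <math>e^{\epsilon}</math> when the true input flips.

```python
import math
import random

def randomized_response(truth: bool, epsilon: float) -> bool:
    """Answer truthfully with probability e^eps / (e^eps + 1).

    For two neighbouring inputs (truth flipped), the probability of
    any given output differs by a factor of at most e^eps, which is
    exactly the epsilon-differential-privacy condition.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return truth if random.random() < p_truth else not truth

# With epsilon = ln(3), respondents tell the truth 75% of the time,
# yet any single answer reveals little about the respondent's true value.
random.seed(42)
answers = [randomized_response(True, math.log(3)) for _ in range(10000)]
```

Aggregating many such noisy answers still allows an analyst to estimate the population proportion, which is why randomized response is often used as the introductory example of a differentially private mechanism.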
===Sensitivity===
Let <math>d</math> be a positive integer, <math>\mathcal{D}</math> a collection of datasets, and <math>f \colon \mathcal{D} \rightarrow \mathbb{R}^d</math> a function. The sensitivity of the function,<ref name="DMNS06">[https://link.springer.com/chapter/10.1007%2F11681878_14 Calibrating Noise to Sensitivity in Private Data Analysis] by Cynthia Dwork, Frank McSherry, Kobbi Nissim, Adam Smith. In Theory of Cryptography Conference (TCC), Springer, 2006. The [https://journalprivacyconfidentiality.org/index.php/jpc/article/view/405 full version] appears in Journal of Privacy and Confidentiality, 7 (3), 17–51.</ref> denoted <math>\Delta f</math>, is defined by
    
<math>\Delta f=\max \lVert f(D_1)-f(D_2) \rVert_1,</math>
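As a concrete illustration (a minimal sketch with hypothetical helper names, assuming datasets are represented as Python lists of records), a counting query changes by at most 1 when a single record is removed, so its sensitivity is 1:

```python
def count_query(db):
    """f(D): the number of records in the dataset."""
    return len(db)

def empirical_sensitivity(db):
    """Largest change in the counting query when any one record is removed.

    This only probes neighbours reachable by deleting a record from db,
    but for a counting query it matches the analytical value Delta f = 1.
    """
    return max(abs(count_query(db) - count_query(db[:i] + db[i + 1:]))
               for i in range(len(db)))

db = [("alice", 1), ("bob", 0), ("carol", 1)]
delta_f = empirical_sensitivity(db)  # 1 for a counting query
```

For vector-valued queries the same idea applies coordinate-wise, with the <math>\ell_1</math> norm measuring the total change across all coordinates.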
which is at most <math>e^{\frac{|f(D_{1})-f(D_{2})|}{\lambda}}\leq e^{\frac{\Delta(f)}{\lambda}}\,\!</math>. Taking <math>\frac{\Delta(f)}{\lambda}\,\!</math> to be the privacy factor <math>\epsilon\,\!</math>, the mechanism <math>\mathcal{T}\,\!</math> satisfies the differential-privacy condition (as can be seen from the [[#&epsilon;-differential privacy|definition above]]). Applying this idea to our diabetes example, it follows that for <math>\mathcal{A}\,\!</math> to be an <math>\epsilon\,\!</math>-differentially private algorithm we need <math>\lambda=1/\epsilon\,\!</math>. Although we used Laplace noise here, other forms of noise, such as Gaussian noise, can also be used, though this may require a slight relaxation of the definition of differential privacy.<ref name="Dwork, ICALP 2006">[http://research.microsoft.com/pubs/64346/dwork.pdf Differential Privacy] by Cynthia Dwork, International Colloquium on Automata, Languages and Programming (ICALP) 2006, p.&nbsp;1–12.</ref>
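The Laplace mechanism described above can be sketched as follows (a minimal illustration using inverse-CDF sampling; the function names are our own, and for a counting query we take <math>\Delta f = 1</math>):

```python
import math
import random

def laplace_noise(scale):
    """One draw from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5              # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value + Lap(Delta f / epsilon), an epsilon-DP answer."""
    scale = sensitivity / epsilon          # lambda = Delta f / epsilon
    return true_value + laplace_noise(scale)

# Counting query (Delta f = 1) released with privacy factor epsilon = 0.5,
# i.e. noise scale lambda = 1 / epsilon = 2.
random.seed(0)
noisy_count = laplace_mechanism(true_value=42, sensitivity=1.0, epsilon=0.5)
```

A smaller <math>\epsilon</math> means a larger noise scale and therefore stronger privacy at the cost of a less accurate released value.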
===Stable transformations===
For any two databases <math>A,B</math>, a transformation <math>T</math> is <math>c</math>-stable if the Hamming distance between <math>T(A)</math> and <math>T(B)</math> is at most <math>c</math> times the Hamming distance between <math>A</math> and <math>B</math>. Theorem 2 of <ref name="PINQ">[http://research.microsoft.com/pubs/80218/sigmod115-mcsherry.pdf Privacy integrated queries: an extensible platform for privacy-preserving data analysis] by Frank D. McSherry. In Proceedings of the 35th SIGMOD International Conference on Management of Data (SIGMOD), 2009.</ref> states that if a mechanism <math>M</math> is <math>\epsilon</math>-differentially private, then the composite mechanism <math>M\circ T</math> is <math>(\epsilon \times c)</math>-differentially private.
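A toy sketch of this composition rule (our own hypothetical example, not code from the PINQ paper): a transformation that duplicates every record is 2-stable, so running an <math>\epsilon</math>-DP mechanism on its output yields a <math>2\epsilon</math>-DP mechanism overall.

```python
def duplicate_records(db):
    """A 2-stable transformation T: changing one input record changes
    at most two output records, so the Hamming distance at most doubles."""
    return [row for row in db for _ in range(2)]

def effective_epsilon(epsilon, stability):
    """Privacy guarantee of M composed with T, when M is epsilon-DP
    and T is c-stable (Theorem 2 of the PINQ paper)."""
    return epsilon * stability

out = duplicate_records([("alice", 1), ("bob", 0)])
budget = effective_epsilon(0.5, 2)   # the composed mechanism is 1.0-DP
```

This is what lets systems such as PINQ analyze privacy through query transformations: each preprocessing step contributes a multiplicative stability factor to the overall privacy budget.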
==Applications of differential privacy in the real world==

Several uses of differential privacy in practice are known:
*2008: The U.S. Census Bureau, for showing commuting patterns.<ref name="MachanavajjhalaKAGV08">Ashwin Machanavajjhala, Daniel Kifer, John M. Abowd, Johannes Gehrke, and Lars Vilhuber. "Privacy: Theory meets Practice on the Map". In Proceedings of the 24th International Conference on Data Engineering (ICDE), 2008.</ref>
*2014: Google's RAPPOR, for telemetry such as learning statistics about unwanted software hijacking users' settings.<ref name="RAPPOR">Úlfar Erlingsson, Vasyl Pihur, Aleksandra Korolova. [https://dl.acm.org/doi/10.1145/2660267.2660348 "RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response".] In Proceedings of the 21st ACM Conference on Computer and Communications Security (CCS), 2014.</ref><ref name=":12">{{Citation|title=google/rappor|date=2021-07-15|url=https://github.com/google/rappor|publisher=GitHub}}</ref>
*2015: Google, for sharing historical traffic statistics.<ref name="Eland">[https://europe.googleblog.com/2015/11/tackling-urban-mobility-with-technology.html Tackling Urban Mobility with Technology] by Andrew Eland. Google Policy Europe Blog, Nov 18, 2015.</ref>
*2016: Apple announced its intention to use differential privacy in iOS 10 to improve its intelligent personal assistant technology.<ref name=":13">{{cite web|title=Apple - Press Info - Apple Previews iOS 10, the Biggest iOS Release Ever|url=https://www.apple.com/pr/library/2016/06/13Apple-Previews-iOS-10-The-Biggest-iOS-Release-Ever.html|website=Apple|access-date=16 June 2016}}</ref>
*2017: Microsoft, for telemetry in Windows.<ref name="DpWinTelemetry">[https://www.microsoft.com/en-us/research/publication/collecting-telemetry-data-privately/ Collecting telemetry data privately] by Bolin Ding, Jana Kulkarni, Sergey Yekhanin. NIPS 2017.</ref>
==References==

<references/>
<br>