
STEM and Everyday Tech: 30 English Readings (5)

Differential Privacy: Sharing Data Without Exposing People


  1. Governments and hospitals often publish statistics—like disease rates or income averages—to guide policy.
  2. But releasing raw data risks re-identifying individuals when it is cross-referenced with other public sources.
  3. Differential privacy solves this by adding carefully calibrated mathematical noise to each query result.
  4. For example, when counting infected patients in a town, the system might add random noise of a few people (say, ±3) to the result.
  5. This small uncertainty hides any single person’s status while preserving overall trends.
  6. Crucially, attackers cannot tell whether a specific individual was included in the dataset.
  7. The privacy guarantee holds even if they know everything else about everyone else.
  8. Apple and Google use it to collect typing habits and location patterns without storing personal logs.
  9. Regulators now require differential privacy for many public health and census datasets.
  10. It proves strong privacy and useful data can coexist—if designed with math, not just good intentions.
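The noisy counting query described in sentences 3–5 is usually implemented with the Laplace mechanism. Below is a minimal sketch, assuming a counting query (whose sensitivity is 1, since adding or removing one person changes the true count by at most 1); the function names, the town dataset, and the choice of epsilon are illustrative, not from the original article.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Return a count with Laplace noise calibrated for epsilon-DP.

    A counting query has sensitivity 1, so noise drawn from
    Laplace(0, 1/epsilon) satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: count infected patients in a town of 1,000 records.
town = [{"infected": random.random() < 0.05} for _ in range(1000)]
noisy = private_count(town, lambda r: r["infected"], epsilon=0.5)
```

A smaller epsilon means a larger noise scale and stronger privacy but a less accurate count; this is the privacy–utility trade-off the article's last sentence alludes to.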
