STEM and Everyday Tech: 30 English Readings (5)
Differential Privacy: Sharing Data Without Exposing People
Governments and hospitals often publish statistics—like disease rates or income averages—to guide policy.
But releasing raw data risks re-identification of individuals through cross-referencing with other public sources.
Differential privacy solves this by adding carefully calibrated mathematical noise to each query result.
For example, when counting infected patients in a town, the system might add or subtract a few people at random (on the order of ±3) from the published count.
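A minimal sketch of this idea in Python, using the Laplace mechanism (the standard way to calibrate such noise; the function names and the town's count of 127 are illustrative, not from the article):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = 0.0
    while u == 0.0:          # avoid log(0) at the edge of the interval
        u = random.random()
    u -= 0.5                 # uniform in (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query changes by at most 1 when any one person is added
    # or removed (sensitivity 1), so Laplace(1/epsilon) noise gives
    # epsilon-differential privacy.
    return true_count + laplace_noise(1.0 / epsilon)

# Example: publish a town's infection count of 127 with epsilon = 0.5.
noisy = private_count(127, epsilon=0.5)
```

Each published number is a little off, but averaged over many queries the noise cancels, so trends survive.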
This small uncertainty hides any single person’s status while preserving overall trends.
Crucially, attackers cannot tell whether a specific individual was included in the dataset.
The privacy guarantee holds even if they know everything else about everyone else.
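This guarantee is usually written formally as ε-differential privacy (the Greek letter ε, the "privacy budget", is standard notation not named in the article): a mechanism M satisfies it when, for any two datasets D and D' differing in one person and any set of outputs S,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
```

Smaller ε means more noise and a stronger guarantee, since the two probabilities must be nearly indistinguishable.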
Apple and Google use it to collect typing habits and location patterns without storing personal logs.
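Apple and Google deploy more elaborate local-differential-privacy protocols, but the classic randomized-response trick shows the principle: each device perturbs its own answer before anything leaves the phone. A hypothetical sketch:

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the true answer half the time, a fair coin flip otherwise.

    Any single report is plausibly deniable, yet the randomization
    bias is known, so the aggregate rate can still be recovered.
    """
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_rate(reports: list) -> float:
    # Pr[report True] = 0.5 * p + 0.25, so invert the known bias:
    frac = sum(reports) / len(reports)
    return (frac - 0.25) / 0.5

# Simulate 50,000 users, 30% of whom truly have the sensitive trait.
reports = [randomized_response(random.random() < 0.3) for _ in range(50_000)]
estimated = estimate_rate(reports)
```

The server never learns any individual's true answer, only a noisy report, yet the estimated population rate comes out close to 30%.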
Regulators now require differential privacy for many public health and census datasets.
It proves strong privacy and useful data can coexist—if designed with math, not just good intentions.