Probabilistic relational reasoning for differential privacy

Differential privacy is a notion of confidentiality that protects the privacy of individuals while allowing useful computations on their private data. Deriving differential privacy guarantees for real programs is a difficult and error-prone task that calls for principled approaches and tool support. Approaches based on linear types and static analysis have recently emerged; however, an increasing number of programs achieve privacy using techniques that cannot be analyzed by these approaches. Examples include programs that aim for weaker, approximate differential privacy guarantees, programs that use the Exponential mechanism, and randomized programs that achieve differential privacy without using any standard mechanism. Providing support for reasoning about the privacy of such programs has been an open problem. We report on CertiPriv, a machine-checked framework for reasoning about differential privacy built on top of the Coq proof assistant. The central component of CertiPriv is a ...
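As background (not from the paper itself), the exponential mechanism mentioned in the abstract selects an output from a set of candidates with probability proportional to exp(eps * score / (2 * sensitivity)). A minimal Python sketch under that standard definition, where the dataset, candidate set, scoring function, and sensitivity bound are hypothetical inputs supplied by the caller:

    import math
    import random

    def exponential_mechanism(data, candidates, score, sensitivity, eps):
        # Weight each candidate by exp(eps * score / (2 * sensitivity)), then
        # sample one candidate with probability proportional to its weight.
        # If `sensitivity` bounds how much score(data, c) can change between
        # neighboring datasets, this selection is eps-differentially private.
        weights = [math.exp(eps * score(data, c) / (2.0 * sensitivity))
                   for c in candidates]
        return random.choices(candidates, weights=weights, k=1)[0]

For numerical stability a real implementation would typically subtract the maximum score before exponentiating; the sketch omits that detail.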
Type: Conference paper
Year: 2012
Where: POPL
Authors: Gilles Barthe, Boris Köpf, Federico Olmedo, Santiago Zanella Béguelin