Date Published: March 2025
Author(s)
Joseph Near (University of Vermont), David Darais (Galois), Naomi Lefkovitz (NIST), Gary Howarth (NIST)
This publication describes differential privacy — a mathematical framework that quantifies privacy loss to entities when their data appears in a dataset. The primary goal of this publication is to help practitioners of all backgrounds better understand how to think about differentially private software solutions. Multiple factors for consideration are identified in a differential privacy pyramid along with several privacy hazards, which are common pitfalls that arise as the mathematical framework of differential privacy is realized in practice.
Keywords
anonymization; data analytics; data privacy; de-identification; differential privacy; privacy; privacy-enhancing technologies