Date: Monday, January 27, 2020.
Place: NIST Gaithersburg, Administrative Building (101), Lecture room B.
Featured topics: public randomness and auditability; differential privacy; census data; fake videos.
Structure: Four talks related to privacy and cryptography.
Notes:
* The initially shared agenda listed a different 2nd talk (10:45am–11:30am), but that speaker could not attend; the actual (and impromptu) 2nd talk was given by Simson Garfinkel from the U.S. Census Bureau.
* A previous agenda listed the title "Differential Privacy at the US Census Bureau: Status Report" for the 3rd talk (11:45am–12:30pm), but the speaker adapted the presentation and showed a different slide-deck.
The schedule and abstracts are also available in a PDF document.
About STPPA: The "Special Topics on Privacy and Public Auditability" series hosts talks on various interconnected topics related to privacy and public auditability. The goal is to convey basic technical background, incite curiosity, suggest research questions and discuss applications.
Selected Presentations (January 27, 2020):

10:15 AM
"Randomness Beacons as Enablers of Public Auditability", Luís Brandão (NIST)
[Presentation]
10:45 AM
"De-Identification and Differential Privacy", Simson Garfinkel (U.S. Census Bureau)
[Presentation]
11:45 AM
"Deploying differential privacy for the 2020 census of population and housing", Simson Garfinkel (U.S. Census Bureau)
Abstract: The privacy risks associated with the publication of official statistics have increased significantly over the last decade, fueled by the proliferation of detailed, third-party data sources and technological advances that make re-identification of individuals in public data releases increasingly easy. This presentation will discuss the U.S. Census Bureau’s research into these emerging privacy threats, and the agency’s efforts to modernize its disclosure avoidance methods using differential privacy to ensure the confidentiality of individuals’ data. Differential privacy offers significant advantages over traditional disclosure avoidance methods for safeguarding individuals’ privacy, but the implementation of differential privacy at scale for the 2020 Decennial Census has also posed a number of challenges. This presentation will explore these challenges and discuss the lessons learned from this initiative.
Note: the presentation started with a vote to decide which slide-deck to cover: "Differential Privacy at the US Census Bureau: Status Report" (the previously advertised title) or "Deploying differential privacy for the 2020 census of population and housing" (the latter was selected).
[Presentation]
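For readers new to the topic, the sketch below illustrates the basic building block the abstract refers to: releasing a count under differential privacy via the Laplace mechanism, with noise calibrated to the query's sensitivity and a privacy-loss budget epsilon. This is only an illustrative sketch of the general technique, not the Census Bureau's actual disclosure avoidance system; the function names, parameters, and example values are hypothetical.

```python
import random

def laplace_sample(scale: float) -> float:
    """Sample Laplace(0, scale) as the difference of two i.i.d. exponentials with mean `scale`."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy (Laplace mechanism).

    Adding or removing one individual changes a count by at most `sensitivity`,
    so noise of scale sensitivity/epsilon masks any single individual's presence.
    """
    return true_count + laplace_sample(sensitivity / epsilon)

# Hypothetical example: a block-level population count released with epsilon = 0.5.
print(round(dp_count(true_count=1234, epsilon=0.5)))
```

As the abstract notes, a deployment at census scale involves much more than this single step, e.g., consistency post-processing across geographic levels and careful accounting of the overall privacy-loss budget.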
2:00 PM
"What math and physics can do to combat fake videos", Charles Bennett (IBM)
Abstract: Progress in artificial intelligence has made it easy to produce “Deep Fake” videos that are so realistic that even experts have trouble identifying them. Such videos then spread virally, due to people’s susceptibility to content that appeals to their prejudices or fears, especially when forwarded by friends with whom they correspond regularly. It would seem that the hard sciences can do little to mitigate this problem, which has so much to do with psychology and human nature. But math and physics can be a significant part of the solution, by establishing, in a hard-to-fake way, a video’s time and place of origin and the fact that it has not been subsequently altered. An ordinary internet-connected smartphone can be used to make rather hard-to-fake videos, and, with the help of public randomness beacons, very hard-to-fake ones whose authenticity can be verified without needing to trust either the maker of the video or any centralized authority. A more serious problem is content that spreads virally despite containing no evidence at all of its provenance. Trusted open-source client-side scanning software and differential privacy techniques may offer a way to flag rapidly-spreading items for subsequent fact-checking without seriously compromising social media users’ privacy or freedom of speech.
[Presentation]
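To make the beacon idea in the last abstract concrete, here is a minimal, hedged sketch: recording a recent public-beacon value into (or alongside) a video shows the video was made after that value was released, and publishing a hash of the result promptly bounds how late it could have been altered. The endpoint URL and JSON field names (pulse, outputValue, timeStamp) are assumptions based on the NIST Randomness Beacon version 2.0 interface, and the sketch omits the signing, publication, and verification steps a real design would need.

```python
import hashlib
import json
import urllib.request

# Assumed URL of the NIST Randomness Beacon 2.0 "latest pulse" endpoint.
BEACON_LATEST = "https://beacon.nist.gov/beacon/2.0/pulse/last"

def latest_beacon_pulse() -> dict:
    """Fetch the most recent public randomness pulse (field names assumed from the 2.0 API)."""
    with urllib.request.urlopen(BEACON_LATEST) as resp:
        return json.load(resp)["pulse"]

def commit_video(video_bytes: bytes) -> dict:
    """Bind a recording to a beacon pulse.

    Displaying or encoding pulse["outputValue"] in the recording shows it was made
    *after* the pulse was released (freshness); publishing the commitment hash soon
    afterwards (e.g., to a timestamping service) bounds how *late* it could have
    been altered.
    """
    pulse = latest_beacon_pulse()
    commitment = hashlib.sha256(
        pulse["outputValue"].encode("ascii") + video_bytes
    ).hexdigest()
    return {
        "pulse_timeStamp": pulse["timeStamp"],
        "pulse_outputValue": pulse["outputValue"],
        "commitment_sha256": commitment,
    }

# Hypothetical usage with placeholder bytes standing in for a real video file.
if __name__ == "__main__":
    print(commit_video(b"raw video bytes go here"))
```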
Starts: January 27, 2020 - 10:00 AM EST
Ends: January 27, 2020 - 03:00 PM EST
Format: In-person
Type: Other
Attendance Type: Open to public
Audience Type: Industry, Government, Academia, Other
Security and Privacy: cryptography, privacy