
Special Topics on Privacy and Public Auditability — Event 1

STPPA Event #1:

Date: Monday, January 27, 2020.

Place: NIST Gaithersburg, Administrative Building (101), Lecture room B.

Featured topics: public randomness and auditability; differential privacy; census data; fake videos.

Structure: Four talks related to privacy and cryptography.

  • 10:00–10:15: Introductory remarks. René Peralta (NIST)
  • 10:15–10:45: Randomness beacons as enablers of public auditability. Luís Brandão (NIST). Slides and video.
  • 10:45–11:30:* De-Identification and Differential Privacy. Simson Garfinkel (U.S. Census Bureau). Slides and video.
  • 11:30–11:45: Break
  • 11:45–12:30:** Differential privacy and the 2020 Census. Simson Garfinkel (U.S. Census Bureau). Slides and video.
  • 12:30–14:00: Lunch
  • 14:00–15:00: What math and physics can do to combat fake videos. Charles Bennett (IBM). Video.
  • 15:00–15:30: Closing remarks

Notes:

* The initially shared agenda listed a different 2nd talk (10:45am–11:30am), but that speaker could not attend; the actual (and impromptu) 2nd talk was given by Simson Garfinkel from the U.S. Census Bureau.

** A previous agenda listed the title "Differential Privacy at the US Census Bureau: Status Report" for the 3rd talk (11:45–12:30), but the speaker adapted the presentation and showed a different slide deck.

The schedule and abstracts are also available in a PDF document.

About STPPA: The "Special Topics on Privacy and Public Auditability" series hosts talks on various interconnected topics related to privacy and public auditability. The goal is to convey basic technical background, spark curiosity, suggest research questions, and discuss applications.

Selected Presentations
January 27, 2020
10:15 AM Randomness Beacons as Enablers of Public Auditability
Luís Brandão - NIST


10:45 AM De-Identification and Differential Privacy
Simson Garfinkel - U.S. Census Bureau

Talk on De-Identification and Differential Privacy

11:45 AM Deploying differential privacy for the 2020 census of population and housing
Simson Garfinkel - U.S. Census Bureau

The privacy risks associated with the publication of official statistics have increased significantly over the last decade, fueled by the proliferation of detailed, third-party data sources and technological advances that make re-identification of individuals in public data releases increasingly easy. This presentation will discuss the U.S. Census Bureau’s research into these emerging privacy threats, and the agency’s efforts to modernize its disclosure avoidance methods using differential privacy to ensure the confidentiality of individuals’ data. Differential privacy offers significant advantages over traditional disclosure avoidance methods for safeguarding individuals’ privacy, but the implementation of differential privacy at scale for the 2020 Decennial Census has also posed a number of challenges. This presentation will explore these challenges and discuss the lessons learned from this initiative.

* Note: the presentation started with a vote to decide which slide-deck to cover: "Differential Privacy at the US Census Bureau: Status Report" (previously advertised title) or "Deploying differential privacy for the 2020 census of population and housing" (this was selected).
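The disclosure-avoidance idea in the abstract above can be illustrated with the simplest differentially private primitive: the Laplace mechanism applied to a counting query. The sketch below is for illustration only; the toy household data, the predicate, and the epsilon value are assumptions, and the Census Bureau's actual production system (the TopDown Algorithm) is far more elaborate.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private count: true count plus Laplace(1/epsilon) noise.
    A counting query has sensitivity 1 (one person's record changes the
    count by at most 1), so the noise scale is 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy example: noisy count of households with more than 3 residents
household_sizes = [1, 2, 4, 3, 5, 2, 6, 1]
noisy = dp_count(household_sizes, lambda s: s > 3, epsilon=0.5)
```

Smaller epsilon gives stronger privacy but noisier counts; choosing epsilon for the 2020 Census was one of the policy challenges the talk discusses.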

 

2:00 PM What math and physics can do to combat fake videos
Charles Bennett - IBM

Abstract: Progress in artificial intelligence has made it easy to produce “Deep Fake” videos so realistic that even experts have trouble identifying them. Such videos spread virally because people are susceptible to content that appeals to their prejudices or fears, especially when it is forwarded by friends with whom they correspond regularly. It would seem that the hard sciences can do little to mitigate this problem, which has so much to do with psychology and human nature. But math and physics can be a significant part of the solution, by establishing in a hard-to-fake way a video’s time and place of origin, and that it has not been subsequently altered. An ordinary internet-connected smartphone can be used to make rather hard-to-fake videos, and, with the help of public randomness beacons, very hard-to-fake ones whose authenticity can be verified without needing to trust either the maker of the video or any centralized authority. A more serious problem is content that spreads virally despite containing no evidence at all of its provenance. Trusted open-source client-side scanning software and differential privacy techniques may offer a way to flag rapidly spreading items for subsequent fact-checking without seriously compromising social media users’ privacy or freedom of speech.
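The beacon-based anchoring idea in the abstract can be sketched in a few lines: a recording that incorporates a beacon pulse cannot predate the pulse's release, and publishing the recording's hash afterwards bounds its creation time from the other side. This is a minimal illustration, not the speaker's actual scheme; the pulse bytes and video content below are placeholders rather than real NIST Beacon output.

```python
import hashlib

def bind_to_beacon(video_bytes: bytes, beacon_value: bytes) -> str:
    """Hash a video together with a public beacon pulse.

    The beacon value is unpredictable before its release, so a recording
    that visibly/internally incorporates it cannot have been produced
    earlier. Publishing this digest soon afterwards (e.g., in a public,
    beacon-anchored log) shows the video also existed by that later time.
    """
    return hashlib.sha256(beacon_value + video_bytes).hexdigest()

# Placeholder pulse and video content, for illustration only
beacon_pulse = bytes.fromhex("a3f1" * 16)  # stand-in for a 32-byte pulse
video = b"...raw video frames..."
digest = bind_to_beacon(video, beacon_pulse)
```

Verification needs no trusted authority: anyone can fetch the cited beacon pulse, recompute the hash over the video, and check both time bounds.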


Event Details

Starts: January 27, 2020 - 10:00 AM EST
Ends: January 27, 2020 - 03:00 PM EST
Monday, January 27, 2020 @ NIST Gaithersburg (https://www.nist.gov/about-nist/visit), Administrative Building (101), Lecture room B

Format: In-person
Type: Other

Attendance Type: Open to public
Audience Type: Industry, Government, Academia, Other

Parent Project

See: Privacy-Enhancing Cryptography

Related Topics

Security and Privacy: cryptography, privacy

Created April 05, 2021, Updated June 30, 2021