
Presentation

WPEC 2024 Talk 2b3: Practical performance of CKKS and encrypted training and inference for classification

September 25, 2024

Presenters

Junbum Shin - CryptoLab, South Korea
Damien Stehlé - CryptoLab, France

Description

Fully Homomorphic Encryption (FHE) is one of the core technologies in Privacy-Enhancing Cryptography. Its applicability encompasses a broad range of functionalities (PSI, PIR, privacy-preserving AI, threshold cryptography, etc.). Unlike hardware-based solutions such as Trusted Execution Environments (TEEs), FHE offers cryptographic security with a much smaller attack surface. However, it is sometimes dismissed as computationally too heavy for practical deployment.

In this presentation, we first highlight the concrete performance of the CKKS FHE scheme (Cheon, Kim, Kim and Song, Asiacrypt '17) when implemented on central and graphics processing units (CPUs and GPUs). CKKS natively enables approximate computation on complex and real numbers and can also be used for exact computation (Drucker, Moshkowitz, Pelleg, Shaul; J. Cryptol. '24). The strong performance of CKKS enables practical solutions for numerous privacy-preserving applications, such as privacy-preserving AI, and makes it possible to homomorphically evaluate massive circuits, such as those arising in large language model inference.

We then focus on the FHE-based approach to privacy-preserving AI outsourcing, with an emphasis on image and text classification. AI services offered by cloud providers make AI accessible by automating model training, but they raise privacy concerns since sensitive data is handled on remote servers. For practical performance, we leverage public transformer encoders, such as the Vision Transformer for images and BERT, MPNET, and E5 for text. Instead of applying homomorphic encryption to the entire model, we protect only the features extracted by these open-source transformers. This approach dramatically accelerates both training and inference. As an example, we showcase a vehicle-classification application: with the FHE-based Vision Transformer pipeline, training takes about 4 minutes and inference about 0.2 seconds, demonstrating the method's practicality. A live demo using AutoFHE (https://autofhe.com) will be shown in the presentation.
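The feature-level protection described above can be illustrated with a minimal Python sketch. This is not the presenters' implementation: `extract_features` is a trivial stand-in for a public transformer encoder (a real system would use a Vision Transformer or BERT-family model), `encrypt` is an identity placeholder marking where CKKS encryption would sit, and the classifier is a simple nearest-centroid model chosen for brevity. The point is the architecture: only extracted features cross the trust boundary, and only the lightweight head operates on encrypted data.

```python
# Schematic sketch of the "encrypt only the features" pipeline.
# All names here are illustrative placeholders, not a real FHE API.

def extract_features(image):
    # Placeholder for a public transformer encoder run in the clear
    # on the client side. Here: a trivial 2-D summary of pixel values.
    return [sum(image) / len(image), max(image) - min(image)]

def encrypt(vec):
    # Identity stand-in for CKKS encryption of the feature vector.
    # In a real deployment this is where ciphertexts are produced.
    return list(vec)

def train_centroids(labeled_images):
    # Server-side training on (notionally encrypted) features.
    # With CKKS, the additions and divisions below would be
    # evaluated homomorphically on ciphertexts.
    sums, counts = {}, {}
    for image, label in labeled_images:
        f = encrypt(extract_features(image))
        if label not in sums:
            sums[label] = [0.0] * len(f)
            counts[label] = 0
        sums[label] = [a + b for a, b in zip(sums[label], f)]
        counts[label] += 1
    return {lbl: [x / counts[lbl] for x in s] for lbl, s in sums.items()}

def classify(image, centroids):
    # Nearest-centroid inference on the encrypted feature vector;
    # in the real protocol, the client decrypts the scores.
    f = encrypt(extract_features(image))
    scores = {lbl: sum((a - b) ** 2 for a, b in zip(f, c))
              for lbl, c in centroids.items()}
    return min(scores, key=scores.get)

data = [([0, 0, 1], "dark"), ([1, 1, 0], "dark"),
        ([8, 9, 9], "bright"), ([9, 8, 8], "bright")]
model = train_centroids(data)
print(classify([9, 9, 7], model))  # → bright
```

Because the encoder is public and runs in the clear, the homomorphic workload shrinks to the small classification head, which is what makes training times of minutes and sub-second inference plausible.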

[Slides]

Presented at

WPEC 2024: NIST Workshop on Privacy-Enhancing Cryptography 2024. Virtual, 2024-Sep-24–26.

Event Details

Location

    Virtual

Related Topics

Security and Privacy: cryptography

Created September 19, 2024, Updated September 26, 2024