SecBERT: Privacy-preserving pre-training based neural network inference system.
Huang, Hai; Wang, Yongjian.
Affiliation
  • Huang H; Computer School, Zhejiang Sci-Tech University, Hangzhou, 310018, China. Electronic address: haihuang1005@gmail.com.
  • Wang Y; Computer School, Zhejiang Sci-Tech University, Hangzhou, 310018, China.
Neural Netw ; 172: 106135, 2024 Apr.
Article in En | MEDLINE | ID: mdl-38271920
ABSTRACT
Pre-trained models such as BERT have achieved strong results on natural language processing tasks in recent years. In this paper, we investigate privacy-preserving pre-training based neural network inference in a two-server framework built on the additive secret sharing technique. Our protocol allows a resource-constrained client to request two powerful servers to cooperatively process natural language processing tasks without revealing any useful information about its data. We first design a series of secure sub-protocols for the non-linear functions used in the BERT model. These sub-protocols are expected to have broad applications and to be of independent interest. Building on these sub-protocols, we propose SecBERT, a privacy-preserving pre-training based neural network inference protocol. SecBERT is the first cryptographically secure privacy-preserving pre-training based neural network inference protocol. We demonstrate the security, efficiency, and accuracy of the SecBERT protocol through comprehensive theoretical analysis and experiments.
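The abstract describes a two-server framework based on additive secret sharing, in which the client splits its input into shares so that neither server alone learns anything about the data. The sketch below illustrates the general additive secret sharing idea only; it is not the paper's protocol, and the ring size, function names, and two-party addition example are illustrative assumptions.

```python
import secrets

MODULUS = 2 ** 32  # ring size for shares; an assumption, not specified in the abstract

def share(x: int) -> tuple[int, int]:
    """Split x into two additive shares with x = (x0 + x1) mod MODULUS.
    Each share alone is uniformly random, so one server learns nothing."""
    x0 = secrets.randbelow(MODULUS)
    x1 = (x - x0) % MODULUS
    return x0, x1

def reconstruct(x0: int, x1: int) -> int:
    """Recombine both shares to recover the secret value."""
    return (x0 + x1) % MODULUS

def add_shares(a_i: int, b_i: int) -> int:
    """Addition of secret-shared values is local: each server adds its
    own shares, with no communication and nothing revealed."""
    return (a_i + b_i) % MODULUS

# The client shares its inputs and sends one share of each to each server.
a0, a1 = share(20)
b0, b1 = share(22)
# Server 0 computes on (a0, b0); server 1 computes on (a1, b1).
c0, c1 = add_shares(a0, b0), add_shares(a1, b1)
assert reconstruct(c0, c1) == 42
```

Linear operations (additions, and multiplications by public constants) stay local in this way; the non-linear BERT functions the abstract mentions (e.g., activations) are exactly the cases that require the paper's interactive sub-protocols between the two servers.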
Subjects
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Computer Security / Privacy Limits: Humans Language: En Journal: Neural Netw Journal subject: NEUROLOGY Publication year: 2024 Document type: Article
