On-line Access: 2024-12-30

Received: 2024-05-08

Revision Accepted: 2024-11-21

Citations:  Bibtex RefMan EndNote GB/T7714

Frontiers of Information Technology & Electronic Engineering 

Accepted manuscript available online (unedited version)


Efficient privacy-preserving scheme for secure neural network inference


Author(s):  Liquan CHEN, Zixuan YANG, Peng ZHANG, Yang MA

Affiliation(s):  School of Cyber Science and Engineering, Southeast University, Nanjing 210096, China

Corresponding email(s):  Lqchen@seu.edu.cn

Key Words:  Secure neural network inference; Convolutional neural network; Privacy-preserving; Homomorphic encryption; Secret sharing



Liquan CHEN, Zixuan YANG, Peng ZHANG, Yang MA. Efficient privacy-preserving scheme for secure neural network inference[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2400371

@article{FITEE2400371,
title="Efficient privacy-preserving scheme for secure neural network inference",
author="Liquan CHEN, Zixuan YANG, Peng ZHANG, Yang MA",
journal="Frontiers of Information Technology & Electronic Engineering",
year="in press",
publisher="Zhejiang University Press & Springer",
doi="https://doi.org/10.1631/FITEE.2400371"
}

%0 Journal Article
%T Efficient privacy-preserving scheme for secure neural network inference
%A Liquan CHEN
%A Zixuan YANG
%A Peng ZHANG
%A Yang MA
%J Frontiers of Information Technology & Electronic Engineering
%P
%@ 2095-9184
%D in press
%I Zhejiang University Press & Springer
doi="https://doi.org/10.1631/FITEE.2400371"

TY - JOUR
T1 - Efficient privacy-preserving scheme for secure neural network inference
A1 - Liquan CHEN
A1 - Zixuan YANG
A1 - Peng ZHANG
A1 - Yang MA
JO - Frontiers of Information Technology & Electronic Engineering
SP -
EP -
SN - 2095-9184
Y1 - in press
PB - Zhejiang University Press & Springer
DO - https://doi.org/10.1631/FITEE.2400371
ER -


Abstract: 
The increasing adoption of smart devices and cloud services, combined with the limited computing and storage resources of local devices, prompts many users to transmit private data to cloud servers for processing. However, transmitting sensitive data in plaintext raises concerns about user privacy and security. To address these issues, this study proposes an efficient privacy-preserving secure neural network inference scheme based on homomorphic encryption and secure multi-party computation, which ensures the privacy of both the user and the cloud server while enabling fast and accurate ciphertext inference. First, we divide the inference process into three stages: a merging stage that adjusts the network structure, a preprocessing stage that performs the homomorphic computations, and an online stage that performs floating-point operations on additive secret shares of the private data. Second, we propose an approach for merging network parameters, which reduces the required multiplicative depth and the number of ciphertext-plaintext multiplication and addition operations. Finally, we propose a fast convolution algorithm to enhance computational efficiency. Compared with existing schemes, ours reduces the linear operation time in the online stage by at least 11%, significantly lowering inference time and communication overhead.
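For illustration, the minimal sketch below shows how additive secret sharing of private data interacts with a linear layer, the kind of floating-point operation performed in the online stage. The helper names (share, reconstruct) and the use of NumPy are assumptions made for exposition only; the sketch does not reproduce the paper's actual protocol, homomorphic preprocessing, or parameter-merging step.

import numpy as np

def share(x, rng):
    # Split a private tensor x into two additive shares: x = s_user + s_server.
    r = rng.standard_normal(x.shape)      # random mask
    return x - r, r                       # (user's share, server's share)

def reconstruct(s0, s1):
    # Recover the secret by adding the two shares.
    return s0 + s1

rng = np.random.default_rng(0)
x = rng.standard_normal(8)                # user's private input vector
W = rng.standard_normal((4, 8))           # model weights (plaintext in this sketch)

s_user, s_server = share(x, rng)          # each party holds only one share of x
# A linear layer commutes with additive sharing: each party applies W to its
# own share, and the sum of the partial results equals W @ x.
y = reconstruct(W @ s_user, W @ s_server)
assert np.allclose(y, W @ x)

Because linear operations distribute over the shares, each party can evaluate them locally on its own share; this is the property that keeps the linear operations in the online stage inexpensive.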
