
Session AI and Data Analysis Ⅱ

Acceleration of Secure Activation Function for Privacy-preserving Neural Network

Abstract


Neural networks are increasingly being used in cloud-based applications, which require users to upload their sensitive data to the cloud server. However, data privacy may be compromised when the server trains or infers a neural network model on plaintext data. To address this privacy issue, many studies have developed privacy-preserving neural networks. Recently, FENet, a privacy-preserving neural network framework based on functional encryption, was proposed by Panzade and Takabi. In this paper, we propose a method to accelerate the secure activation function of FENet. We adopt a precomputation approach to reduce the computational overhead of privacy-preserving matrix multiplication, which is the dominant operation in the secure activation function of FENet. According to our performance analysis, the privacy-preserving matrix multiplication can be performed 3.77 times faster than that of FENet at the cost of an additional 3.49 MB of memory. Since the secure activation function of FENet is applied in both the training and inference phases, the proposed method is expected to accelerate both.
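The abstract describes trading memory for speed via precomputation. As a minimal illustration of that general idea (not FENet's actual construction; the function names and parameters below are hypothetical), the following sketch shows fixed-base modular exponentiation, a classic precomputation technique for the kind of repeated group exponentiations that dominate functional-encryption-based matrix multiplication. A one-time table of squared powers of the base eliminates all squarings at query time:

```python
# Illustrative precomputation sketch: fixed-base exponentiation.
# A one-time table of base^(2^i) mod m trades memory for faster
# repeated exponentiations with the same base. This is a generic
# time/memory trade-off example, not FENet's method.

def precompute_powers(base: int, modulus: int, max_bits: int) -> list[int]:
    """One-time precomputation: table[i] = base^(2^i) mod modulus."""
    table = []
    acc = base % modulus
    for _ in range(max_bits):
        table.append(acc)
        acc = acc * acc % modulus  # repeated squaring
    return table

def fixed_base_pow(table: list[int], exponent: int, modulus: int) -> int:
    """Exponentiation using the precomputed table: only multiplications
    for the set bits of the exponent, no squarings at query time."""
    result = 1
    i = 0
    while exponent:
        if exponent & 1:
            result = result * table[i] % modulus
        exponent >>= 1
        i += 1
    return result
```

With the table built once, every subsequent exponentiation with the same base skips the squaring half of square-and-multiply, at the cost of storing `max_bits` group elements; this mirrors the memory-for-speed trade-off the paper reports.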

Table of Contents

Abstract
I. INTRODUCTION
II. PRELIMINARIES
A. Function-Hiding Inner Product Encryption (FHIPE)
B. Privacy-Preserving Matrix Multiplication using FHIPE
III. EXISTING METHOD: FENET
IV. PROPOSED METHOD
V. CONCLUSION
ACKNOWLEDGMENT
REFERENCES

Author Information

  • Seong-Yun Jeon, Department of Computer Engineering, Inha University
  • Hee-Yong Kwon, Department of Computer Engineering, Inha University
  • Mun-Kyu Lee, Department of Computer Engineering, Inha University
