
Optimizing Privacy-Preserving Outsourced Convolutional Neural Network Predictions



Abstract:

Convolutional neural networks (CNNs) are a popular architecture in machine learning for their predictive power, notably in computer vision and medical image analysis. That predictive power requires extensive computation, which encourages model owners to host the prediction service on a cloud platform. This article proposes a CNN prediction scheme that preserves privacy in the outsourced setting, i.e., the model-hosting server cannot learn the query, the (intermediate) results, or the model. Similar to SecureML (S&P’17), a representative work that provides model privacy, we employ two non-colluding servers with secret sharing and triplet generation to minimize the usage of heavyweight cryptography. We make the following optimizations for both overall latency and accuracy. 1) We adopt asynchronous computation and SIMD for offline triplet generation and parallelizable online computation. 2) Like MiniONN (CCS’17) and its improvement by the generic EzPC compiler (EuroS&P’19), we use a garbled circuit for the non-polynomial ReLU activation to keep the same accuracy as the underlying network (instead of approximating it, as in SecureML prediction). 3) For pooling in CNNs, we employ (linear) average-pooling, which achieves almost the same accuracy as the (non-linear, and hence less efficient) max-pooling used by MiniONN and EzPC. Considering both offline and online costs, our experiments on the MNIST dataset show a latency reduction of 122×, 14.63×, and 36.69× compared to SecureML, MiniONN, and EzPC, and a reduction in communication costs of 1.09×, 36.69×, and 31.32×, respectively. On the CIFAR dataset, our scheme achieves 7.14× and 3.48× lower latency and 13.88× and 77.46× lower communication costs when compared with MiniONN and EzPC, respectively.
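The two-server design in the abstract rests on additive secret sharing with multiplication triplets (Beaver triples): an offline phase produces shared triples, and the online phase multiplies secret-shared values with only cheap local arithmetic and two openings. The following minimal sketch illustrates that standard technique only; it is not the paper's protocol, and the dealer-style `gen_triple`, the modulus `P`, and all function names are assumptions for illustration.

```python
import random

P = 2**61 - 1  # illustrative modulus for additive secret sharing

def share(x):
    """Split x into two additive shares, one per non-colluding server."""
    r = random.randrange(P)
    return r, (x - r) % P

def reconstruct(s0, s1):
    """Recombine the two servers' shares."""
    return (s0 + s1) % P

def gen_triple():
    """Offline phase (dealer stand-in for the paper's triplet generation):
    sample a, b and output shares of a, b, and a*b."""
    a, b = random.randrange(P), random.randrange(P)
    return share(a), share(b), share((a * b) % P)

def beaver_mul(x_sh, y_sh, triple):
    """Online phase: multiply shared x and y using one triple.
    The servers open e = x - a and f = y - b, then combine locally."""
    (a0, a1), (b0, b1), (c0, c1) = triple
    e = reconstruct((x_sh[0] - a0) % P, (x_sh[1] - a1) % P)
    f = reconstruct((y_sh[0] - b0) % P, (y_sh[1] - b1) % P)
    # z_i = c_i + e*b_i + f*a_i, plus the public term e*f on one server
    z0 = (c0 + e * b0 + f * a0 + e * f) % P
    z1 = (c1 + e * b1 + f * a1) % P
    return z0, z1

x_sh, y_sh = share(7), share(6)
z_sh = beaver_mul(x_sh, y_sh, gen_triple())
print(reconstruct(*z_sh))  # 42
```

Because `e` and `f` are masked by the uniformly random `a` and `b`, opening them reveals nothing about `x` or `y`; this is why the heavyweight work can be pushed into the offline triplet-generation phase.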
Published in: IEEE Transactions on Dependable and Secure Computing ( Volume: 19, Issue: 3, 01 May-June 2022)
Page(s): 1592 - 1604
Date of Publication: 09 October 2020


1 Introduction

Machine learning (ML) [1] performs well in many applications and has been widely used. Neural networks, which identify relationships underlying a set of data by mimicking how the human brain operates, have recently gained extensive attention (e.g., [2], [3], [4]). Inspired by multi-layer perceptrons, convolutional neural networks (CNNs) have proven useful in medical image analysis and in image and video recognition.

