SHAPER: A General Architecture for Privacy-Preserving Primitives in Secure Machine Learning

Ziyuan Liang, Qi'ao Jin, Zhiyong Wang, Zhaohui Chen, Zhen Gu, Yanheng Lu, Fan Zhang

IACR Trans. Cryptogr. Hardw. Embed. Syst. (2024)

Abstract
Secure multi-party computation and homomorphic encryption are the two primary security primitives in privacy-preserving machine learning, but their wide adoption is constrained by computation and network communication overheads. This paper proposes a hybrid Secret-sharing and Homomorphic encryption Architecture for Privacy-presERving machine learning (SHAPER). SHAPER protects sensitive data in encrypted or randomly shared domains instead of relying on a trusted third party. The proposed algorithm-protocol-hardware co-design methodology explores techniques such as plaintext Single Instruction Multiple Data (SIMD) processing and fine-grained scheduling to minimize end-to-end latency across various network settings. SHAPER also accelerates secure-domain computing and the conversion between mainstream privacy-preserving primitives, making it suitable for both general and distinctive data characteristics. SHAPER is evaluated by FPGA prototyping with a comprehensive hyper-parameter exploration, demonstrating a 94x speed-up over CPU clusters on large-scale logistic regression training tasks.
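The additive secret sharing that the abstract pairs with homomorphic encryption can be illustrated with a minimal sketch (this is a generic two-party example, not SHAPER's actual protocol; the modulus `Q` and the function names are illustrative assumptions):

```python
import secrets

# Illustrative ring modulus; real systems typically share over Z_{2^64} or a large prime.
Q = 2**61 - 1

def share(x, n=2):
    """Split integer x into n additive shares such that x = sum(shares) mod Q.
    Any n-1 shares alone are uniformly random and reveal nothing about x."""
    shares = [secrets.randbelow(Q) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    """Recombine all shares to recover the secret."""
    return sum(shares) % Q

def add_shares(a_shares, b_shares):
    """Secure addition: each party adds its local shares, with no communication.
    The result is a valid sharing of (a + b) mod Q."""
    return [(a + b) % Q for a, b in zip(a_shares, b_shares)]
```

Linear operations such as this stay entirely local to each party; it is the non-linear steps (e.g. multiplications, comparisons) that incur the communication overhead the paper targets, motivating conversion to the homomorphic-encryption domain when the network is the bottleneck.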
Keywords
Privacy-Preserving Machine Learning, Multi-Party Computation, Additive Homomorphic Encryption, Hardware Accelerator