TY - GEN
T1 - Confidential Inference in Decision Trees
AU - Karn, Rupesh Raj
AU - Gebremichael, Mizan
AU - Nawaz, Kashif
AU - Elfadel, Ibrahim M.
N1 - Publisher Copyright:
© IFIP International Federation for Information Processing 2024.
PY - 2024
Y1 - 2024
N2 - In confidential computing, arithmetic algorithms operate on encrypted inputs to produce encrypted outputs. Specifically, in confidential inference, Alice has the parameters of the machine-learning model but does not want to reveal them to Bob, who has the data. Bob wants to use Alice’s model for inference, but does not want to reveal his data. Alice and Bob agree to use confidential computing to run the inference engine without revealing either the model or the data. However, they find that fully homomorphic and order-preserving encryptions are very time-consuming and very challenging to accelerate on hardware. When the machine learning model is a decision tree, these encryptions can be made computationally efficient and can even be readily accelerated on hardware. In this paper, we reveal how Alice and Bob run the inference engine of a decision tree in full confidence and show FPGA implementations of additively homomorphic, order-preserving, and post-quantum order-preserving encryption on constrained hardware platforms. We further evaluate the resources needed to implement the ciphertext decision tree and compare them with those of a plaintext decision tree. Confidential inference tests are run on the encrypted FPGA design using the MNIST data set.
KW - Combinational Circuit
KW - Finite-State Machine
KW - FPGA Implementation
KW - Homomorphic Encryption
KW - Order-Preserving Encryption
KW - Post-Quantum Cryptosystem
KW - Sequential Circuit
UR - https://www.scopus.com/pages/publications/85219187195
U2 - 10.1007/978-3-031-70947-0_14
DO - 10.1007/978-3-031-70947-0_14
M3 - Conference contribution
AN - SCOPUS:85219187195
SN - 9783031709463
T3 - IFIP Advances in Information and Communication Technology
SP - 273
EP - 297
BT - VLSI-SoC 2023
A2 - Elfadel, Ibrahim (Abe) M.
A2 - Albasha, Lutfi
PB - Springer Science and Business Media Deutschland GmbH
T2 - 31st IFIP WG 10.5/IEEE International Conference on Very Large Scale Integration - System on a Chip, VLSI-SoC 2023
Y2 - 16 October 2023 through 18 October 2023
ER -