TY - GEN
T1 - Securing Decision Tree Inference Using Order-Preserving Cryptography
AU - Karn, Rupesh Raj
AU - Nawaz, Kashif
AU - Elfadel, Ibrahim Abe M.
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - In machine learning (ML) inference, two parties, Alice and Bob, are engaged in a transaction where Alice is the owner of a decision tree model but does not want to reveal its parameters to Bob, who has private data. Bob wants to use Alice's model for inference but does not want to reveal his data. Given the heavy computational cost of fully homomorphic encryption, Alice and Bob agree to use order-preserving encryption (OPE) to run the inference engine in full confidentiality without revealing either the decision tree model or the private data. In this paper, we describe how such an OPE computation is established between Alice and Bob. Specifically, we demonstrate how to ensure full confidentiality in decision tree inference on an FPGA accelerator embodying an OPE protocol. A finite-state machine design of the encrypted decision tree is evaluated for throughput and resource utilization on an Intel Cyclone V FPGA using the MNIST dataset.
AB - In machine learning (ML) inference, two parties, Alice and Bob, are engaged in a transaction where Alice is the owner of a decision tree model but does not want to reveal its parameters to Bob, who has private data. Bob wants to use Alice's model for inference but does not want to reveal his data. Given the heavy computational cost of fully homomorphic encryption, Alice and Bob agree to use order-preserving encryption (OPE) to run the inference engine in full confidentiality without revealing either the decision tree model or the private data. In this paper, we describe how such an OPE computation is established between Alice and Bob. Specifically, we demonstrate how to ensure full confidentiality in decision tree inference on an FPGA accelerator embodying an OPE protocol. A finite-state machine design of the encrypted decision tree is evaluated for throughput and resource utilization on an Intel Cyclone V FPGA using the MNIST dataset.
UR - https://www.scopus.com/pages/publications/85166374802
U2 - 10.1109/AICAS57966.2023.10168588
DO - 10.1109/AICAS57966.2023.10168588
M3 - Conference contribution
AN - SCOPUS:85166374802
T3 - AICAS 2023 - IEEE International Conference on Artificial Intelligence Circuits and Systems, Proceedings
BT - AICAS 2023 - IEEE International Conference on Artificial Intelligence Circuits and Systems, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2023
Y2 - 11 June 2023 through 13 June 2023
ER -