Fed-CAD: Federated Learning with Correlation-aware Adaptive Local Differential Privacy

Bingzhu Zhu, Shan Chang, Guanghao Liang, Hongzi Zhu and Jie Xu

in Proceedings of IEEE/ACM IWQoS 2024, Guangzhou, China. (Best Student Paper Award)

Federated Learning (FL) enables multiple participants to collaboratively train a globally shared model without explicitly sharing their data. However, prior research indicates that the local model updates released during federated training can also jeopardize participants' privacy. To address this issue, local differential privacy (LDP) mechanisms have been applied to FL systems. LDP provides privacy protection with rigorous mathematical guarantees by introducing random perturbations, e.g., Gaussian noise, to the released updates; however, excessive noise compromises the utility of those updates. In this paper, we propose Fed-CAD, a novel Correlation-aware Adaptive LDP mechanism for FL, which reduces the required noise scale by leveraging the temporal correlation between consecutive local model updates of the same participant, without increasing the privacy budget (risk). We theoretically prove that Fed-CAD satisfies (ϵ, σ²)-LDP as long as the difference between consecutive local models is smaller than the differential bound, and we analyze the noise variance as a metric of utility. We implement Fed-CAD on image classification FL tasks, and experimental results demonstrate that it significantly outperforms the one-shot LDP baseline.
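The abstract does not spell out the mechanism, so the sketch below is only a minimal, hypothetical illustration of the general idea of correlation-aware noise calibration (perturbing the small difference between consecutive updates instead of the full update), not the paper's actual Fed-CAD algorithm. All function names and parameters (one_shot_ldp, correlation_aware_ldp, diff_bound, sigma) are invented for illustration.

```python
import numpy as np

# Illustrative toy only -- NOT the authors' Fed-CAD algorithm.
# It contrasts a one-shot Gaussian LDP release with a hypothetical
# correlation-aware release that perturbs the inter-round difference.

def one_shot_ldp(update, sensitivity, sigma):
    """Baseline: perturb each released update independently, with
    Gaussian noise scaled to the full sensitivity of the update."""
    return update + np.random.normal(0.0, sigma * sensitivity, update.shape)

def correlation_aware_ldp(update, prev_noisy, sensitivity, diff_bound, sigma):
    """Hypothetical correlation-aware variant: when this round's update
    is close to the previously released one (difference within a
    differential bound), perturb only the small difference and add it
    to the previous noisy release, so fresh noise scales with
    diff_bound rather than the full sensitivity."""
    if prev_noisy is not None:
        delta = update - prev_noisy
        if np.linalg.norm(delta) <= diff_bound:
            noisy_delta = delta + np.random.normal(
                0.0, sigma * diff_bound, update.shape)
            return prev_noisy + noisy_delta
    # First round, or difference too large: fall back to one-shot LDP.
    return one_shot_ldp(update, sensitivity, sigma)
```

Because consecutive local models of the same participant tend to be strongly correlated, diff_bound is typically much smaller than the full sensitivity in this toy setting, which is the intuition behind the reduced noise scale described in the abstract.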
