January 2025
IEEE Transactions on Information Forensics and Security
Federated graph attention networks (FGATs) are gaining prominence for enabling collaborative, privacy-preserving graph model training. Their attention mechanisms sharpen the focus on crucial graph features, improving graph representation learning while keeping data decentralized. However, these mechanisms inherently process sensitive information and are therefore vulnerable to privacy threats such as graph reconstruction and attribute inference. Moreover, because attention assigns varying and changing importance to nodes, traditional privacy methods struggle to balance privacy and utility across nodes of differing sensitivity. Our study fills this gap by proposing an efficient privacy-preserving FGAT (PFGAT). We present an attention-based dynamic differential privacy (DP) approach built on an improved multiplication triplet (IMT). Specifically, we first propose an IMT mechanism that leverages a reusable triplet generation method to compute the attention mechanism efficiently and securely. Second, we employ an attention-based privacy budget that dynamically adjusts privacy levels according to the significance of each node's data, optimizing the privacy-utility trade-off. Third, a hybrid neighbor aggregation algorithm tailors DP mechanisms to the characteristics of each neighbor node, mitigating the adverse impact of DP on graph attention network (GAT) utility. Extensive experiments on benchmark datasets confirm that PFGAT maintains high efficiency and provides robust privacy protection against potential attacks.
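The abstract does not spell out the IMT construction, but "multiplication triplet" refers to the classic Beaver-triple primitive for secure two-party multiplication, which the paper's reusable variant builds on. As a minimal sketch (trusted-dealer triple generation; all function names and the modulus choice are illustrative assumptions, not the paper's protocol):

```python
import secrets

P = 2**61 - 1  # prime modulus for additive secret sharing (assumed parameter)

def share(v):
    """Split value v into two additive shares mod P."""
    r = secrets.randbelow(P)
    return r, (v - r) % P

def make_triple():
    """Dealer-generated Beaver triple shares (a, b, c = a*b) for two parties."""
    a, b = secrets.randbelow(P), secrets.randbelow(P)
    c = (a * b) % P
    # Returns ((a0, b0, c0), (a1, b1, c1)): one triple share per party.
    return tuple(zip(share(a), share(b), share(c)))

def beaver_multiply(x_shares, y_shares, triple_shares):
    """Compute shares of x*y from shares of x and y using one Beaver triple.

    Each party masks its input shares with its triple shares; the masked
    differences d = x - a and e = y - b are opened publicly, after which
    each party locally derives an additive share of x*y.
    """
    (a0, b0, c0), (a1, b1, c1) = triple_shares
    (x0, x1), (y0, y1) = x_shares, y_shares

    d = (x0 - a0 + x1 - a1) % P  # d = x - a, safe to reveal
    e = (y0 - b0 + y1 - b1) % P  # e = y - b, safe to reveal

    z0 = (c0 + d * b0 + e * a0 + d * e) % P  # only party 0 adds the d*e term
    z1 = (c1 + d * b1 + e * a1) % P
    return z0, z1
```

Recombining the output shares recovers the product: with x = 7 and y = 6, `(z0 + z1) % P` equals 42, while neither party alone learns x, y, or x*y. The paper's contribution, per the abstract, is making such triplets reusable so that the attention computation amortizes generation cost.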
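The attention-based dynamic budget can be pictured as splitting a total DP budget across neighbors in proportion to their attention weights, so that more influential nodes receive a larger per-node budget and hence less noise. This is a hypothetical sketch of that idea using Laplace noise; the function names, the proportional allocation rule, and the unit sensitivity are assumptions, not the paper's exact mechanism:

```python
import numpy as np

def attention_scaled_budgets(attn_weights, total_budget):
    """Allocate a total DP budget across neighbors in proportion to their
    attention weights (illustrative allocation rule)."""
    w = np.asarray(attn_weights, dtype=float)
    w = w / w.sum()
    return total_budget * w

def noisy_aggregate(features, attn_weights, total_budget, sensitivity=1.0):
    """Attention-weighted neighbor aggregation with per-node Laplace noise.

    Each neighbor's features are perturbed with noise calibrated to that
    node's share of the budget, then combined by attention weights.
    """
    eps = attention_scaled_budgets(attn_weights, total_budget)
    X = np.asarray(features, dtype=float)
    # Laplace scale = sensitivity / epsilon: larger budget -> smaller noise.
    noisy = X + np.random.laplace(0.0, sensitivity / eps[:, None], size=X.shape)
    w = np.asarray(attn_weights, dtype=float)
    w = w / w.sum()
    return w @ noisy  # weighted sum of noise-perturbed neighbor features
```

A high-attention neighbor thus contributes both a larger weight and a less noisy feature vector, which is one way to read the abstract's claim of optimizing the privacy-utility trade-off per node.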