
Academic Journal of Computing & Information Science, 2023, 6(11); doi: 10.25236/AJCIS.2023.061113.

Personalized Federated Learning with Attention Mechanisms

Author(s)

Siyuan Zhang, Guiquan Liu

Corresponding Author:
Siyuan Zhang
Affiliation(s)

School of Cyber Security, University of Science and Technology of China, Hefei, China

Abstract

With the rapid development of science and technology, machine learning and deep learning are finding ever more applications in daily life. At the same time, people are paying increasing attention to the protection of their personal privacy, and in recent years many countries have introduced laws and regulations to safeguard data privacy. Under these constraints, the traditional approach of pooling all users' data for model training is no longer feasible: each user's data must remain in that user's own hands, which has a profound impact on the development of artificial intelligence. To resolve this dilemma, the industry proposed federated learning. However, because the data held by different users are imbalanced, the predictive ability of the globally aggregated model on each user's local data still needs improvement, so personalization is an issue that federated learning must address. To improve personalization, this paper proposes a federated transfer learning algorithm with an attention mechanism. After each user receives the globally aggregated model, an attention module is added to the local model; during local training the parameters of the low-level neurons are frozen, and only the parameters of the high-level neurons and the attention module are updated. In this way, each user ultimately obtains a unique model better suited to its local data. We conduct experiments comparing the accuracy of this algorithm with the federated transfer learning algorithm and the federated learning algorithm. The results show that, on a convolutional neural network, the federated transfer learning algorithm with the attention mechanism achieves higher accuracy than both the federated transfer algorithm and the federated learning algorithm.
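The procedure described above can be sketched in code. This is a minimal NumPy illustration, not the paper's actual implementation: it assumes a toy two-layer model with a sigmoid feature-gating module standing in for the attention mechanism, FedAvg-style weighted averaging for the global aggregation, and numerical-gradient descent for local fine-tuning. The names (`fedavg`, `personalize`, `w_low`, `w_att`, `w_high`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fedavg(client_params, client_sizes):
    # Weighted average of the clients' parameter dicts (FedAvg-style aggregation).
    total = sum(client_sizes)
    return {k: sum(p[k] * (n / total) for p, n in zip(client_params, client_sizes))
            for k in client_params[0]}

def forward(x, params):
    # Low-level layer (frozen during personalization), ReLU activation.
    h = np.maximum(x @ params["w_low"], 0.0)
    # Toy attention module: sigmoid gates reweight the hidden features.
    gate = 1.0 / (1.0 + np.exp(-(h @ params["w_att"])))
    h = h * gate
    # High-level layer (updated locally).
    return h @ params["w_high"]

def mse(params, x, y):
    return float(np.mean((forward(x, params) - y) ** 2))

def personalize(global_params, x, y, trainable=("w_att", "w_high"),
                lr=0.05, steps=40, eps=1e-4):
    # Local fine-tuning: gradient descent (central differences, since the model
    # is tiny) on the attention module and high-level layer only; the low-level
    # layer is never touched, i.e. it stays frozen.
    params = {k: v.copy() for k, v in global_params.items()}
    for _ in range(steps):
        for k in trainable:
            grad = np.zeros_like(params[k])
            for idx in np.ndindex(params[k].shape):
                old = params[k][idx]
                params[k][idx] = old + eps
                up = mse(params, x, y)
                params[k][idx] = old - eps
                down = mse(params, x, y)
                params[k][idx] = old
                grad[idx] = (up - down) / (2 * eps)
            params[k] -= lr * grad
    return params

def make_params():
    return {"w_low": 0.5 * rng.normal(size=(4, 5)),
            "w_att": 0.5 * rng.normal(size=(5, 5)),
            "w_high": 0.5 * rng.normal(size=(5, 2))}

# Two clients contribute to the global model; one then personalizes it locally.
global_params = fedavg([make_params(), make_params()], client_sizes=[60, 40])
x_local = rng.normal(size=(16, 4))
y_local = rng.normal(size=(16, 2))

before = mse(global_params, x_local, y_local)
personal = personalize(global_params, x_local, y_local)
after = mse(personal, x_local, y_local)
print(f"global-model loss {before:.4f} -> personalized loss {after:.4f}")
```

The point of the sketch is the split of responsibilities: aggregation averages everyone's parameters, while personalization only moves the attention and high-level parameters, so the shared low-level representation from federated training is preserved while the model adapts to the user's local data distribution.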

Keywords

Federated Learning, Transfer Learning, Attention Mechanisms

Cite This Paper

Siyuan Zhang, Guiquan Liu. Personalized Federated Learning with Attention Mechanisms. Academic Journal of Computing & Information Science (2023), Vol. 6, Issue 11: 95-101. https://doi.org/10.25236/AJCIS.2023.061113.
