In this letter, we propose a multi-task over-the-air federated learning (MOAFL) framework, where multiple learning tasks share edge devices for data collection and model training under the coordination of an edge server (ES). Specifically, the model updates of all the tasks are transmitted and superimposed concurrently over a non-orthogonal uplink channel via over-the-air computation, and the aggregation results of all the tasks are reconstructed at the ES through an extended version of the turbo compressed sensing algorithm. Both the convergence analysis and numerical results demonstrate that the MOAFL framework can significantly reduce the uplink bandwidth consumption of multiple tasks without incurring substantial learning performance degradation.
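To make the over-the-air aggregation idea concrete, the following is a minimal NumPy sketch of analog superposition at the ES. It models a single task over an ideal channel with additive noise and no compression; the concurrent multi-task superposition and the extended turbo compressed sensing reconstruction described above are not implemented, and all variable names and parameter values (K, D, NOISE_STD) are illustrative assumptions rather than the letter's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10            # edge devices sharing the non-orthogonal uplink (assumed)
D = 10_000        # model dimension (assumed)
NOISE_STD = 0.05  # receiver noise standard deviation (assumed)

# Hypothetical local model updates (e.g., SGD deltas) from each device.
local_updates = rng.normal(size=(K, D))

# Over-the-air computation: every device transmits its update in the same
# time-frequency resource, so the wireless channel superimposes the signals
# and the ES receives a noisy sum in a single shot.
received = local_updates.sum(axis=0) + NOISE_STD * rng.normal(size=D)

# The ES divides by K to obtain a noisy estimate of the averaged aggregate.
aggregate_hat = received / K
aggregate_true = local_updates.mean(axis=0)

nmse = np.linalg.norm(aggregate_hat - aggregate_true) ** 2 \
       / np.linalg.norm(aggregate_true) ** 2
print(f"aggregation NMSE due to channel noise: {nmse:.2e}")
```

The key bandwidth saving comes from the fact that the superposition uses one shared channel resource regardless of the number of devices (and, in MOAFL, of tasks), at the cost of a noisy aggregate.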
With the aim of integrating over-the-air federated learning (AirFL) and non-orthogonal multiple access (NOMA) into an on-demand universal framework, this paper proposes a novel reconfigurable intelligent surface (RIS)-aided hybrid network by leveraging …
A new machine learning (ML) technique termed federated learning (FL) aims to preserve data at the edge devices and to only exchange ML model parameters in the learning process. FL not only reduces the communication needs but also helps to protect …
Machine learning and wireless communication technologies are jointly facilitating an intelligent edge, where federated edge learning (FEEL) is a promising training framework. As wireless devices involved in FEEL are resource-limited in terms of communication …
Many problems in machine learning rely on multi-task learning (MTL), in which the goal is to solve multiple related machine learning tasks simultaneously. MTL is particularly relevant for privacy-sensitive applications in areas such as healthcare, fi…
Federated multi-task learning (FMTL) has emerged as a natural choice to capture the statistical diversity among the clients in federated learning. To unleash the potential of FMTL beyond statistical diversity, we formulate a new FMTL problem FedU using …