In this paper, we study incentive mechanism design for real-time data aggregation, which underlies a large spectrum of crowdsensing applications. Despite extensive studies on static incentive mechanisms, none of them is applicable to real-time data aggregation because they cannot maintain PUs' long-term participation. We emphasize that, to maintain PUs' long-term participation, it is of significant importance to protect their privacy as well as to provide them with a desirable cumulative compensation. Thus motivated, in this paper we propose LEPA, an efficient incentive mechanism to stimulate long-term participation in real-time data aggregation. Specifically, we allow PUs to preserve their privacy by reporting noisy data, whose impact on the aggregation accuracy is quantified with proper privacy and accuracy measures. Then, we provide a framework that jointly optimizes the incentive schemes in different time slots to ensure a desirable cumulative compensation for PUs, thereby preventing PUs from leaving the system halfway. Considering PUs' strategic behaviors and the combinatorial nature of the sensing tasks, we propose a computationally efficient on-line auction with close-to-optimal performance despite the NP-hardness of winner selection. We further show that the proposed on-line auction satisfies the desirable properties of truthfulness and individual rationality. The performance of LEPA is validated by both theoretical analysis and extensive simulations.
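The noisy-reporting idea can be illustrated with a small sketch. The snippet below is a hedged illustration, not LEPA's exact mechanism: it assumes each PU perturbs its measurement with Laplace noise (a common choice for differential privacy) before reporting, and it measures the resulting aggregation error at the fusion center. The function names, the epsilon value, and the use of a simple mean aggregator are illustrative assumptions.

```python
# A minimal sketch (not the paper's exact mechanism): each participating user (PU)
# perturbs its reading with Laplace noise before reporting, and the aggregator's
# error quantifies the accuracy cost of that privacy protection.
import numpy as np

def noisy_report(value, sensitivity=1.0, epsilon=0.5, rng=np.random.default_rng()):
    """Report a Laplace-perturbed value; smaller epsilon means stronger privacy."""
    scale = sensitivity / epsilon
    return value + rng.laplace(0.0, scale)

def aggregate(reports):
    """Aggregator side: simple average of the noisy reports."""
    return float(np.mean(reports))

# Example: 100 PUs sense the same ground truth; averaging many independently
# perturbed reports keeps the estimate close to the truth, illustrating the
# privacy/accuracy trade-off referred to in the abstract.
truth = 25.0
reports = [noisy_report(truth, epsilon=0.5) for _ in range(100)]
estimate = aggregate(reports)
print(f"estimate={estimate:.3f}, abs error={abs(estimate - truth):.3f}")
```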
Incentive mechanisms play a critical role in privacy-aware crowdsensing. Most previous studies on the co-design of incentive mechanisms and privacy preservation assume a trustworthy fusion center (FC). Very recent work has taken steps to relax the assumption
Mobile crowdsensing (MCS) is an emerging sensing data collection paradigm with scalability, low deployment cost, and distributed characteristics. Traditional MCS systems suffer from privacy concerns and a lack of fair reward distribution. Moreover, existing pri
Recently, Google and 24 other institutions proposed a series of open challenges for federated learning (FL), which include application expansion and homomorphic encryption (HE). The former aims to expand the applicable machine learning models of
We study a problem of privacy-preserving mechanism design. A data collector wants to obtain data from individuals to perform some computations. To mitigate the privacy threat to the contributors, the data collector adopts a privacy-preserving mechanism
Federated learning (FL) enables multiple data-owning parties to train models collaboratively without sharing their data. Given the privacy regulations on patients' healthcare data, learning-based systems in healthcare can greatly benefit from
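As a hedged sketch of the FL training pattern described above (not any specific system from this abstract), the following minimal federated-averaging loop trains a linear model: each party runs gradient steps on its private data and only model parameters are exchanged. The function names, learning rate, number of rounds, and toy data are illustrative assumptions.

```python
# A minimal FedAvg-style sketch, assuming a linear model trained by local
# least-squares gradient steps; raw data never leaves a client.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: data-size-weighted average of client models."""
    sizes = np.asarray(client_sizes, dtype=float)
    return np.average(np.stack(client_weights), axis=0, weights=sizes)

# Toy run with two clients holding disjoint data drawn from the same linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (40, 60):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):  # communication rounds
    updates = [local_update(w, X, y) for X, y in clients]
    w = fed_avg(updates, [len(y) for _, y in clients])
print("learned weights:", w)
```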