In this paper, we study the problem of privacy-preserving data sharing in which only a subset of the records in a database is sensitive, possibly as determined by predefined privacy policies. Existing solutions, such as differential privacy (DP), are overly pessimistic and treat all information as sensitive. Alternative techniques, like access control and personalized differential privacy, reveal all non-sensitive records truthfully, but in doing so they indirectly leak information about the sensitive records through exclusion attacks. Motivated by the limitations of prior work, we introduce the notion of one-sided differential privacy (OSDP). We formalize the exclusion attack and show how OSDP protects against it. OSDP offers differential-privacy-like guarantees, but only to the sensitive records, and it allows the truthful release of a subset of the non-sensitive records. This truthful sample can support applications that must output true data, and it is well suited for publishing complex types of data, e.g., trajectories. Although some non-sensitive records are suppressed to avoid exclusion attacks, our experiments show that this suppression incurs only a small loss in utility in most cases. Additionally, we present a recipe for turning DP mechanisms for answering counting queries into OSDP techniques for the same task. Our OSDP algorithms leverage the presence of non-sensitive records and offer up to a 25x improvement in accuracy over state-of-the-art DP solutions.
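To make the guarantee concrete, the following is one natural formalization consistent with the abstract; the paper's exact definition may differ in detail. Under a policy $P$ that labels each record sensitive or non-sensitive, a mechanism $M$ satisfies $\epsilon$-OSDP if, for every pair of datasets $(D, D')$ where $D'$ is obtained from $D$ by replacing a single record that $P$ labels sensitive with an arbitrary record, and for every set $S$ of outputs,

\[
\Pr[M(D) \in S] \le e^{\epsilon} \cdot \Pr[M(D') \in S].
\]

The guarantee is one-sided because the bound is required only in the direction starting from the dataset that contains the sensitive record; non-sensitive records receive no such protection, which is what permits their truthful release.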
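The Python sketch below illustrates the simplest mechanism of this kind: suppress every sensitive record and release each non-sensitive record independently with probability $p$. Choosing $p \le 1 - e^{-\epsilon}$ suffices for $\epsilon$-OSDP under the definition above, since the probability that any given record is absent from the output is then at least $e^{-\epsilon}$, so replacing a sensitive record changes the probability of any output by at most a factor of $e^{\epsilon}$. The function name, record format, and policy here are illustrative assumptions, not taken from the paper.

```python
import math
import random

def osdp_sample_release(records, is_sensitive, epsilon, rng=None):
    """Illustrative one-sided sampling release (a sketch, not the paper's full algorithm).

    Sensitive records are always suppressed; each non-sensitive record is
    released independently with probability p = 1 - exp(-epsilon). This is the
    largest sampling rate for which the sketch satisfies epsilon-OSDP: any
    single record is omitted with probability at least exp(-epsilon), so
    replacing a sensitive record shifts the probability of any output by at
    most a factor of exp(epsilon).
    """
    rng = rng or random.Random()
    p = 1.0 - math.exp(-epsilon)
    return [r for r in records if not is_sensitive(r) and rng.random() < p]

# Usage: release trajectory records under a policy flagging one user as sensitive.
records = [("alice", "home->work"), ("bob", "home->clinic"), ("carol", "home->gym")]
flagged = {"bob"}
print(osdp_sample_release(records, lambda r: r[0] in flagged, epsilon=1.0))
```

At $\epsilon = 1$ this releases each non-sensitive record with probability about 0.63; larger $\epsilon$ values permit releasing a larger fraction of the truthful sample, which is the suppression-versus-utility trade-off the abstract's experiments quantify.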