Document Information
Abstract
Recommendation systems are information-filtering systems that help users deal with information overload. Unfortunately, current recommendation systems prompt serious privacy concerns. In this work, we propose an architecture that enables users to enhance their privacy in those systems that profile users on the basis of the items rated. Our approach capitalizes on a conceptually simple perturbative technique, namely the suppression of ratings. In our scenario, users rate those items they have an opinion on. However, in order to avoid being accurately profiled, they may want to refrain from rating certain items. Consequently, this technique protects user privacy to a certain extent, but at the cost of a degradation in the accuracy of the recommendation. We measure privacy risk as the Kullback-Leibler divergence between the user's and the population's rating distribution, a privacy criterion that we proposed in previous work. The justification of such a criterion is our second contribution. Concretely, we thoroughly interpret it by elaborating on the intimate connection between the celebrated method of entropy maximization and the use of entropies and divergences as measures of privacy. The ultimate purpose of this justification is to attempt to bridge the gap between the privacy and the information-theoretic communities by substantially adapting some technicalities of our original work to reach a wider audience, not intimately familiar with information theory and the method of types. Lastly, we present a formulation of the optimal trade-off between privacy and suppression rate, which allows us to formally specify one of the functional blocks of the proposed architecture.
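The privacy metric described above — the KL divergence between a user's rating distribution and the population's — can be illustrated with a small sketch. The code below is not the authors' implementation; it is a minimal, hypothetical example assuming a user profile represented as a histogram of rating counts over item categories, from which some categories' ratings are suppressed and the remainder renormalized before measuring the divergence.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Terms with p_i = 0 contribute zero by convention; q is assumed
    strictly positive wherever p is positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def privacy_risk_after_suppression(user_counts, population_dist, suppressed):
    """Zero out the suppressed categories in the user's rating histogram,
    renormalize, and return the KL divergence from the population's
    rating distribution (the privacy risk in this model)."""
    kept = np.array([0.0 if i in suppressed else c
                     for i, c in enumerate(user_counts)])
    user_dist = kept / kept.sum()
    return kl_divergence(user_dist, population_dist)

# Hypothetical 4-category example: a uniform population distribution
# versus a user whose ratings concentrate heavily on category 0.
population = [0.25, 0.25, 0.25, 0.25]
user = [8, 1, 1, 0]  # raw rating counts per category

risk_full = privacy_risk_after_suppression(user, population, set())
risk_suppressed = privacy_risk_after_suppression(user, population, {0})

# Suppressing ratings in the dominant category brings the user's
# (renormalized) profile closer to the population's, lowering the risk.
assert risk_suppressed < risk_full
```

This mirrors the trade-off the paper formalizes: suppressing ratings reduces the divergence (more privacy), but each suppressed rating is information withheld from the recommender (less accuracy).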
Table of Contents
1 Introduction
1.1 Contribution and Plan of this Paper
2 State of the Art
2.1 Privacy-Enhancing Mechanisms
2.2 Privacy Criteria
3 Statistical and Information-Theoretic Preliminaries
4 Privacy Protection in Recommendation Systems via the Suppression of Ratings
4.1 User Profile
4.2 Adversarial Model
4.3 Privacy Metric
4.4 Architecture
5 Justification of Entropy and Divergence as Measures of Privacy
5.1 Rationale behind the Maximum Entropy Method
5.2 Measuring the Privacy of User Profiles
6 Formulation of the Trade-Off between Privacy and Suppression Rate
7 Concluding Remarks
Acknowledgments
References
