
Keyword Search Results

[Keyword] differential privacy (7 hits)

1-7 of 7 hits
  • Privacy-Preserving Correlation Coefficient

    Tomoaki MIMOTO  Hiroyuki YOKOYAMA  Toru NAKAMURA  Takamasa ISOHARA  Masayuki HASHIMOTO  Ryosuke KOJIMA  Aki HASEGAWA  Yasushi OKUNO  

     
    PAPER
    Publicized: 2023/02/08 | Vol: E106-D No:5 | Page(s): 868-876

    Differential privacy is a confidentiality metric that quantitatively guarantees the privacy of individuals. A noise criterion, called sensitivity, must be calculated when constructing a probabilistic disturbance mechanism that satisfies differential privacy. Depending on the statistical process, the sensitivity may be very large or even impossible to compute, so the resulting mechanism may have very low utility, or it may be impossible to construct directly at all. In this paper, we first discuss situations in which sensitivity is difficult to calculate, and then propose a variant of differential privacy that adds dummy data as a countermeasure. When the sensitivity under conventional differential privacy is calculable, a mechanism that satisfies the proposed metric also satisfies conventional differential privacy, and the relationship between the respective privacy parameters can be evaluated. Next, as a case study of a statistical process whose sensitivity is difficult to calculate, we derive the sensitivity of the correlation coefficient and propose a probabilistic disturbance mechanism that satisfies the proposed metric. Finally, we experimentally evaluate the noise required by the proposed and direct methods, and show that privacy-preserving correlation coefficients can be derived with less noise than direct methods require.
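
    The core obstacle above is that the Laplace mechanism's noise scale is sensitivity/ε, so it cannot be calibrated for a statistic whose sensitivity is unknown. Below is a minimal sketch of the standard Laplace mechanism (not the paper's dummy-data variant; the bounded-mean example and its bounds are our own assumptions):

    ```python
    import numpy as np

    def laplace_mechanism(value, sensitivity, epsilon, rng):
        """Release `value` with Laplace noise of scale sensitivity/epsilon,
        the standard construction for epsilon-differential privacy."""
        return value + rng.laplace(0.0, sensitivity / epsilon)

    rng = np.random.default_rng(0)
    data = np.clip(rng.normal(0.5, 0.2, size=100), 0.0, 1.0)

    # Easy case: the mean of n values bounded in [0, 1] has global
    # sensitivity 1/n, so the noise scale is straightforward.
    private_mean = laplace_mechanism(data.mean(), 1.0 / len(data), 1.0, rng)

    # Hard case: no comparably simple bound exists for the correlation
    # coefficient, which is the situation the paper addresses.
    ```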

  • Geo-Graph-Indistinguishability: Location Privacy on Road Networks with Differential Privacy

    Shun TAKAGI  Yang CAO  Yasuhito ASANO  Masatoshi YOSHIKAWA  

     
    PAPER
    Publicized: 2023/01/16 | Vol: E106-D No:5 | Page(s): 877-894

    In recent years, concern about location privacy has been growing with the spread of location-based services (LBSs). Many methods for protecting location privacy have been proposed over the past decades. In particular, perturbation methods based on Geo-Indistinguishability (GeoI), which randomly perturb a true location into a pseudolocation, are attracting attention because of the strong privacy guarantee they inherit from differential privacy. However, GeoI is defined on the Euclidean plane, even though many LBSs (e.g., ride-sharing services) operate on road networks. This mismatch causes unnecessary noise and thus an unsatisfactory tradeoff between utility and privacy for LBSs on road networks. To address this issue, we propose a new privacy notion, Geo-Graph-Indistinguishability (GeoGI), for locations on a road network, which achieves a better tradeoff. We then propose the Graph-Exponential Mechanism (GEM), which satisfies GeoGI, and formalize the optimization problem of finding the GEM that is optimal with respect to the tradeoff. Because the computational complexity of a naive method for finding the optimal solution is prohibitive, we propose a greedy algorithm that finds an approximate solution in an acceptable amount of time. Finally, our experiments show that the proposed mechanism outperforms GeoI mechanisms, including the optimal GeoI mechanism, with respect to the tradeoff.
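
    As a rough illustration of the exponential-mechanism idea on a road network (a toy sketch only; the paper's GEM and the GeoGI definition are more involved, and the four-node graph here is hypothetical), one can sample a pseudolocation with probability decaying exponentially in shortest-path distance from the true location:

    ```python
    import heapq
    import math
    import random

    def shortest_paths(graph, src):
        """Dijkstra over a weighted adjacency dict {u: [(v, w), ...]}."""
        dist = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, math.inf):
                continue
            for v, w in graph[u]:
                nd = d + w
                if nd < dist.get(v, math.inf):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    def graph_exponential_mechanism(graph, true_node, epsilon):
        """Sample a pseudolocation v with Pr[v] proportional to
        exp(-epsilon * d(true_node, v) / 2), where d is the
        shortest-path distance on the road network."""
        dist = shortest_paths(graph, true_node)
        nodes = list(dist)
        weights = [math.exp(-epsilon * dist[v] / 2.0) for v in nodes]
        return random.choices(nodes, weights=weights, k=1)[0]

    # Toy road network: four intersections with edge lengths.
    road = {
        "a": [("b", 1.0), ("c", 2.0)],
        "b": [("a", 1.0), ("d", 1.0)],
        "c": [("a", 2.0), ("d", 1.0)],
        "d": [("b", 1.0), ("c", 1.0)],
    }
    print(graph_exponential_mechanism(road, "a", epsilon=1.0))
    ```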

  • Locally Differentially Private Minimum Finding

    Kazuto FUKUCHI  Chia-Mu YU  Jun SAKUMA  

     
    PAPER-Artificial Intelligence, Data Mining
    Publicized: 2022/05/11 | Vol: E105-D No:8 | Page(s): 1418-1430

    We investigate the problem of finding the minimum: each user holds a real value, and we want to estimate the minimum of these values under the local differential privacy constraint. We reveal that this problem is fundamentally difficult, in that no consistent mechanism can be constructed in the worst case. Instead of considering the worst case, we aim to construct a private mechanism whose error rate adapts to how easy the minimum is to estimate. As a measure of easiness, we introduce a parameter α that characterizes the fatness of the minimum-side tail of the user data distribution. We show that our mechanism achieves $O((\ln^6 N/\epsilon^2 N)^{1/2\alpha})$ error without knowledge of α, and that this rate is near-optimal in the sense that any mechanism incurs $\Omega((1/\epsilon^2 N)^{1/2\alpha})$ error. Furthermore, we demonstrate through empirical evaluations on synthetic datasets that our mechanism outperforms a naive mechanism. We also conduct experiments on the MovieLens dataset and a purchase history dataset, demonstrating that our algorithm achieves $\tilde{O}((1/N)^{1/2\alpha})$ error adaptively to α.
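
    For intuition about why the worst case is hopeless, the sketch below shows a naive local baseline (our own illustration, not the paper's mechanism): each user perturbs their value independently, and the minimum of the noisy reports drifts far below the true minimum because it tracks the most negative of N Laplace draws, whose magnitude grows like log N.

    ```python
    import numpy as np

    def ldp_reports(values, lo, hi, epsilon, rng):
        """Each user clips their value to [lo, hi] and adds Laplace noise
        with sensitivity (hi - lo): a standard local-DP perturbation."""
        scale = (hi - lo) / epsilon
        return np.clip(values, lo, hi) + rng.laplace(0.0, scale, size=len(values))

    rng = np.random.default_rng(42)
    values = rng.uniform(0.2, 1.0, size=100_000)  # true minimum near 0.2
    reports = ldp_reports(values, 0.0, 1.0, epsilon=1.0, rng=rng)

    # The naive estimate min(reports) is badly biased downward.
    print(values.min(), reports.min())
    ```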

  • Differentially Private Neural Networks with Bounded Activation Function

    Kijung JUNG  Hyukki LEE  Yon Dohn CHUNG  

     
    LETTER-Artificial Intelligence, Data Mining
    Publicized: 2021/03/18 | Vol: E104-D No:6 | Page(s): 905-908

    Deep learning has shown outstanding performance in various fields and is increasingly deployed in privacy-critical domains. If the sensitive data underlying a deep learning model are exposed, serious privacy threats can result. To protect individual privacy, we propose a novel activation function and a stochastic gradient descent procedure for applying differential privacy to deep learning. Our experiments show that the proposed method effectively protects privacy and performs better than previous approaches.
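
    The letter's own construction is not detailed in the abstract, so the sketch below shows only generic DP-SGD (per-example gradient clipping plus Gaussian noise, applied to a logistic model of our own choosing), the usual template for making stochastic gradient descent differentially private:

    ```python
    import numpy as np

    def dp_sgd_step(w, X_batch, y_batch, lr, clip_norm, noise_mult, rng):
        """One DP-SGD step: clip each per-example gradient to L2 norm
        `clip_norm`, sum the clipped gradients, add Gaussian noise of
        standard deviation noise_mult * clip_norm, and average."""
        clipped = []
        for x, y in zip(X_batch, y_batch):
            pred = 1.0 / (1.0 + np.exp(-x @ w))   # logistic prediction
            g = (pred - y) * x                     # per-example gradient
            g_norm = np.linalg.norm(g)
            clipped.append(g / max(1.0, g_norm / clip_norm))
        g_sum = np.sum(clipped, axis=0)
        noise = rng.normal(0.0, noise_mult * clip_norm, size=w.shape)
        return w - lr * (g_sum + noise) / len(X_batch)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(32, 5))
    y = (X[:, 0] > 0).astype(float)
    w = np.zeros(5)
    for _ in range(100):
        w = dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_mult=1.1, rng=rng)
    ```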

  • Sparsity Reduction Technique Using Grouping Method for Matrix Factorization in Differentially Private Recommendation Systems

    Taewhan KIM  Kangsoo JUNG  Seog PARK  

     
    PAPER-Artificial Intelligence, Data Mining
    Publicized: 2020/04/01 | Vol: E103-D No:7 | Page(s): 1683-1692

    Web service users are overwhelmed by the amount of information presented to them and have difficulty finding the information they need. A recommendation system that predicts users' tastes is therefore essential to the success of such businesses. However, recommendation systems require users' personal information, which can lead to serious privacy violations. To solve this problem, much research has been conducted on protecting personal information in recommendation systems, for example by applying differential privacy, a privacy protection technique that inserts noise into the original data. However, previous studies overlooked two factors when applying differential privacy to recommendation systems. First, they did not consider the sparsity of user rating information. The total number of items far exceeds the number of items any one user rates, so a rating matrix over users and items is very sparse, and this makes it difficult to identify user patterns in the matrix. The sparsity issue should therefore be considered when applying differential privacy to recommendation systems. Second, previous studies focused on protecting user rating information but not the lists of items each user has rated. Recommendation systems should protect these item lists as well, because they also disclose user preferences. In this study, we propose a differentially private recommendation scheme based on a grouping method that addresses the sparsity issue and protects both user-rated item lists and user rating information. On real movie rating data, the proposed technique shows better performance and privacy protection than an existing technique.
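
    The abstract does not spell out the grouping method, so the toy sketch below only illustrates the general idea of densifying a sparse rating matrix by aggregating items into groups (the grouping, matrix, and 0-means-unrated convention are hypothetical):

    ```python
    import numpy as np

    def group_ratings(R, item_groups):
        """Collapse a sparse user-by-item rating matrix R (0 = unrated)
        into a denser user-by-group matrix by averaging rated entries."""
        G = np.zeros((R.shape[0], len(item_groups)))
        for g, items in enumerate(item_groups):
            block = R[:, items]
            counts = (block > 0).sum(axis=1)
            sums = block.sum(axis=1)
            G[:, g] = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
        return G

    # Toy example: 3 users x 6 items, grouped into 3 item pairs.
    R = np.array([[5, 0, 0, 3, 0, 0],
                  [0, 4, 0, 0, 0, 2],
                  [1, 0, 5, 0, 4, 0]], dtype=float)
    print(group_ratings(R, [[0, 1], [2, 3], [4, 5]]))
    ```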

  • Input and Output Privacy-Preserving Linear Regression

    Yoshinori AONO  Takuya HAYASHI  Le Trieu PHONG  Lihua WANG  

     
    PAPER-Privacy, anonymity, and fundamental theory
    Publicized: 2017/07/21 | Vol: E100-D No:10 | Page(s): 2339-2347

    We build a privacy-preserving linear regression system that protects both input data secrecy and output privacy. Our system achieves these goals simultaneously via a novel combination of homomorphic encryption and differential privacy dedicated to linear regression and its variants (ridge, LASSO). The system is shown to scale across cloud servers, and its efficiency is confirmed by extensive experiments.
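
    The homomorphic-encryption half does not fit in a short sketch, so the fragment below shows only an output-perturbation flavor of differentially private ridge regression (our own simplification with an assumed coefficient sensitivity; the paper's dedicated construction differs):

    ```python
    import numpy as np

    def private_ridge(X, y, lam, epsilon, coef_sensitivity, rng):
        """Solve ridge regression via the regularized normal equations,
        then perturb the coefficients with Laplace noise calibrated to
        an assumed L1 sensitivity of the coefficient vector."""
        d = X.shape[1]
        w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
        return w + rng.laplace(0.0, coef_sensitivity / epsilon, size=d)

    rng = np.random.default_rng(7)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)
    print(private_ridge(X, y, lam=1.0, epsilon=1.0, coef_sensitivity=0.05, rng=rng))
    ```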

  • Differentially Private Real-Time Data Publishing over Infinite Trajectory Streams

    Yang CAO  Masatoshi YOSHIKAWA  

     
    PAPER-Data Engineering, Web Information Systems
    Publicized: 2015/10/06 | Vol: E99-D No:1 | Page(s): 163-175

    Recent mobile and wearable technologies make it easy to collect personal spatiotemporal data, such as activity trajectories, in daily life. Publishing real-time statistics over trajectory streams produced by crowds of people is expected to be valuable for both academia and business, answering questions such as “How many people are in Kyoto Station now?” However, analyzing these raw data entails the risk of compromising individual privacy. ε-Differential privacy has emerged as a well-known standard for private statistics publishing because its guarantee is rigorous and mathematically provable. However, because user trajectories are generated indefinitely, it is difficult to protect every trajectory under ε-differential privacy. On the other hand, in real life not all users require the same level of privacy. To this end, we propose a flexible privacy model, l-trajectory privacy, which ensures that every trajectory of a desired length is protected under ε-differential privacy. We also design an algorithmic framework to publish l-trajectory private data in real time. Experiments on four real-life datasets show that our proposed algorithms are effective and efficient.
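
    For intuition about spending ε over an unbounded stream, the sketch below uses uniform budget division over a sliding window, in the style of w-event privacy; this is not the authors' l-trajectory framework, and the counts are invented. Each release spends ε/window, so any window of that many consecutive releases consumes at most ε.

    ```python
    import numpy as np

    def windowed_stream_release(counts, epsilon, window, rng):
        """Publish a noisy count at each timestamp, spending epsilon/window
        per release (Laplace scale window/epsilon for sensitivity 1), so any
        `window` consecutive releases use at most epsilon in total."""
        scale = window / epsilon
        return [c + rng.laplace(0.0, scale) for c in counts]

    rng = np.random.default_rng(1)
    true_counts = [120, 135, 150, 90, 60]  # e.g., people in a station per minute
    print(windowed_stream_release(true_counts, epsilon=1.0, window=3, rng=rng))
    ```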