
Author Search Result

[Author] Janya SAINUI (2 hits)

  • Direct Approximation of Quadratic Mutual Information and Its Application to Dependence-Maximization Clustering

Janya SAINUI, Masashi SUGIYAMA

    LETTER-Artificial Intelligence, Data Mining
    Vol: E96-D No:10, Page(s): 2282-2285

Mutual information (MI) is a standard measure of statistical dependence between random variables. However, because MI involves the log of a ratio of probability densities, it is sensitive to outliers. The L2-distance variant of MI, called quadratic MI (QMI), tends to be more robust against outliers because QMI is simply the integral of the squared difference between the joint density and the product of the marginals. In this paper, we propose a kernel-based estimator, least-squares QMI (LSQMI), that directly estimates this density difference without estimating each density separately. A notable advantage of LSQMI is that its solution can be computed analytically and efficiently just by solving a system of linear equations. We then apply LSQMI to dependence-maximization clustering and demonstrate its usefulness experimentally.
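
    As a minimal illustration of the definition above: for discrete variables, QMI reduces to a sum of squared differences between the joint distribution and the product of its marginals, QMI(X,Y) = sum over x,y of (P(x,y) - P(x)P(y))^2. The Python sketch below computes this plug-in quantity from a contingency table; it illustrates the measure only and is not the paper's kernel-based LSQMI estimator, which avoids estimating the densities at all.

        import numpy as np

        def qmi_discrete(joint):
            """Plug-in QMI for discrete variables.

            joint: 2-D array of joint probabilities P(x, y) (sums to 1).
            Returns the sum of squared differences between the joint
            distribution and the product of its marginals.
            """
            joint = np.asarray(joint, dtype=float)
            px = joint.sum(axis=1, keepdims=True)   # marginal P(x)
            py = joint.sum(axis=0, keepdims=True)   # marginal P(y)
            diff = joint - px * py                  # density difference
            return float((diff ** 2).sum())

        # Example: a strongly dependent pair gives a large QMI,
        # while an independent pair gives exactly zero.
        dependent = np.array([[0.45, 0.05],
                              [0.05, 0.45]])
        independent = np.outer([0.5, 0.5], [0.5, 0.5])
        print(qmi_discrete(dependent))    # 0.16
        print(qmi_discrete(independent))  # 0.0

    Note that no log or density ratio appears anywhere, which is the source of QMI's robustness to outliers compared with ordinary MI.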

  • Unsupervised Dimension Reduction via Least-Squares Quadratic Mutual Information

Janya SAINUI, Masashi SUGIYAMA

    LETTER-Artificial Intelligence, Data Mining
    Publicized: 2014/07/22
    Vol: E97-D No:10, Page(s): 2806-2809

The goal of dimension reduction is to represent high-dimensional data in a lower-dimensional subspace while preserving the intrinsic properties of the original data as much as possible. An important challenge in unsupervised dimension reduction is the choice of tuning parameters: because no supervised information is available, parameter selection tends to be subjective and heuristic. In this paper, we propose an information-theoretic approach to unsupervised dimension reduction that allows objective tuning parameter selection. We employ quadratic mutual information (QMI) as our information measure, which is known to be less sensitive to outliers than ordinary mutual information, and estimate QMI analytically by a least-squares method in a computationally efficient way. We then provide an efficient eigenvector-based implementation of unsupervised dimension reduction based on this QMI estimator. The usefulness of the proposed method is demonstrated through experiments.
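
    To sketch the eigenvector-based step: once a symmetric criterion matrix has been estimated from the data, the d-dimensional subspace maximizing the criterion under orthonormal projections is spanned by its top d eigenvectors. The Python sketch below uses the sample covariance matrix as a hypothetical stand-in for the QMI-derived matrix (an assumption for illustration; the paper builds its matrix from the LSQMI estimate), so the example degenerates to a PCA-style projection while showing the same eigendecomposition machinery.

        import numpy as np

        def top_subspace(C, d):
            """Return the d eigenvectors of the symmetric matrix C
            with the largest eigenvalues, as columns of a (D, d) matrix."""
            eigvals, eigvecs = np.linalg.eigh(C)     # ascending eigenvalues
            return eigvecs[:, ::-1][:, :d]           # keep the top d

        # Stand-in criterion: the covariance matrix of centered data.
        # In the paper, a QMI-based matrix would take its place.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 5))
        X[:, 0] *= 3.0                               # inflate one direction
        Xc = X - X.mean(axis=0)
        C = Xc.T @ Xc / len(Xc)
        W = top_subspace(C, 2)                       # (5, 2) projection matrix
        Z = Xc @ W                                   # reduced representation
        print(Z.shape)                               # (500, 2)

    Because the subspace comes from a single eigendecomposition, no iterative optimization over projection matrices is needed, which is what makes the eigenvector-based implementation efficient.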