
Feature Selection via ℓ1-Penalized Squared-Loss Mutual Information

Wittawat JITKRITTUM, Hirotaka HACHIYA, Masashi SUGIYAMA

Summary:

Feature selection is a technique for screening out less important features. Many existing supervised feature selection algorithms rely on redundancy and relevancy as the main criteria for selecting features. However, feature interaction, potentially a key characteristic of real-world problems, has not received much attention. To take feature interaction into account, we propose ℓ1-LSMI, an ℓ1-regularization-based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that ℓ1-LSMI performs well in handling redundancy, detecting non-linear dependency, and considering feature interaction.
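To make the selection criterion concrete, the sketch below computes squared-loss mutual information (SMI) for discrete variables with a plug-in estimator. Note this is only an illustration of the SMI quantity itself: the paper's ℓ1-LSMI instead estimates SMI for continuous features via least-squares density-ratio estimation (LSMI) and optimizes ℓ1-penalized feature weights, neither of which is reproduced here.

```python
import numpy as np

def squared_loss_mi(x, y):
    """Empirical squared-loss mutual information between discrete x and y.

    SMI(X, Y) = 1/2 * sum_{x,y} p(x) p(y) * (p(x,y) / (p(x) p(y)) - 1)^2

    SMI is non-negative and equals zero iff X and Y are independent,
    so larger SMI indicates stronger dependency between features and outputs.
    """
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    # Empirical joint distribution p(x, y).
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= joint.sum()
    # Marginals p(x), p(y).
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    # Density ratio p(x,y) / (p(x) p(y)); deviation from 1 measures dependency.
    ratio = joint / (px * py)
    return 0.5 * np.sum(px * py * (ratio - 1.0) ** 2)

# Independent variables give SMI = 0; perfectly dependent ones give SMI > 0.
print(squared_loss_mi([0, 0, 1, 1], [0, 1, 0, 1]))  # → 0.0
print(squared_loss_mi([0, 0, 1, 1], [0, 0, 1, 1]))  # → 0.5
```

A feature selection method built on this criterion would score candidate feature subsets by their estimated SMI with the output; the ℓ1 penalty in ℓ1-LSMI makes that subset search continuous by driving the weights of unhelpful features to exactly zero.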

Publication
IEICE TRANSACTIONS on Information Vol.E96-D No.7 pp.1513-1524
Publication Date
2013/07/01
Online ISSN
1745-1361
DOI
10.1587/transinf.E96.D.1513
Type of Manuscript
PAPER
Category
Pattern Recognition

Authors

Wittawat JITKRITTUM
  Tokyo Institute of Technology
Hirotaka HACHIYA
  Tokyo Institute of Technology
Masashi SUGIYAMA
  Tokyo Institute of Technology
