

Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting

Masashi SUGIYAMA


Summary:

Kernel logistic regression (KLR) is a powerful and flexible classification algorithm that can also provide the confidence of its class predictions. However, its training, typically carried out by (quasi-)Newton methods, is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by a log-linear combination of kernel functions, and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically by solving a regularized system of linear equations in a class-wise manner. Thus, LSPC is computationally very efficient and numerically stable. Through experiments, we show that the computation time of LSPC is shorter than that of KLR by two orders of magnitude, with comparable classification accuracy.
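The following is a minimal sketch of the recipe described in the summary: model the class-posterior by a linear combination of kernel functions, fit the coefficients of each class by regularized least squares, and obtain each solution analytically from a regularized linear system. The Gaussian kernel, the choice of all training points as kernel centers, the hyperparameter values, and the function names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np


def gaussian_kernel(A, B, width):
    """K[i, l] = exp(-||a_i - b_l||^2 / (2 * width^2))."""
    sq_dists = (
        np.sum(A ** 2, axis=1)[:, None]
        - 2.0 * A @ B.T
        + np.sum(B ** 2, axis=1)[None, :]
    )
    return np.exp(-sq_dists / (2.0 * width ** 2))


def lspc_fit(X, y, width=1.0, lam=0.1):
    """Fit one coefficient vector per class by a single regularized
    linear solve (the class-wise analytic solution).  Kernel type,
    centers, and hyperparameters are assumptions for illustration."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n = X.shape[0]
    K = gaussian_kernel(X, X, width)        # all training points as centers
    H = K.T @ K / n                         # quadratic term, shared by all classes
    A = H + lam * np.eye(n)                 # regularized system matrix
    classes = np.unique(y)
    Theta = np.empty((classes.size, n))
    for idx, c in enumerate(classes):
        h = K[y == c].sum(axis=0) / n       # class-wise linear term
        Theta[idx] = np.linalg.solve(A, h)  # one linear solve per class
    return classes, Theta, X, width


def lspc_predict_proba(model, X_test):
    """Estimate class-posterior probabilities: negative outputs are
    rounded up to zero, and the rest are normalized across classes."""
    classes, Theta, X_train, width = model
    K = gaussian_kernel(np.asarray(X_test, dtype=float), X_train, width)
    P = np.maximum(K @ Theta.T, 0.0)
    P /= np.maximum(P.sum(axis=1, keepdims=True), 1e-12)
    return classes, P
```

Since the regularized system matrix is shared by all classes in this sketch, it could be factorized once and reused for every class-wise solve; avoiding the iterative (quasi-)Newton updates of KLR in this way is where the claimed training speedup comes from.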

Publication
IEICE TRANSACTIONS on Information Vol.E93-D No.10 pp.2690-2701
Publication Date
2010/10/01
Online ISSN
1745-1361
DOI
10.1587/transinf.E93.D.2690
Type of Manuscript
Special Section PAPER (Special Section on Data Mining and Statistical Science)
Authors
Masashi SUGIYAMA