
Learning from Noisy Complementary Labels with Robust Loss Functions

Hiroki ISHIGURO, Takashi ISHIDA, Masashi SUGIYAMA

Summary

It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels can also be error-prone, so mitigating the influence of label noise is an important challenge for making complementary-label learning more useful in practice. In this paper, we derive conditions on the loss function under which the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that loss functions satisfying our conditions significantly improve classification performance.
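To make the setting concrete, the sketch below illustrates the general idea of complementary-label learning (not the paper's specific derivation or its noise-robustness conditions): under the common assumption that a complementary label is drawn uniformly from the classes an instance does not belong to, the predicted class distribution can be mapped through a known transition matrix to a distribution over complementary labels, and a loss is then computed against the observed complementary label. The function name and the uniform-transition assumption are illustrative, not taken from the paper.

```python
import numpy as np

def complementary_forward_loss(logits, comp_labels, num_classes):
    """Cross-entropy against complementary labels via a forward correction.

    Assumes each complementary label was drawn uniformly among the
    num_classes - 1 incorrect classes (an illustrative assumption).
    """
    # Softmax over ordinary class predictions (numerically stabilized).
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)

    # Uniform complementary transition matrix: Q[i, j] = 1/(K-1) for i != j,
    # i.e. the probability of observing complementary label j given true class i.
    K = num_classes
    Q = (np.ones((K, K)) - np.eye(K)) / (K - 1)

    # Predicted distribution over complementary labels.
    p_bar = p @ Q

    # Negative log-likelihood of the observed complementary labels.
    idx = np.arange(len(comp_labels))
    return -np.log(p_bar[idx, comp_labels] + 1e-12).mean()
```

If the model confidently predicts a class that happens to be the complementary label, this loss is large; if the complementary label points at a class the model already rules out, the loss is small, which is the training signal such methods exploit.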

Publication
IEICE Transactions on Information and Systems, Vol.E105-D, No.2, pp.364-376
Publication Date
2022/02/01
Publicized
2021/11/01
Online ISSN
1745-1361
DOI
10.1587/transinf.2021EDP7035
Type of Manuscript
PAPER
Category
Artificial Intelligence, Data Mining

Authors

Hiroki ISHIGURO
  University of Tokyo
Takashi ISHIDA
  University of Tokyo, RIKEN
Masashi SUGIYAMA
  University of Tokyo, RIKEN

Keyword