
Neural Architecture Search for Convolutional Neural Networks with Attention

Kohei NAKAI, Takashi MATSUBARA, Kuniaki UEHARA


Summary

The recent development of neural architecture search (NAS) has enabled us to automatically discover high-performance neural network architectures within a few days. Convolutional neural networks (CNNs) extract useful features by repeatedly applying standard operations (convolutions and pooling). However, these operations can also extract useless or even distracting features. Attention mechanisms enable neural networks to discard information of no interest and have achieved state-of-the-art performance. While a variety of attention mechanisms for CNNs have been proposed, current NAS methods have paid little attention to them. In this study, we propose a novel NAS method that searches for attention mechanisms as well as operations. We examined several patterns for arranging attentions and operations, and found that attentions work best when they have their own search space and follow operations. We demonstrate the superior performance of our method in experiments on the CIFAR-10, CIFAR-100, and ImageNet datasets. The discovered architecture achieved lower classification error rates and required fewer parameters than those found by current NAS methods.
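The arrangement the abstract describes, an operation followed by an attention module, with each drawn from its own candidate set, can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the candidate lists, the toy operation stand-ins, and the unlearned SE-style channel gate are all assumptions made for the example.

```python
import numpy as np

# Separate search spaces, as the paper's best-performing arrangement suggests:
# one candidate set for operations, another for attention modules.
OPERATIONS = ["conv3x3", "conv5x5", "max_pool", "skip"]   # operation search space
ATTENTIONS = ["none", "se_channel"]                       # attention search space

def apply_operation(x, name):
    """Toy stand-ins for real ops; x has shape (channels, height, width)."""
    if name == "skip":
        return x
    if name == "max_pool":
        return np.maximum(x, np.roll(x, 1, axis=-1))      # crude local max
    return (x + np.roll(x, 1, axis=-1)) / 2.0             # "conv" as smoothing

def se_channel_attention(x):
    """Squeeze-and-Excitation-style channel gating (illustrative, no weights)."""
    squeeze = x.mean(axis=(1, 2))                         # (C,) global average pool
    gate = 1.0 / (1.0 + np.exp(-squeeze))                 # sigmoid "excitation"
    return x * gate[:, None, None]                        # reweight each channel

def edge_forward(x, op_name, attn_name):
    """One searched edge: the operation comes first, then optional attention."""
    y = apply_operation(x, op_name)
    if attn_name == "se_channel":
        y = se_channel_attention(y)
    return y

x = np.random.default_rng(0).standard_normal((4, 8, 8))
y = edge_forward(x, "conv3x3", "se_channel")
print(y.shape)  # the edge preserves the input shape: (4, 8, 8)
```

A NAS search would then pick one entry from each list per edge (e.g. by sampling or by learning mixing weights over both sets independently); searching the two sets jointly rather than merging them into one list is the pattern the authors report works best.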

Publication
IEICE TRANSACTIONS on Information Vol.E104-D No.2 pp.312-321
Publication Date
2021/02/01
Publicized
2020/10/26
Online ISSN
1745-1361
DOI
10.1587/transinf.2020EDP7111
Type of Manuscript
PAPER
Category
Image Recognition, Computer Vision

Authors

Kohei NAKAI
  Kobe University
Takashi MATSUBARA
  Osaka University
Kuniaki UEHARA
  Osaka Gakuin University
