Sound Event Detection Utilizing Graph Laplacian Regularization with Event Co-Occurrence

Keisuke IMOTO, Seisuke KYOCHI

Summary:

Only a limited number of sound event types occur in a given acoustic scene, and some sound events tend to co-occur within a scene; for example, the sound events “dishes” and “glass jingling” are likely to co-occur in the acoustic scene “cooking.” In this paper, we propose a method of sound event detection using graph Laplacian regularization that takes sound event co-occurrence into account. In the proposed method, the occurrences of sound events are expressed as a graph whose nodes indicate the frequencies of event occurrence and whose edges indicate the sound event co-occurrences. This graph representation is then utilized for model training in sound event detection, which is optimized under an objective function with a regularization term that considers the graph structure of sound event occurrence and co-occurrence. Evaluation experiments using the TUT Sound Events 2016 and 2017 datasets and the TUT Acoustic Scenes 2016 dataset show that the proposed method improves the performance of sound event detection by 7.9 percentage points compared with the conventional CNN-BiGRU-based detection method in terms of the segment-based F1 score. In particular, the experimental results indicate that the proposed method detects co-occurring sound events more accurately than the conventional method.
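The general idea of graph Laplacian regularization can be illustrated with a short sketch. The following Python/PyTorch fragment is a minimal illustration, not the authors' implementation: the co-occurrence matrix (cooccurrence), the frame-wise multi-label network output (logits), and the weighting factor (lam) are all assumed names introduced for this example. It builds the graph Laplacian L = D - A from an event co-occurrence matrix and adds a quadratic smoothness penalty to a standard binary cross-entropy detection loss.

    # Minimal sketch (not the authors' code) of adding a graph Laplacian
    # regularization term to a sound event detection loss.
    # Assumptions: `cooccurrence` is an M x M symmetric matrix counting how
    # often event pairs co-occur in the training annotations, and `logits`
    # is the network output of shape (batch, frames, M).
    import torch

    def laplacian(cooccurrence: torch.Tensor) -> torch.Tensor:
        # Graph Laplacian L = D - A, where A encodes event co-occurrence
        # and D is the diagonal degree matrix.
        degree = torch.diag(cooccurrence.sum(dim=1))
        return degree - cooccurrence

    def loss_with_graph_reg(logits, targets, L, lam=0.1):
        # Standard multi-label detection loss (binary cross-entropy) ...
        bce = torch.nn.functional.binary_cross_entropy_with_logits(logits, targets)
        # ... plus a smoothness penalty tr(Y L Y^T) that encourages events
        # connected in the co-occurrence graph to receive similar activations.
        y = torch.sigmoid(logits)                      # (batch, frames, M)
        reg = torch.einsum('bfm,mn,bfn->', y, L, y)    # sum of quadratic forms
        return bce + lam * reg / y.shape[0]

In this sketch the quadratic form equals a sum over graph edges of co-occurrence-weighted squared differences between event activations, so events that are strongly connected in the co-occurrence graph are encouraged to be detected together.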

Publication
IEICE TRANSACTIONS on Information and Systems Vol.E103-D No.9 pp.1971-1977
Publication Date
2020/09/01
Publicized
2020/06/08
Online ISSN
1745-1361
DOI
10.1587/transinf.2019EDP7323
Type of Manuscript
PAPER
Category
Speech and Hearing

Authors

Keisuke IMOTO
  Ritsumeikan University
Seisuke KYOCHI
  University of Kitakyushu

Keyword