IEICE TRANSACTIONS on Information and Systems

Classifying Near-Miss Traffic Incidents through Video, Sensor, and Object Features

Shuhei YAMAMOTO, Takeshi KURASHIMA, Hiroyuki TODA

Summary:

Front video and sensor data captured by vehicle-mounted event recorders (ERs) serve not only as evidence of traffic accidents but also, as near-miss traffic incident data, as material for safe-driving education. However, most ER data records only ordinary driving. To utilize near-miss data for safe-driving education, the relevant scenes must be located easily and rapidly within large amounts of ER data, which requires labels attached to the scenes and events of interest. This paper proposes a method that automatically identifies near-misses involving objects such as pedestrians and bicycles by processing ER data. The proposed method extracts two deep feature representations that capture the car's status and the environment surrounding the car. The first representation models the temporal transitions of the car's status; the second captures the positional relationship between the car and surrounding objects by processing object detection results. Experiments on actual ER data demonstrate that the proposed method can accurately identify and tag near-miss events.
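
The abstract gives no implementation details; purely as an illustration of the two-branch design it describes (a temporal encoder over car-status sensor signals fused with an encoder over object-detection outputs), a minimal PyTorch sketch might look as follows. All layer sizes, input formats, and names below are assumptions for illustration, not the authors' architecture.

# Hypothetical sketch of a two-branch near-miss classifier, loosely following the
# abstract: one branch encodes temporal transitions of car status (e.g., speed and
# acceleration from the event recorder), the other encodes object detection results
# (e.g., bounding boxes of pedestrians/bicycles in the front video).
# All dimensions, inputs, and names are illustrative assumptions.
import torch
import torch.nn as nn

class NearMissClassifier(nn.Module):
    def __init__(self, sensor_dim=6, det_dim=5, hidden=64, num_classes=2):
        super().__init__()
        # Branch 1: GRU over the car-status time series (T steps x sensor_dim).
        self.sensor_encoder = nn.GRU(sensor_dim, hidden, batch_first=True)
        # Branch 2: per-object encoding of detection features (N objects x det_dim,
        # e.g., normalized box coordinates plus a class id), pooled over objects.
        self.object_encoder = nn.Sequential(
            nn.Linear(det_dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        # Fusion of the two representations and final classification.
        self.classifier = nn.Sequential(
            nn.Linear(hidden * 2, hidden), nn.ReLU(), nn.Linear(hidden, num_classes)
        )

    def forward(self, sensors, detections):
        # sensors:    (batch, T, sensor_dim) car-status sequence
        # detections: (batch, N, det_dim)    detected-object features for the clip
        _, h = self.sensor_encoder(sensors)                               # (1, batch, hidden)
        sensor_feat = h.squeeze(0)                                        # (batch, hidden)
        object_feat = self.object_encoder(detections).max(dim=1).values   # (batch, hidden)
        return self.classifier(torch.cat([sensor_feat, object_feat], dim=1))

if __name__ == "__main__":
    model = NearMissClassifier()
    sensors = torch.randn(4, 100, 6)     # 4 clips, 100 time steps, 6 sensor channels
    detections = torch.randn(4, 10, 5)   # 4 clips, up to 10 detected objects each
    logits = model(sensors, detections)  # (4, num_classes)
    print(logits.shape)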

Publication
IEICE TRANSACTIONS on Information and Systems Vol.E105-D No.2 pp.377-386
Publication Date
2022/02/01
Publicized
2021/11/01
Online ISSN
1745-1361
DOI
10.1587/transinf.2021EDP7017
Type of Manuscript
PAPER
Category
Artificial Intelligence, Data Mining

Authors

Shuhei YAMAMOTO
  NTT Corporation
Takeshi KURASHIMA
  NTT Corporation
Hiroyuki TODA
  NTT Corporation

Keyword