Intelligent Transport Systems (ITS) have recently been the subject of intense research and development. As part of ITS, vehicles are detected in images taken by an in-vehicle camera. Against this background, the authors have been working on nighttime vehicle detection. Evaluating the accuracy of such detection requires a gold standard, which at present is created manually; however, manual vehicle detection is time-consuming. Accordingly, a system that detects vehicles accurately without human help is needed to evaluate the accuracy of real-time vehicle detection. The purpose of this study is therefore to detect vehicles automatically and with high accuracy in nighttime images taken by an in-vehicle camera, using offline processing. Because it is difficult to detect vehicles from their shape in nighttime driving scenes, we focus on the brightness of headlights and taillights. The proposed method uses Center Surround Extremas (CenSurE) to detect blobs, exploiting the difference in brightness between the lights and their surroundings. However, the blobs obtained by CenSurE also include objects other than headlights and taillights, such as streetlights and delineators. To distinguish these blobs, they are tracked in inverse time, and vehicles are identified using tags based on the characteristics of each object. Whereas all objects appear from the same point in forward-time processing, each object appears from a different place in the image in inverse-time processing, which makes blobs easy to track and tag. To evaluate the effectiveness of the proposed method, a vehicle-detection experiment was conducted on nighttime driving scenes captured by an in-vehicle camera. The results of the proposed method were nearly equivalent to those of manual detection.
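The core idea behind CenSurE-style blob detection described in the abstract can be illustrated with a minimal sketch: a difference-of-box-means filter computed via an integral image, which responds strongly where a bright center (a light source) is surrounded by a darker ring. This is not the authors' implementation; the filter sizes, the box (rather than octagonal) kernel shape, and the threshold below are illustrative assumptions.

```python
import numpy as np

def _integral(img):
    """Integral image with a zero row/column prepended, so any box sum
    is four lookups regardless of box size."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img.astype(np.float64), axis=0), axis=1)
    return ii

def _box_sum(ii, y, x, half):
    """Sum of the (2*half+1)^2 box centred at (y, x); callers keep it in bounds."""
    y0, y1 = y - half, y + half + 1
    x0, x1 = x - half, x + half + 1
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

def center_surround_response(img, inner=2, outer=6):
    """Difference of box means: bright centre vs. darker surrounding ring."""
    ii = _integral(img)
    h, w = img.shape
    resp = np.zeros((h, w))
    n_in = (2 * inner + 1) ** 2
    n_out = (2 * outer + 1) ** 2 - n_in
    for y in range(outer, h - outer):
        for x in range(outer, w - outer):
            s_in = _box_sum(ii, y, x, inner)
            s_out = _box_sum(ii, y, x, outer) - s_in
            resp[y, x] = s_in / n_in - s_out / n_out
    return resp

def detect_bright_blobs(img, thresh=40.0, inner=2, outer=6):
    """Return (y, x) positions whose centre-surround response exceeds thresh."""
    resp = center_surround_response(img, inner, outer)
    ys, xs = np.where(resp > thresh)
    return list(zip(ys.tolist(), xs.tolist()))
```

On a dark nighttime frame, headlights, taillights, streetlights, and delineators would all exceed the threshold, which is why the paper's second stage (inverse-time tracking and tagging) is needed to keep only vehicle lights.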
Naoya KOSAKA
Shizuoka University
Ryota OGURA
Koito Manufacturing Co., Ltd.
Gosuke OHASHI
Shizuoka University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Naoya KOSAKA, Ryota OGURA, Gosuke OHASHI, "Offline Vehicle Detection at Night Using Center Surround Extremas" in IEICE TRANSACTIONS on Fundamentals,
vol. E98-A, no. 8, pp. 1727-1734, August 2015, doi: 10.1587/transfun.E98.A.1727.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.E98.A.1727/_p
@ARTICLE{e98-a_8_1727,
author={Naoya KOSAKA and Ryota OGURA and Gosuke OHASHI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Offline Vehicle Detection at Night Using Center Surround Extremas},
year={2015},
volume={E98-A},
number={8},
pages={1727-1734},
doi={10.1587/transfun.E98.A.1727},
ISSN={1745-1337},
month={August},
}
TY - JOUR
TI - Offline Vehicle Detection at Night Using Center Surround Extremas
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1727
EP - 1734
AU - Naoya KOSAKA
AU - Ryota OGURA
AU - Gosuke OHASHI
PY - 2015
DO - 10.1587/transfun.E98.A.1727
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E98-A
IS - 8
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - August 2015
ER -