Radar is expected to be used in advanced driver-assistance systems because it provides environmentally robust measurements. In this paper, we propose a novel radar signal segmentation method using a complex-valued fully convolutional network (CvFCN) that comprises complex-valued layers, real-valued layers, and a bidirectional conversion layer between them. We also propose an efficient automatic annotation system for dataset generation. We apply the CvFCN to two-dimensional (2D) complex-valued radar signal maps (r-maps) whose axes correspond to angle and distance. An r-map is a 2D complex-valued matrix generated from raw radar signals by 2D Fourier transformation. We annotate the r-maps automatically using LiDAR measurements. In our experiment, we semantically segment r-map signals into pedestrian and background regions, achieving accuracies of 99.7% for the background and 96.2% for pedestrians.
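The letter itself does not include code, but the pipeline the abstract describes can be illustrated with a short, hedged sketch. The block below is only a minimal PyTorch illustration, not the authors' implementation: raw_to_rmap, the layer sizes, the split-ReLU activation, and the magnitude-based complex-to-real conversion (standing in for the paper's bidirectional conversion layer) are all assumptions introduced here for clarity.

```python
import torch
import torch.nn as nn


def raw_to_rmap(raw_frame: torch.Tensor, n_range: int = 256, n_angle: int = 64) -> torch.Tensor:
    """2D Fourier transform of one frame of raw MIMO radar samples
    (fast-time samples x virtual antennas) into a complex-valued r-map
    whose axes correspond to distance and angle (sizes assumed)."""
    rng = torch.fft.fft(raw_frame, n=n_range, dim=0)                        # range (distance) axis
    return torch.fft.fftshift(torch.fft.fft(rng, n=n_angle, dim=1), dim=1)  # angle axis


class ComplexConv2d(nn.Module):
    """Complex 2D convolution built from two real convolutions:
    (W_r + iW_i)(x_r + ix_i) = (W_r x_r - W_i x_i) + i(W_r x_i + W_i x_r)."""
    def __init__(self, in_ch: int, out_ch: int, k: int = 3, padding: int = 1):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, k, padding=padding)
        self.conv_i = nn.Conv2d(in_ch, out_ch, k, padding=padding)

    def forward(self, x_r, x_i):
        return (self.conv_r(x_r) - self.conv_i(x_i),
                self.conv_r(x_i) + self.conv_i(x_r))


class ToyCvFCN(nn.Module):
    """Complex-valued layers, a complex-to-real conversion, then real-valued
    layers producing per-pixel class scores (background vs. pedestrian).
    The magnitude conversion below is an assumption, not the paper's
    bidirectional conversion layer."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.cconv1 = ComplexConv2d(1, 8)
        self.cconv2 = ComplexConv2d(8, 16)
        self.real_head = nn.Sequential(
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, n_classes, 1),
        )

    def forward(self, rmap: torch.Tensor) -> torch.Tensor:  # rmap: complex tensor (B, 1, H, W)
        x_r, x_i = rmap.real, rmap.imag
        x_r, x_i = self.cconv1(x_r, x_i)
        x_r, x_i = torch.relu(x_r), torch.relu(x_i)      # split-ReLU activation (assumed)
        x_r, x_i = self.cconv2(x_r, x_i)
        mag = torch.sqrt(x_r ** 2 + x_i ** 2 + 1e-8)     # complex -> real conversion (assumed)
        return self.real_head(mag)                       # (B, n_classes, H, W) segmentation logits


# Smoke test: 128 fast-time samples x 8 virtual antennas of synthetic data.
frame = torch.randn(128, 8, dtype=torch.cfloat)
rmap = raw_to_rmap(frame)              # (256, 64), complex
logits = ToyCvFCN()(rmap[None, None])  # add batch/channel dims -> (1, 2, 256, 64)
print(logits.shape)
```

Feeding a 256 x 64 r-map through this toy network yields per-pixel logits for two classes (background, pedestrian), mirroring the segmentation task described in the abstract; the real network, training procedure, and LiDAR-based annotation are described only in the paper.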
Motoko TACHIBANA
Oki Electric Industry Co., Ltd.
Kohei YAMAMOTO
Oki Electric Industry Co., Ltd.
Kurato MAENO
Oki Electric Industry Co., Ltd.
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Motoko TACHIBANA, Kohei YAMAMOTO, Kurato MAENO, "Complex-Valued Fully Convolutional Networks for MIMO Radar Signal Segmentation" in IEICE TRANSACTIONS on Information,
vol. E101-D, no. 5, pp. 1445-1448, May 2018, doi: 10.1587/transinf.2017EDL8214.
Abstract: Radar is expected to be used in advanced driver-assistance systems because it provides environmentally robust measurements. In this paper, we propose a novel radar signal segmentation method using a complex-valued fully convolutional network (CvFCN) that comprises complex-valued layers, real-valued layers, and a bidirectional conversion layer between them. We also propose an efficient automatic annotation system for dataset generation. We apply the CvFCN to two-dimensional (2D) complex-valued radar signal maps (r-maps) whose axes correspond to angle and distance. An r-map is a 2D complex-valued matrix generated from raw radar signals by 2D Fourier transformation. We annotate the r-maps automatically using LiDAR measurements. In our experiment, we semantically segment r-map signals into pedestrian and background regions, achieving accuracies of 99.7% for the background and 96.2% for pedestrians.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2017EDL8214/_p
@ARTICLE{e101-d_5_1445,
author={Motoko TACHIBANA and Kohei YAMAMOTO and Kurato MAENO},
journal={IEICE TRANSACTIONS on Information},
title={Complex-Valued Fully Convolutional Networks for MIMO Radar Signal Segmentation},
year={2018},
volume={E101-D},
number={5},
pages={1445-1448},
abstract={Radar is expected to be used in advanced driver-assistance systems because it provides environmentally robust measurements. In this paper, we propose a novel radar signal segmentation method using a complex-valued fully convolutional network (CvFCN) that comprises complex-valued layers, real-valued layers, and a bidirectional conversion layer between them. We also propose an efficient automatic annotation system for dataset generation. We apply the CvFCN to two-dimensional (2D) complex-valued radar signal maps (r-maps) whose axes correspond to angle and distance. An r-map is a 2D complex-valued matrix generated from raw radar signals by 2D Fourier transformation. We annotate the r-maps automatically using LiDAR measurements. In our experiment, we semantically segment r-map signals into pedestrian and background regions, achieving accuracies of 99.7% for the background and 96.2% for pedestrians.},
keywords={},
doi={10.1587/transinf.2017EDL8214},
ISSN={1745-1361},
month={May},
}
TY - JOUR
TI - Complex-Valued Fully Convolutional Networks for MIMO Radar Signal Segmentation
T2 - IEICE TRANSACTIONS on Information
SP - 1445
EP - 1448
AU - Motoko TACHIBANA
AU - Kohei YAMAMOTO
AU - Kurato MAENO
PY - 2018
DO - 10.1587/transinf.2017EDL8214
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E101-D
IS - 5
JA - IEICE TRANSACTIONS on Information
Y1 - May 2018
AB - Radar is expected to be used in advanced driver-assistance systems because it provides environmentally robust measurements. In this paper, we propose a novel radar signal segmentation method using a complex-valued fully convolutional network (CvFCN) that comprises complex-valued layers, real-valued layers, and a bidirectional conversion layer between them. We also propose an efficient automatic annotation system for dataset generation. We apply the CvFCN to two-dimensional (2D) complex-valued radar signal maps (r-maps) whose axes correspond to angle and distance. An r-map is a 2D complex-valued matrix generated from raw radar signals by 2D Fourier transformation. We annotate the r-maps automatically using LiDAR measurements. In our experiment, we semantically segment r-map signals into pedestrian and background regions, achieving accuracies of 99.7% for the background and 96.2% for pedestrians.
ER -