Autonomous driving systems are required not only to detect pedestrians around the vehicle but also to understand their behavior. Pedestrian body orientation and head orientation are relevant indicators of pedestrian intention. This paper proposes an accurate estimation system that recognizes pedestrian body orientation and head orientation from an on-board camera and inertial sensors. The proposed system discretizes the body orientation and the head orientation into 16 directions. To achieve accurate orientation estimation, a novel training database is established that includes both strongly labeled and weakly labeled data. A semi-supervised learning method is employed to annotate the weakly labeled data and to train an accurate classifier on the proposed database. In addition, a temporal constraint and a human physical-model constraint are incorporated into the estimation, which yield reasonable and stable orientation results for pedestrians across image sequences. The estimated result is an orientation in camera space, whereas pedestrian behavior must be interpreted in real-world space. Therefore, this paper models the motion of the host vehicle using inertial sensors and then transforms the estimated orientation from camera space to real-world space by considering both vehicle and pedestrian motion. The transformed orientation indicates pedestrian behavior more directly. Finally, a series of experiments demonstrates the effectiveness of the proposed pedestrian orientation system.
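To make the geometric steps described in the abstract concrete, the sketch below illustrates how a 16-direction orientation discretization and a camera-to-world orientation transform driven by the vehicle heading could look. This is a minimal illustration rather than the authors' implementation: the function names, the planar (yaw-only) rotation model, and the assumption that the camera is aligned with the vehicle's longitudinal axis are assumptions made here for clarity.

```python
# Illustrative sketch only: names and geometric assumptions are not from the paper.

NUM_CLASSES = 16
BIN_WIDTH = 360.0 / NUM_CLASSES  # 22.5 degrees per orientation class


def discretize_orientation(angle_deg: float) -> int:
    """Map a continuous orientation angle (in degrees) to one of 16 classes.

    Bins are 22.5 degrees wide and centered on multiples of 22.5 degrees,
    e.g. angles in [-11.25, 11.25) map to class 0.
    """
    return int(((angle_deg + BIN_WIDTH / 2.0) % 360.0) // BIN_WIDTH)


def camera_to_world_orientation(camera_angle_deg: float,
                                vehicle_heading_deg: float) -> float:
    """Rotate an orientation estimated in camera space into world space.

    Assumes a planar (yaw-only) model with the camera rigidly mounted and
    aligned with the vehicle's longitudinal axis; vehicle_heading_deg would
    come from the inertial sensors (e.g. an integrated yaw-rate signal).
    """
    return (camera_angle_deg + vehicle_heading_deg) % 360.0


# Example: a pedestrian body orientation of 30 degrees in camera space,
# observed from a vehicle heading 90 degrees from the world reference axis.
world_angle = camera_to_world_orientation(30.0, 90.0)   # 120.0 degrees
world_class = discretize_orientation(world_angle)       # class 5 (center 112.5 deg)
```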
Yanlei GU
The University of Tokyo
Li-Ta HSU
The University of Tokyo
Lijia XIE
The University of Tokyo
Shunsuke KAMIJO
The University of Tokyo
Yanlei GU, Li-Ta HSU, Lijia XIE, Shunsuke KAMIJO, "Accurate Estimation of Pedestrian Orientation from On-Board Camera and Inertial Sensors" in IEICE TRANSACTIONS on Fundamentals,
vol. E99-A, no. 1, pp. 271-281, January 2016, doi: 10.1587/transfun.E99.A.271.
Abstract: Autonomous driving is not only required to detect pedestrians around vehicles, but also expected to understand the behaviors of pedestrians. Pedestrian body orientation and head orientation are the relevant indicators of the pedestrian intention. This paper proposes an accurate estimation system to recognize the pedestrian body orientation and the pedestrian head orientation from on-board camera and inertial sensors. The proposed system discretizes the body orientation and the head orientation into 16 directions. In order to achieve the accurate orientation estimation, a novel training database is established, which includes strongly labeled data and weakly labeled data. Semi-Supervised Learning method is employed to annotate the weakly labeled data, and to generate the accurate classifier based on the proposed training database. In addition, the temporal constraint and the human physical model constraint are considered in orientation estimation, which are beneficial to the reasonable and stable result of orientation estimation for the pedestrian in image sequences. This estimated result is the orientation in camera space. The comprehension of the pedestrian behavior needs to be conducted in the real world space. Therefore, this paper proposes to model the motion of the host vehicle using inertial sensor, then transforms the estimated orientation from camera space to the real world space by considering the vehicle and pedestrian motion. The represented orientation indicates the behavior of the pedestrian more directly. Finally, a series of experiments demonstrate the effectiveness of the proposed pedestrian orientation system.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.E99.A.271/_p
@ARTICLE{e99-a_1_271,
author={Yanlei GU and Li-Ta HSU and Lijia XIE and Shunsuke KAMIJO},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Accurate Estimation of Pedestrian Orientation from On-Board Camera and Inertial Sensors},
year={2016},
volume={E99-A},
number={1},
pages={271-281},
keywords={},
doi={10.1587/transfun.E99.A.271},
ISSN={1745-1337},
month={January},}
TY - JOUR
TI - Accurate Estimation of Pedestrian Orientation from On-Board Camera and Inertial Sensors
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 271
EP - 281
AU - Yanlei GU
AU - Li-Ta HSU
AU - Lijia XIE
AU - Shunsuke KAMIJO
PY - 2016
DO - 10.1587/transfun.E99.A.271
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E99-A
IS - 1
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - January 2016
ER -