We present a human point of gaze estimation system that uses corneal surface reflections and omnidirectional images taken by spherical panorama cameras, which have become popular in recent years. Our system can find where a user is looking in a 360° surrounding scene image using only an eye image, and thus does not need the gaze mapping from partial scene images to a whole scene image that is necessary in conventional eye gaze tracking systems. We first generate multiple perspective scene images from an omnidirectional (equirectangular) image and perform registration between the corneal reflection and perspective images using a corneal reflection-scene image registration technique. We then compute the point of gaze using a corneal imaging technique leveraged by a 3D eye model, and project the point onto the omnidirectional image. The 3D eye pose is estimated using a particle-filter-based tracking algorithm. In experiments, we evaluated the accuracy of 3D eye pose estimation, the robustness of registration, and the accuracy of point-of-gaze (PoG) estimation using two indoor and five outdoor scenes, and found that the gaze mapping error was 5.546 degrees on average.
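The first step the abstract describes — generating perspective scene images from an equirectangular panorama — can be sketched as below. This is a minimal illustration assuming nearest-neighbor sampling and a simple yaw/pitch virtual pinhole camera; the function name and coordinate conventions are ours for illustration, not taken from the paper.

```python
import numpy as np

def equirect_to_perspective(equi, fov_deg, yaw_deg, pitch_deg, out_w, out_h):
    """Sample a virtual pinhole-camera view from an equirectangular panorama.

    equi: H x W x C panorama covering 360 deg (longitude) x 180 deg (latitude).
    fov_deg: horizontal field of view of the virtual perspective camera.
    yaw_deg, pitch_deg: viewing direction of the virtual camera.
    """
    H, W = equi.shape[:2]
    f = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # focal length in pixels

    # Pixel grid of the virtual camera, centered on the principal point.
    x = np.arange(out_w) - out_w / 2.0 + 0.5
    y = np.arange(out_h) - out_h / 2.0 + 0.5
    xv, yv = np.meshgrid(x, y)

    # Viewing rays in camera coordinates (z forward, x right, y down).
    rays = np.stack([xv, yv, np.full_like(xv, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate rays by pitch (about x), then yaw (about y).
    p, q = np.radians(pitch_deg), np.radians(yaw_deg)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(p), -np.sin(p)],
                   [0, np.sin(p),  np.cos(p)]])
    Ry = np.array([[ np.cos(q), 0, np.sin(q)],
                   [0, 1, 0],
                   [-np.sin(q), 0, np.cos(q)]])
    rays = rays @ (Ry @ Rx).T

    # Rays -> longitude/latitude -> panorama pixel coordinates.
    lon = np.arctan2(rays[..., 0], rays[..., 2])           # [-pi, pi]
    lat = np.arcsin(np.clip(rays[..., 1], -1.0, 1.0))      # [-pi/2, pi/2]
    u = ((lon / (2 * np.pi) + 0.5) * W).astype(int) % W
    v = np.clip(((lat / np.pi + 0.5) * H).astype(int), 0, H - 1)
    return equi[v, u]
```

In the paper's pipeline, several such views (different yaw/pitch directions) would be rendered and each matched against the corneal reflection image; the sketch above only shows the reprojection geometry itself.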
Taishi OGAWA
Kyoto University
Atsushi NAKAZAWA
Kyoto University
Toyoaki NISHIDA
Kyoto University, RIKEN Center for Advanced Intelligence Project
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Taishi OGAWA, Atsushi NAKAZAWA, Toyoaki NISHIDA, "Point of Gaze Estimation Using Corneal Surface Reflection and Omnidirectional Camera Image" in IEICE TRANSACTIONS on Information,
vol. E101-D, no. 5, pp. 1278-1287, May 2018, doi: 10.1587/transinf.2017MVP0020.
Abstract: We present a human point of gaze estimation system that uses corneal surface reflections and omnidirectional images taken by spherical panorama cameras, which have become popular in recent years. Our system can find where a user is looking in a 360° surrounding scene image using only an eye image, and thus does not need the gaze mapping from partial scene images to a whole scene image that is necessary in conventional eye gaze tracking systems. We first generate multiple perspective scene images from an omnidirectional (equirectangular) image and perform registration between the corneal reflection and perspective images using a corneal reflection-scene image registration technique. We then compute the point of gaze using a corneal imaging technique leveraged by a 3D eye model, and project the point onto the omnidirectional image. The 3D eye pose is estimated using a particle-filter-based tracking algorithm. In experiments, we evaluated the accuracy of 3D eye pose estimation, the robustness of registration, and the accuracy of point-of-gaze (PoG) estimation using two indoor and five outdoor scenes, and found that the gaze mapping error was 5.546 degrees on average.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2017MVP0020/_p
@ARTICLE{e101-d_5_1278,
author={Taishi OGAWA and Atsushi NAKAZAWA and Toyoaki NISHIDA},
journal={IEICE TRANSACTIONS on Information},
title={Point of Gaze Estimation Using Corneal Surface Reflection and Omnidirectional Camera Image},
year={2018},
volume={E101-D},
number={5},
pages={1278-1287},
abstract={We present a human point of gaze estimation system that uses corneal surface reflections and omnidirectional images taken by spherical panorama cameras, which have become popular in recent years. Our system can find where a user is looking in a 360° surrounding scene image using only an eye image, and thus does not need the gaze mapping from partial scene images to a whole scene image that is necessary in conventional eye gaze tracking systems. We first generate multiple perspective scene images from an omnidirectional (equirectangular) image and perform registration between the corneal reflection and perspective images using a corneal reflection-scene image registration technique. We then compute the point of gaze using a corneal imaging technique leveraged by a 3D eye model, and project the point onto the omnidirectional image. The 3D eye pose is estimated using a particle-filter-based tracking algorithm. In experiments, we evaluated the accuracy of 3D eye pose estimation, the robustness of registration, and the accuracy of point-of-gaze (PoG) estimation using two indoor and five outdoor scenes, and found that the gaze mapping error was 5.546 degrees on average.},
keywords={},
doi={10.1587/transinf.2017MVP0020},
ISSN={1745-1361},
month={May},}
TY - JOUR
TI - Point of Gaze Estimation Using Corneal Surface Reflection and Omnidirectional Camera Image
T2 - IEICE TRANSACTIONS on Information
SP - 1278
EP - 1287
AU - Taishi OGAWA
AU - Atsushi NAKAZAWA
AU - Toyoaki NISHIDA
PY - 2018
DO - 10.1587/transinf.2017MVP0020
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E101-D
IS - 5
JA - IEICE TRANSACTIONS on Information
Y1 - May 2018
AB - We present a human point of gaze estimation system that uses corneal surface reflections and omnidirectional images taken by spherical panorama cameras, which have become popular in recent years. Our system can find where a user is looking in a 360° surrounding scene image using only an eye image, and thus does not need the gaze mapping from partial scene images to a whole scene image that is necessary in conventional eye gaze tracking systems. We first generate multiple perspective scene images from an omnidirectional (equirectangular) image and perform registration between the corneal reflection and perspective images using a corneal reflection-scene image registration technique. We then compute the point of gaze using a corneal imaging technique leveraged by a 3D eye model, and project the point onto the omnidirectional image. The 3D eye pose is estimated using a particle-filter-based tracking algorithm. In experiments, we evaluated the accuracy of 3D eye pose estimation, the robustness of registration, and the accuracy of point-of-gaze (PoG) estimation using two indoor and five outdoor scenes, and found that the gaze mapping error was 5.546 degrees on average.
ER -