This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than the complex, spontaneous facial and head behaviors that occur in natural conversation. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems and construct a dictionary of motion primitives that covers the elemental motion dynamics underlying facial and head behaviors. With this dictionary, a facial and head behavior can be interpreted as a distribution over motion primitives. This interpretation is robust to the varying rhythms of dynamic patterns in complex, spontaneous facial and head behaviors. We evaluate the proposed approach in natural telecommunication scenarios and achieve promising results. The proposed method also performs favorably against state-of-the-art methods on three benchmark databases.
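The dictionary-of-primitives idea in the abstract can be illustrated with a minimal sketch: cluster spatio-temporal features into K primitives, then describe any sequence as a normalized histogram (distribution) over those primitives, so sequences of different lengths and rhythms map to the same fixed-size representation. The paper clusters with hybrid dynamical systems; the plain k-means below is only a stand-in for illustration, and all data here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(features, k, iters=20):
    """Toy k-means: returns K primitive centroids (stand-in for
    hybrid-dynamical-system clustering)."""
    centroids = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # assign each feature vector to its nearest centroid
        d = np.linalg.norm(features[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = features[labels == j].mean(axis=0)
    return centroids

def primitive_histogram(sequence, centroids):
    """Map a feature sequence to a distribution over motion primitives."""
    d = np.linalg.norm(sequence[:, None] - centroids[None], axis=2)
    counts = np.bincount(d.argmin(axis=1), minlength=len(centroids))
    # normalizing by length makes the descriptor insensitive to how
    # fast or slow (the "rhythm" at which) the behavior is performed
    return counts / counts.sum()

# synthetic spatio-temporal training features from many sequences
train = rng.normal(size=(500, 16))
dictionary = kmeans(train, k=8)

# a new behavior sequence becomes an 8-dim distribution, regardless of length
seq = rng.normal(size=(120, 16))
hist = primitive_histogram(seq, dictionary)
print(hist.shape)
```

Because the histogram is normalized, a behavior performed twice as slowly yields (roughly) the same distribution, which is the robustness-to-rhythm property the abstract claims.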
Qun SHI
Nara Institute of Science and Technology
Norimichi UKITA
Nara Institute of Science and Technology
Ming-Hsuan YANG
University of California, Merced
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Qun SHI, Norimichi UKITA, Ming-Hsuan YANG, "Natural Facial and Head Behavior Recognition using Dictionary of Motion Primitives" in IEICE TRANSACTIONS on Information and Systems,
vol. E100-D, no. 12, pp. 2993-3000, December 2017, doi: 10.1587/transinf.2017EDP7128.
Abstract: This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than complex and spontaneous facial and head behaviors in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems, and construct a dictionary of motion primitives to cover all possible elemental motion dynamics accounting for facial and head behaviors. With this dictionary, the facial and head behavior can be interpreted into a distribution of motion primitives. This interpretation is robust against different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach under natural tele-communication scenarios, and achieve promising results. Furthermore, the proposed method also performs favorably against the state-of-the-art methods on three benchmark databases.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2017EDP7128/_p
@ARTICLE{e100-d_12_2993,
author={Qun SHI and Norimichi UKITA and Ming-Hsuan YANG},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Natural Facial and Head Behavior Recognition using Dictionary of Motion Primitives},
year={2017},
volume={E100-D},
number={12},
pages={2993-3000},
abstract={This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than complex and spontaneous facial and head behaviors in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems, and construct a dictionary of motion primitives to cover all possible elemental motion dynamics accounting for facial and head behaviors. With this dictionary, the facial and head behavior can be interpreted into a distribution of motion primitives. This interpretation is robust against different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach under natural tele-communication scenarios, and achieve promising results. Furthermore, the proposed method also performs favorably against the state-of-the-art methods on three benchmark databases.},
keywords={},
doi={10.1587/transinf.2017EDP7128},
ISSN={1745-1361},
month={December}
}
TY  - JOUR
TI  - Natural Facial and Head Behavior Recognition using Dictionary of Motion Primitives
T2  - IEICE TRANSACTIONS on Information and Systems
SP  - 2993
EP  - 3000
AU  - Qun SHI
AU  - Norimichi UKITA
AU  - Ming-Hsuan YANG
PY  - 2017
DO  - 10.1587/transinf.2017EDP7128
JO  - IEICE TRANSACTIONS on Information and Systems
SN  - 1745-1361
VL  - E100-D
IS  - 12
JA  - IEICE Trans. Inf. & Syst.
Y1  - 2017/12
AB  - This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than complex and spontaneous facial and head behaviors in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems, and construct a dictionary of motion primitives to cover all possible elemental motion dynamics accounting for facial and head behaviors. With this dictionary, the facial and head behavior can be interpreted into a distribution of motion primitives. This interpretation is robust against different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach under natural tele-communication scenarios, and achieve promising results. Furthermore, the proposed method also performs favorably against the state-of-the-art methods on three benchmark databases.
ER  - 