In this paper we propose an integration of face identification and facial expression recognition. A face is modeled as a graph whose nodes represent facial feature points. This model is used for automatic detection of the face and its feature points, and the feature points are tracked by flexible feature matching. Face identification is performed by comparing the graph representing the input face image with individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and expression change. Individual and average expression models are generated and then used to classify facial expressions into appropriate categories and to estimate the degree of expression change. The expression model used for facial expression recognition is chosen according to the result of face identification.
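The identification step described above compares an input face graph against stored individual models. A minimal sketch of that idea, assuming a face is represented as a set of named feature-point coordinates and using a simple node-wise Euclidean cost (the point names and the distance measure here are illustrative assumptions, not the paper's actual formulation):

```python
import math

def graph_distance(face_a, face_b):
    # Sum of Euclidean distances between corresponding feature points
    # (nodes matched by shared name; a stand-in for flexible matching).
    return sum(
        math.dist(face_a[name], face_b[name])
        for name in face_a.keys() & face_b.keys()
    )

def identify(input_face, models):
    # Return the id of the stored face model whose graph lies closest
    # to the input graph under the node-wise cost above.
    return min(models, key=lambda pid: graph_distance(input_face, models[pid]))
```

The chosen identity would then select which individual expression model to apply in the recognition stage.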
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Dadet PRAMADIHANTO, Yoshio IWAI, Masahiko YACHIDA, "Integrated Person Identification and Expression Recognition from Facial Images" in IEICE TRANSACTIONS on Information,
vol. E84-D, no. 7, pp. 856-866, July 2001, doi: 10.1587/e84-d_7_856.
URL: https://global.ieice.org/en_transactions/information/10.1587/e84-d_7_856/_p
@ARTICLE{e84-d_7_856,
author={Dadet PRAMADIHANTO and Yoshio IWAI and Masahiko YACHIDA},
journal={IEICE TRANSACTIONS on Information},
title={Integrated Person Identification and Expression Recognition from Facial Images},
year={2001},
volume={E84-D},
number={7},
pages={856-866},
abstract={In this paper we propose an integration of face identification and facial expression recognition. A face is modeled as a graph whose nodes represent facial feature points. This model is used for automatic detection of the face and its feature points, and the feature points are tracked by flexible feature matching. Face identification is performed by comparing the graph representing the input face image with individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and expression change. Individual and average expression models are generated and then used to classify facial expressions into appropriate categories and to estimate the degree of expression change. The expression model used for facial expression recognition is chosen according to the result of face identification.},
keywords={},
doi={10.1587/e84-d_7_856},
ISSN={},
month={July},}
TY - JOUR
TI - Integrated Person Identification and Expression Recognition from Facial Images
T2 - IEICE TRANSACTIONS on Information
SP - 856
EP - 866
AU - Dadet PRAMADIHANTO
AU - Yoshio IWAI
AU - Masahiko YACHIDA
PY - 2001
DO - 10.1587/e84-d_7_856
JO - IEICE TRANSACTIONS on Information
SN -
VL - E84-D
IS - 7
JA - IEICE TRANSACTIONS on Information
Y1 - July 2001
AB - In this paper we propose an integration of face identification and facial expression recognition. A face is modeled as a graph whose nodes represent facial feature points. This model is used for automatic detection of the face and its feature points, and the feature points are tracked by flexible feature matching. Face identification is performed by comparing the graph representing the input face image with individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and expression change. Individual and average expression models are generated and then used to classify facial expressions into appropriate categories and to estimate the degree of expression change. The expression model used for facial expression recognition is chosen according to the result of face identification.
ER -