The manner of a person's eye movements conveys a great deal of nonverbal information and emotional intent beyond speech. This paper describes work on expressing emotion through the eye behaviors of virtual agents, based on parameters selected from an AU-coded facial expression database and on real-time eye movement data (pupil size, blink rate and saccades). A rule-based approach that uses MPEG-4 FAPs (facial animation parameters) to generate primary emotions (joyful, sad, angry, afraid, disgusted and surprised) and intermediate emotions (emotions that can be represented as a mixture of two primary emotions) is introduced. Building on this approach, a scripting tool named EEMML (Emotional Eye Movement Markup Language) is proposed that enables authors to describe and generate the emotional eye movements of virtual agents.
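The abstract only names the mixture idea behind the intermediate emotions; the minimal Python sketch below is one illustrative reading of that idea, not the paper's actual rule set. The MPEG-4 FAP ids, the displacement values, and the simple linear blend are all assumptions made purely for illustration.

# Illustrative sketch only: an intermediate emotion rendered as a weighted blend
# of the FAP values of two primary emotions. FAP ids and values are hypothetical.

PRIMARY_FAPS = {
    # emotion -> {MPEG-4 FAP id: displacement} (placeholder values)
    "joyful":    {19: 120, 20: 120, 31: -40, 32: -40},
    "surprised": {19: 300, 20: 300, 31: 160, 32: 160},
}

def blend_emotions(emotion_a, emotion_b, weight_a=0.5):
    """Blend the FAP values of two primary emotions into an intermediate one.
    This is a stand-in for the rule-based approach described in the paper."""
    faps_a, faps_b = PRIMARY_FAPS[emotion_a], PRIMARY_FAPS[emotion_b]
    return {
        fap_id: weight_a * faps_a.get(fap_id, 0) + (1 - weight_a) * faps_b.get(fap_id, 0)
        for fap_id in set(faps_a) | set(faps_b)
    }

# Example: an intermediate emotion weighted toward joy.
print(blend_emotions("joyful", "surprised", weight_a=0.7))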
Xia MAO, Zheng LI, "Generating and Describing Affective Eye Behaviors," in IEICE Transactions on Information and Systems, vol. E93-D, no. 5, pp. 1282-1290, May 2010, doi: 10.1587/transinf.E93.D.1282.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E93.D.1282/_p
@ARTICLE{e93-d_5_1282,
author={Xia MAO and Zheng LI},
journal={IEICE Transactions on Information and Systems},
title={Generating and Describing Affective Eye Behaviors},
year={2010},
volume={E93-D},
number={5},
pages={1282-1290},
doi={10.1587/transinf.E93.D.1282},
ISSN={1745-1361},
month={May},}
TY - JOUR
TI - Generating and Describing Affective Eye Behaviors
T2 - IEICE Transactions on Information and Systems
SP - 1282
EP - 1290
AU - Xia MAO
AU - Zheng LI
PY - 2010
DO - 10.1587/transinf.E93.D.1282
JO - IEICE Transactions on Information and Systems
SN - 1745-1361
VL - E93-D
IS - 5
JA - IEICE Transactions on Information and Systems
Y1 - May 2010
ER -