Music is widely used for emotion induction because it can change listeners' emotional states. However, since people subjectively feel different emotions when listening to the same music, we propose an emotion induction system that generates music adapted to each individual. Our system automatically generates music suitable for emotion induction based on emotions predicted from an electroencephalogram (EEG). We examined three elements for constructing the system: 1) a music generator that creates music inducing emotions that resemble its inputs, 2) real-time emotion prediction from EEG, and 3) control of the music generator using the predicted emotions so that the generated music is suitable for emotion induction. We constructed the proposed system from these elements and evaluated it. The results showed its effectiveness for inducing emotions and suggest that feedback loops that tailor stimuli to individuals can successfully induce emotions.
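The closed loop described in the abstract (EEG → emotion prediction → music-parameter control → generated music) can be sketched as follows. This is a minimal, hypothetical illustration: the function names, the valence–arousal mapping, and the tempo/mode heuristic are assumptions for clarity, not the authors' implementation.

```python
# Hypothetical sketch of one iteration of the feedback loop:
# an EEG feature window is mapped to a predicted emotion, and the
# gap to the target emotion drives the music generator's parameters.

def predict_emotion(eeg_window):
    """Map an EEG feature window to (valence, arousal) in [-1, 1]^2.

    Stand-in for a trained regressor; here, a trivial feature average.
    """
    v = sum(eeg_window) / len(eeg_window)
    clamp = lambda x: max(-1.0, min(1.0, x))
    return clamp(v), clamp(v / 2)

def music_parameters(target, predicted):
    """Nudge generator parameters toward the target emotion."""
    t_val, t_aro = target
    p_val, p_aro = predicted
    # Tempo follows the remaining arousal gap; mode follows valence.
    tempo = 100 + 40 * (t_aro - p_aro)  # BPM
    mode = "major" if t_val - p_val >= 0 else "minor"
    return {"tempo": round(tempo), "mode": mode}

# One loop iteration with a mock EEG window.
eeg_window = [0.2, -0.1, 0.3, 0.0]
params = music_parameters(target=(0.8, 0.6),
                          predicted=predict_emotion(eeg_window))
print(params)  # → {'tempo': 122, 'mode': 'major'}
```

In the actual system, this loop would run online: each newly generated music segment changes the listener's EEG, which updates the prediction on the next iteration.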
Kana MIYAMOTO
Nara Institute of Science and Technology; RIKEN Center for Advanced Intelligence Project AIP
Hiroki TANAKA
Nara Institute of Science and Technology; RIKEN Center for Advanced Intelligence Project AIP
Satoshi NAKAMURA
Nara Institute of Science and Technology; RIKEN Center for Advanced Intelligence Project AIP
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Kana MIYAMOTO, Hiroki TANAKA, Satoshi NAKAMURA, "Online EEG-Based Emotion Prediction and Music Generation for Inducing Affective States" in IEICE TRANSACTIONS on Information,
vol. E105-D, no. 5, pp. 1050-1063, May 2022, doi: 10.1587/transinf.2021EDP7171.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021EDP7171/_p
@ARTICLE{e105-d_5_1050,
  author={Kana MIYAMOTO and Hiroki TANAKA and Satoshi NAKAMURA},
  journal={IEICE TRANSACTIONS on Information},
  title={Online EEG-Based Emotion Prediction and Music Generation for Inducing Affective States},
  year={2022},
  volume={E105-D},
  number={5},
  pages={1050-1063},
  doi={10.1587/transinf.2021EDP7171},
  ISSN={1745-1361},
  month={May},
}
TY  - JOUR
TI  - Online EEG-Based Emotion Prediction and Music Generation for Inducing Affective States
T2  - IEICE TRANSACTIONS on Information
SP  - 1050
EP  - 1063
AU  - Kana MIYAMOTO
AU  - Hiroki TANAKA
AU  - Satoshi NAKAMURA
PY  - 2022
DO  - 10.1587/transinf.2021EDP7171
JO  - IEICE TRANSACTIONS on Information
SN  - 1745-1361
VL  - E105-D
IS  - 5
Y1  - May 2022
ER  -