A Spontaneous Cross-Cultural Emotion Database: Latin-America vs. Japan

Maria Alejandra Quiros-Ramirez
Graduate School of System and Information Engineering, University of Tsukuba, Japan

Senya Polikovsky
Graduate School of System and Information Engineering, University of Tsukuba, Japan

Yoshinari Kameda
Graduate School of System and Information Engineering, University of Tsukuba, Japan

Takehisa Onisawa
Graduate School of System and Information Engineering, University of Tsukuba, Japan


In: KEER2014. Proceedings of the 5th Kansei Engineering and Emotion Research International Conference, Linköping, Sweden, June 11-13

Linköping Electronic Conference Proceedings 100:94, pp. 1127-1134


Published: 2014-06-11

ISBN: 978-91-7519-276-5

ISSN: 1650-3686 (print), 1650-3740 (online)


In this paper, we present a new database to support cross-cultural emotion studies. Two cultural groups are selected, Latin America and Japan, to represent Western and Eastern cultures. Emotions are elicited through an experiment in which participants observe emotionally loaded stimuli and then rate their feelings on valence (how positive or negative the experienced emotion is) and arousal (how intense the emotion is) scales. The interactions are recorded with audiovisual and thermal devices. The database features three innovative characteristics: spontaneous emotion expressions, multiple synchronized sources of interaction, and support for cross-cultural comparison. This combination of characteristics is missing from currently available emotion databases, making ours a unique open option for studying the spontaneous expression of emotions in a cross-cultural context.
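As an illustration of the valence-arousal self-rating scheme described above, one annotation record could be modeled as follows. This is a minimal sketch only; the class and field names, and the numeric ranges of the scales, are assumptions for illustration and are not taken from the paper's actual database schema.

```python
from dataclasses import dataclass

@dataclass
class EmotionRating:
    """One participant's self-rated response to one stimulus (hypothetical schema)."""
    participant_id: str
    culture: str        # e.g. "Latin America" or "Japan"
    stimulus_id: str
    valence: float      # how positive or negative the emotion is; assumed range -1.0 .. 1.0
    arousal: float      # how intense the emotion is; assumed range 0.0 .. 1.0

    def __post_init__(self):
        # Reject ratings outside the assumed scale bounds.
        if not -1.0 <= self.valence <= 1.0:
            raise ValueError("valence out of assumed range [-1, 1]")
        if not 0.0 <= self.arousal <= 1.0:
            raise ValueError("arousal out of assumed range [0, 1]")

# Example: a strongly positive, highly aroused response.
rating = EmotionRating("P01", "Japan", "S12", valence=0.6, arousal=0.8)
```

Keeping the rating bounds in one place like this makes it easy to pool records from both cultural groups on a common scale before comparison.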


Keywords: cultural specificity; universality; multimodal corpus; affect


