Analyzing Kansei from Facial Expressions by CSRBF Mapping

Luis Diago
InterLocus Inc, Japan/Meiji University, Japan

Julian Romero
InterLocus Inc, Japan/Meiji University, Japan

Junichi Shinoda
InterLocus Inc, Japan

Ichiro Hagiwara
Meiji University, Japan


Published in: KEER2014. Proceedings of the 5th Kansei Engineering and Emotion Research International Conference; Linköping, Sweden; June 11-13

Linköping Electronic Conference Proceedings 100:73, s. 877-885


Published: 2014-06-11

ISBN: 978-91-7519-276-5

ISSN: 1650-3686 (print), 1650-3740 (online)


This paper describes an application in which a new Kansei/Affective Engineering (KAE) system is applied to define the properties of facial images perceived as Iyashi. Iyashi is a Japanese word describing a peculiar phenomenon that is mentally soothing but is yet to be clearly defined. Instead of analyzing an individual's facial expressions to determine their emotional state, the proposed system introduces a fuzzy-quantized holographic neural network (FQHNN) to find the rules underlying the Kansei evaluations provided by subjects on a limited dataset of 20 facial images.

To validate and gain clear insight into the rules involved in the Kansei evaluation process, Procrustes analysis and Compactly-Supported Radial Basis Functions (CSRBFs) are combined to generate new facial images. Procrustes analysis is used to find the minimal dissimilarity between two facial images with opposite classifications (i.e., Iyashi and Non-Iyashi). CSRBFs are proposed for tuning 17 facial parameters and for mapping between facial images of the opposite classes. Experiments with two subjects demonstrate that if only two of the five most important facial parameters are changed, the Kansei evaluation can flip to the opposite class. This paper shows that continuous and efficient tuning of the design space can be achieved by introducing CSRBF mapping into the new KAE system.
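The combination described above can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' implementation): an orthogonal Procrustes alignment of two landmark configurations, followed by a displacement warp built from Wendland's C2 compactly supported RBF. The landmark data, the support radius, and the use of 2-D points standing in for the 17 facial parameters are all assumptions made for the sketch.

```python
import numpy as np

def procrustes_align(A, B):
    """Rotate, translate, and scale B onto A; return aligned B and the
    residual dissimilarity (sum of squared distances)."""
    muA, muB = A.mean(0), B.mean(0)
    A0, B0 = A - muA, B - muB
    sA = np.sqrt((A0 ** 2).sum())
    sB = np.sqrt((B0 ** 2).sum())
    A0, B0 = A0 / sA, B0 / sB
    U, s, Vt = np.linalg.svd(A0.T @ B0)   # optimal rotation via SVD
    R = (U @ Vt).T
    scale = s.sum() * sA / sB
    B_aligned = scale * (B - muB) @ R + muA
    d = ((A - B_aligned) ** 2).sum()
    return B_aligned, d

def wendland_c2(r):
    """Wendland (1995) C2 CSRBF: (1 - r)^4 (4r + 1) for r < 1, else 0."""
    return np.where(r < 1.0, (1 - r) ** 4 * (4 * r + 1), 0.0)

def csrbf_warp(src, dst, support=1.0):
    """Fit a CSRBF displacement field that maps src landmarks onto dst."""
    r = np.linalg.norm(src[:, None] - src[None], axis=-1) / support
    W = np.linalg.solve(wendland_c2(r), dst - src)  # one weight row per center
    def warp(pts):
        r = np.linalg.norm(pts[:, None] - src[None], axis=-1) / support
        return pts + wendland_c2(r) @ W
    return warp

# Toy example: 17 2-D "facial parameters" for two hypothetical faces.
rng = np.random.default_rng(0)
iyashi = rng.random((17, 2))
non_iyashi = iyashi + 0.05 * rng.standard_normal((17, 2))

aligned, d = procrustes_align(iyashi, non_iyashi)
warp = csrbf_warp(aligned, iyashi, support=0.5)
print(np.allclose(warp(aligned), iyashi, atol=1e-6))  # → True
```

Because the Wendland kernel is compactly supported, the interpolation matrix is sparse and positive definite, so the warp interpolates the landmarks exactly while leaving points outside the support radius untouched; this is what permits the continuous, localized tuning of individual facial parameters described above.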


Keywords: Kansei evaluation; Iyashi expressions; neuro-fuzzy classifiers; radial basis functions


