In face-to-face communication, many communicative devices rely on the visual modality of bodily behavior, including facial expressions and hand gestures. With digitally mediated communication becoming increasingly prevalent as technology advances, however, people are changing the way they communicate: texts are becoming shorter, and the use of emojis is evolving. Facial emojis, symbols depicting human faces, have become increasingly popular on communication devices. Their original and still most frequent use is to comment on the text they follow, but a more recent trend is to place emojis in the middle of sentences, replacing words or adding information to the text. Using EEG and the N400 ERP component, this study investigates which objects emojis refer to, by means of an internet survey and an EEG semantic priming test in which emojis moved to different sentence positions are paired with congruous and incongruous probes. The results of both the survey and the EEG test indicate no preference for particular emoji positions and show that some of the more unusual emojis were ambiguous and did not aid comprehension.
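As background for the paradigm, the following is a minimal sketch, in MNE-Python, of how an N400 congruity effect of the kind described above is commonly quantified: the EEG is epoched around probe onset, averaged by condition, and compared as mean amplitude in the 300-500 ms window at a centro-parietal electrode. The file name, trigger codes, and electrode choice are illustrative assumptions, not details of this study's pipeline.

import mne

# Hypothetical raw recording; band-pass filtering typical for ERP work.
raw = mne.io.read_raw_fif("emoji_priming_raw.fif", preload=True)
raw.filter(0.1, 30.0)

# Hypothetical trigger codes marking congruous vs. incongruous probes.
events = mne.find_events(raw)
event_id = {"congruous": 1, "incongruous": 2}

# Epoch from 200 ms before to 800 ms after probe onset, baseline-corrected.
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), preload=True)

# Condition averages (ERPs) time-locked to probe onset.
evoked_cong = epochs["congruous"].average()
evoked_incong = epochs["incongruous"].average()

def n400_mean(evoked, ch="Cz", tmin=0.3, tmax=0.5):
    # Mean amplitude (volts) in the classic N400 window at one electrode.
    return evoked.copy().pick([ch]).crop(tmin, tmax).data.mean()

# A larger negativity for incongruous probes is the expected N400 effect,
# so this difference should be negative if the effect is present.
effect = n400_mean(evoked_incong) - n400_mean(evoked_cong)
print(f"N400 congruity effect at Cz: {effect * 1e6:.2f} uV")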