Excitement, or arousal, is one of the main emotional dimensions that affect our lives on a daily basis. We win a tennis match, watch a great movie, or get into an argument with a colleague: all of these are moments when most of us experience excitement, yet we rarely pay attention to it. Today, few systems capture our excitement levels, and even fewer actively promote awareness of our most exciting moments. In this paper, we propose a visualization concept for representing individual and group-level excitement to support emotional self-awareness and group-level awareness. The data for the visualization is obtained from smart wristbands worn by each user. The visualization uses animated glyphs to generate a real-time representation of each individual's excitement level. We introduce two encodings for these glyphs: one that captures both the current excitement and the excitement history, and one that shows only real-time values and previous peaks. Excitement levels are computed from measurements of the user's galvanic skin response and accelerometer data from the wristbands, allowing them to be classified as experienced excitement (excitement without physical manifestation) or manifested excitement. A dynamic clustering of the individual glyphs supports the scalability of our visualization while offering an overview of the group-level excitement and its distribution. The results of a preliminary evaluation suggest that the visualization allows users to intuitively and accurately perceive both individual and group-level excitement.
Keywords: Excitement visualization; emotion visualization; group excitement; personal visualization; galvanic skin response.
Proceedings of EmoVis 2016, ACM IUI 2016 Workshop on Emotion and Visualization, Sonoma, CA, USA, March 10, 2016
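The distinction the abstract draws between experienced excitement (arousal without physical manifestation) and manifested excitement (arousal accompanied by movement) can be sketched as a simple rule over the two wristband signals. This is a minimal illustration only: the thresholds, the normalization, and the `WristbandSample` type are hypothetical, and the paper's actual signal-processing pipeline is not specified here.

```python
from dataclasses import dataclass

# Hypothetical, illustrative thresholds on normalized signals;
# the paper does not state its actual values or preprocessing.
GSR_AROUSAL_THRESHOLD = 0.6   # skin-conductance level above which we assume arousal
MOTION_THRESHOLD = 0.3        # accelerometer activity above which arousal is "manifested"

@dataclass
class WristbandSample:
    gsr: float     # galvanic skin response, normalized to [0, 1]
    motion: float  # accelerometer activity magnitude, normalized to [0, 1]

def classify_excitement(sample: WristbandSample) -> str:
    """Classify one sample as 'none', 'experienced', or 'manifested' excitement."""
    if sample.gsr < GSR_AROUSAL_THRESHOLD:
        return "none"
    # High arousal with physical movement counts as manifested excitement;
    # high arousal without movement is merely experienced.
    if sample.motion >= MOTION_THRESHOLD:
        return "manifested"
    return "experienced"
```

Under this sketch, a high-GSR sample with low motion (e.g. tense concentration) would be labeled experienced, while the same arousal level with visible movement (e.g. cheering) would be labeled manifested.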