Emotion is a dynamic variable that modulates how we perceive, reason about, and interact with our environment. Recent studies have established that emotion’s influence extends to data analysis and visualization, affecting performance in ways both positive and negative. While our understanding of the role emotion plays in analytical contexts is still in its infancy, advances in physiological sensing and emotion research have raised the possibility of creating emotion-aware systems. In this position paper, we argue that it is critical to consider the advances that can be made even with imperfect sensing, while we continue to address the practical challenges of monitoring emotion in the wild. To underscore the importance of this line of inquiry, we highlight several key challenges related to the detection of, adaptation to, and impact of emotional states in users of data visualization systems, and motivate promising avenues for future research in these areas.
Keywords: Emotion, affect, visualization, adaptation, theory.
Proceedings of EmoVis 2016, ACM IUI 2016 Workshop on Emotion and Visualization, Sonoma, CA, USA, March 10, 2016