Citation:
Soni, N., Darrow, A., Luc, A., Gleaves, S., Schuman, C., Neff, H., Chang, P., Kirkland, B., Alexandre, J., Morales, A., Stofer, K.A., Anthony, L. Affording Embodied Cognition through Touchscreen and Above-the-Surface Gestures During Collaborative Tabletop Science Learning. International Journal of Computer-Supported Collaborative Learning (IJCSCL), Volume 16, 2021, pp. 105–144. https://link.springer.com/article/10.1007/s11412-021-09341-x [paper link]
Abstract:
This paper draws upon the theory of embodied cognition to provide a robust account of how gestural interactions with and around multi-touch tabletops can play an important role in facilitating collaborative meaning-making, particularly in the context of science data visualizations. Embodied cognition is a theory of learning that holds that thinking and perception are shaped by interactions with the physical environment. Previous research has used embodied cognition as a theoretical framework to inform the design of large touchscreen learning applications, such as those for multi-touch tabletops. However, this prior work has primarily assumed that learning occurs during any motion or interaction, without considering how specific interactions may be linked to particular instances of collaborative learning supported by embodiment. We investigate this question in the context of collaborative learning from data visualizations of global phenomena such as ocean temperatures. We followed a user-centered, iterative design approach to build a tabletop prototype that facilitated collaborative meaning-making and used this prototype as a testbed in a laboratory study with 11 family groups. We qualitatively analyzed learner groups’ co-occurring utterances and gestures to identify the gestural interactions groups used when their utterances signaled the occurrence of embodiment during collaborative meaning-making. Our findings present an analysis of both touchscreen and above-the-surface gestural interactions that were associated with instances of embodied cognition. We identified four types of gestural interactions that promote scientific discussion and collaborative meaning-making through embodied cognition: (T1) gestures for orienting the group; (T2) cooperative gestures for facilitating group meaning-making; (T3) individual intentional gestures for facilitating group meaning-making; and (T4) gestures for articulating conceptual understanding to the group. Our work illustrates interaction design opportunities for affording embodied cognition and will inform the design of future interactive tabletop experiences in the domain of science learning.
Files: