Analysis of Touchscreen Interactive Gestures During Embodied Cognition in Collaborative Tabletop Science Learning Experiences

Citation:

Soni, N., Darrow, A., Luc, A., Gleaves, S., Schuman, C., Neff, H., Chang, P., Kirkland, B., Alexandre, J., Morales, A., Stofer, K.A., and Anthony, L. 2019. Analysis of Touchscreen Interactive Gestures During Embodied Cognition in Collaborative Tabletop Science Learning Experiences. Proceedings of the International Conference on Computer Supported Collaborative Learning (CSCL 2019), Volume 1, Lyon, France, June 17-21, pages 9-16. [pdf]

Abstract:

“Previous work has used embodied cognition as a theoretical framework to inform the design of large touchscreen interfaces for learning. We seek to understand how specific gestural interactions may be tied to particular instances of learning supported by embodiment. To help us investigate this question, we built a tabletop prototype that facilitates collaborative science learning from data visualizations and used this prototype as a testbed in a laboratory study with 11 family groups. We present an analysis of the types of gestural interactions that accompanied embodied cognition (as revealed by users’ language) while learners interacted with our prototype. Our preliminary findings indicate a positive role of cooperative (multiuser) gestures in supporting scientific discussion and collaborative meaning-making during embodied cognition. Our next steps are to continue our analysis to identify additional touchscreen interaction design guidelines for learning technologies, so that designers can capitalize on the affordances of embodied cognition in these contexts.”

File attachments:

Soni-et-al-CSCL2019