Since my last post about our study, the Cognitive Development and Touchscreen Interaction project has gone through several rounds of participant recruiting and study sessions. We have recruited our participants from many interested local families with children aged 4 to 7 years old. In the meantime, I have been working on some high-level analysis of the data that we have collected.
In my last post, I mentioned that we often receive questions regarding the relationship between children’s cognitive development and their touchscreen interaction. To investigate whether such a relationship exists, we decided to administer tasks from the NIH Toolbox to provide an assessment metric beyond age. We employed two tasks: (1) the Dimensional Change Card Sort, to assess our participants’ cognitive flexibility as they complete repeated sorting tasks, and (2) the 9-Hole Pegboard Dexterity Test, to assess our participants’ fine motor skills. Both completion accuracy and time are taken into account, and four different types of scores are returned based on the participant’s demographic information: the raw score, the uncorrected score, the age-corrected score, and the fully corrected score. The raw score reflects only the participant’s performance in terms of completion time or accuracy, while the uncorrected score converts the raw score into a comparable, normally distributed standard score. The age-corrected score evaluates the participant’s performance relative to peers of the same age, and the fully corrected score additionally accounts for other basic demographic information (e.g., education level, ethnicity, and race). We will link participants’ cognitive development scores back to their touchscreen interactions once we are able to come to a conclusion with more confidence.
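To make the distinction between score types concrete, here is a minimal sketch of how a raw score could be standardized against a normative distribution. The function names and the normative mean/SD values are hypothetical placeholders; the actual NIH Toolbox computes its scores against a national normative sample, not values we supply ourselves.

```python
def uncorrected_score(raw, norm_mean, norm_sd, scale_mean=100, scale_sd=15):
    # Convert a raw score into a comparable, normally distributed
    # standard score (NIH Toolbox standard scores use mean 100, SD 15).
    z = (raw - norm_mean) / norm_sd
    return scale_mean + scale_sd * z

def age_corrected_score(raw, age_band_norms, age):
    # Same conversion, but against norms for the participant's own age band.
    # `age_band_norms` is a hypothetical lookup mapping age -> (mean, sd).
    mean, sd = age_band_norms[age]
    return uncorrected_score(raw, mean, sd)
```

The fully corrected score works the same way in spirit, except the normative comparison group is matched on the full set of demographic variables rather than age alone.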
To study the participants’ touchscreen interactions, we used apps like the ones in our lab’s published papers [1,2] to measure the participants’ gesture and target interactions. We also wanted to calculate some of the simple features mentioned in our lab’s previous publication [3]. Simple features of a gesture include the number of strokes in the gesture, the total path length of the gesture, the line similarity, the total angle, the sharpness of the gesture, and other geometric and temporal measurements. Employing these features and looking at how each gesture is structured may help us understand how a participant’s cognitive development level links to their gesture behaviors.
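As a rough illustration of what these geometric features look like when computed from logged touch points, here is a simplified sketch for a single stroke, given as a list of (x, y) samples. These are my own simplified formulations for illustration, not the exact definitions used in the lab’s feature set; the stroke count itself is simply the number of touch-down/touch-up sequences in the gesture.

```python
import math

def path_length(points):
    # Total Euclidean distance traveled along the stroke.
    return sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))

def total_angle_and_sharpness(points):
    # Turning angle at each interior point: total angle sums the absolute
    # angles, sharpness sums their squares (so sharp turns dominate).
    total, sharp = 0.0, 0.0
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # Wrap the angle difference into [-pi, pi].
        d = math.atan2(math.sin(a2 - a1), math.cos(a2 - a1))
        total += abs(d)
        sharp += d * d
    return total, sharp

def line_similarity(points):
    # Ratio of straight-line endpoint distance to path length:
    # 1.0 for a perfectly straight stroke, smaller for curvier ones.
    pl = path_length(points)
    return math.dist(points[0], points[-1]) / pl if pl > 0 else 1.0
```

For example, a perfectly straight stroke yields a line similarity of 1.0 and a total angle of 0, while a stroke with a right-angle bend contributes π/2 to the total angle.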
It has been a little over a year since I joined the lab and started working on this project. At first, I conducted background research and ramped up during the early stages of my research. Now, I am more comfortable and confident in running user studies than before and can better guide myself through challenges. At this point, we are continuing to receive interest from faculty with children at the University of Florida in participating in our study. Once we have reached our target number of participants, we will perform a more detailed data analysis. Our team is excited to see the outcome of this study.
[1] Julia Woodward, Alex Shaw, Annie Luc, Brittany Craig, Juthika Das, Phillip Hall, Akshay Hollay, Germaine Irwin, Danielle Sikich, Quincy Brown, and Lisa Anthony. 2016. Characterizing How Interface Complexity Affects Children’s Touchscreen Interactions. Proceedings of the ACM International Conference on Human Factors in Computing Systems (CHI ’16), ACM Press, 1921–1933.
[2] Lisa Anthony, Quincy Brown, Jaye Nias, Berthel Tate, and Shreya Mohan. 2012. Interaction and Recognition Challenges in Interpreting Children’s Touch and Gesture Input on Mobile Devices. Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (ITS ’12), ACM Press, 225–234. https://doi.org/10.1145/2396636.2396671.
[3] Alex Shaw and Lisa Anthony. 2016. Analyzing the Articulation Features of Children’s Touchscreen Gestures. Proceedings of the ACM International Conference on Multimodal Interaction (ICMI ’16), ACM Press, 333–340. https://doi.org/10.1145/2993148.2993179.