Automatic Recognition of Children’s Touchscreen Stroke Gestures

Shaw, A. 2020. Automatic Recognition of Children’s Touchscreen Stroke Gestures. Ph.D. thesis, Department of Computer and Information Science and Engineering, University of Florida, March 2020. [PDF]


“Children are increasingly using touchscreen applications in many different contexts. Smartphones, tablets, touchscreen computers, and other touchscreen devices have become commonplace, and many applications developed on these devices are designed for children. A common form of interaction in applications is gesture input, but little work has analyzed how children make gestures and how well automatic recognition algorithms are able to recognize them. In this dissertation, we begin by analyzing the ability of existing recognition algorithms to recognize children’s touchscreen gestures.

Our findings show that recognition algorithms uniformly perform poorly at recognizing children’s gestures. As a benchmark for future work analyzing recognition accuracy, we examined human ability to recognize children’s gestures and found that human accuracy was significantly better than machine accuracy, indicating potential for improvement in future recognition algorithms. To better understand why children’s gestures were recognized less accurately than adults’ gestures, we then analyzed the gestures through the lens of articulation features.

We first analyzed children’s gestures based on a set of 22 articulation features that had been developed to characterize adults’ gestures. We found a significant effect of age on the values of 18 of the 22 features. Our results showed that children were highly inconsistent in their gesturing patterns when compared to adults.

In our work on articulation features, we noticed that the features had been designed with adults’ well-formed gestures in mind. Children’s gestures, however, are often not well-formed, so these features fail to capture some of the common articulation mistakes children make. We therefore developed a set of six new articulation features specifically designed to capture common patterns we observed in a set of children’s gestures.

Based on our findings in our studies of recognition rates and articulation features, we offer a set of guidelines for developers creating gesture-based applications for children. Finally, we lay out several potential avenues of future work based on the findings of our recognition and feature studies.”

This work is partially supported by National Science Foundation Grant Awards IIS-1218395, IIS-1433228, and IIS-1552598. Opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.