alexshaw's blog

Paper on Human Recognition of Children's Gestures Accepted to ICMI!

In a previous post, we discussed our ongoing work on studying children's gestures. To establish a target accuracy for continued work in gesture recognition, we ran a study comparing human ability to recognize children's gestures against machine recognition. Our paper, "Comparing Human and Machine Recognition of Children's Touchscreen Gestures", quantifies how well children's gestures were recognized by human viewers and by an automated recognition algorithm.

Alex gets a paper accepted to ICMI DC!

In previous posts, we have discussed our ongoing work on improving recognition of children’s touchscreen gestures. My paper, “Human-Centered Recognition of Children’s Touchscreen Gestures”, was accepted to ICMI 2017’s Doctoral Consortium! The paper focuses on my future research plans as I continue to work on my doctorate.

Update: Understanding Gestures Project

We are continuing our work in gesture recognition by studying how well humans can recognize children’s gestures. We will compare human recognition rates to those of the automated recognition algorithms from our previous work. This comparison will give us a realistic target accuracy for our future work on improving automated recognition of children’s gestures.

Understanding Gestures Project: ICMI Paper Accepted!

In a previous post, we discussed our ongoing work on studying children's gestures. We studied a corpus of children's and adults' gestures and analyzed 22 different articulation features, and we are pleased to announce that this work has been accepted for publication at the 2016 ACM International Conference on Multimodal Interaction (ICMI).
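The post doesn't list the 22 features, but articulation features of this kind are typically simple geometric and temporal measurements over a stroke. The sketch below computes a few generic examples (path length, duration, average speed); these are illustrative only and not necessarily the features analyzed in the paper.

```python
import math

def articulation_features(stroke):
    """Compute a few illustrative articulation features for one stroke.

    `stroke` is a list of (x, y, t) touch samples. These are generic
    example features, not necessarily those used in the paper.
    """
    # Total distance the finger traveled along the stroke.
    path_length = sum(
        math.dist((x1, y1), (x2, y2))
        for (x1, y1, _), (x2, y2, _) in zip(stroke, stroke[1:])
    )
    # Elapsed time from first to last touch sample.
    duration = stroke[-1][2] - stroke[0][2]
    return {
        "path_length": path_length,
        "duration": duration,
        "avg_speed": path_length / duration if duration else 0.0,
    }
```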

Late Breaking Work on Children's Gestures Accepted to CHI!

One of our lab's ongoing projects is a qualitative analysis of the children's gesture data from our MTAGIC project. We submitted an extended abstract about this work to CHI, and it was accepted! In the extended abstract, we detail the tools we have applied so far to study the children's gestures, summarize our findings, and outline our plans for continued work in this area. We will present the work as a poster at the conference.

MTAGIC Update: Gesture Analysis

In our last post, we discussed our work on replicating analyses from previous studies. We have since completed these analyses and submitted a paper on our findings; once the paper is accepted, we will post the abstract and announce the results! We've also begun exploring the data in greater detail, looking at how gesture samples differ across age groups and how other factors, such as handedness, affect gesture articulation.

$-Family Project Update

In my last post I discussed some work on the $-family of recognizers. Since then, I’ve been working on designing a standalone application for running gesture recognition experiments using the $-family. The application will also allow the user to design custom recognizers. I’ll also be using the $-family to run recognition experiments on the data we have collected in our MTAGIC project.

$-Family of Recognizers Project Update

The $-family of recognizers are lightweight, easy-to-implement gesture recognizers that enable quick development of 2-D gesture-based interfaces. Each algorithm is short (under 100 lines of code), making it easy for developers to incorporate into new projects. These algorithms currently achieve 98-99% accuracy on gestures made by adults, but only about 84% on gestures made by children. Thus, we are working on extending these algorithms to achieve better recognition of children's gestures.
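To illustrate why the $-family is so easy to incorporate, here is a minimal Python sketch in the spirit of the $1 recognizer: resample each stroke to a fixed number of points, normalize for position and scale, and pick the template with the smallest average point-to-point distance. This is a simplified sketch, not the published algorithm (it omits rotation normalization and the golden-section search), and all function names here are my own.

```python
import math

def resample(points, n=64):
    """Resample a stroke to n roughly equidistant points."""
    pts = list(points)
    path_len = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    interval = path_len / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # floating-point rounding can leave us short
        out.append(pts[-1])
    return out[:n]

def normalize(points, n=64):
    """Resample, translate the centroid to the origin, scale to a unit box."""
    pts = resample(points, n)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    pts = [(x - cx, y - cy) for x, y in pts]
    w = max(x for x, _ in pts) - min(x for x, _ in pts)
    h = max(y for _, y in pts) - min(y for _, y in pts)
    s = max(w, h) or 1.0
    return [(x / s, y / s) for x, y in pts]

def recognize(candidate, templates):
    """Return the template name with the smallest average point distance."""
    c = normalize(candidate)
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        t = normalize(tmpl)
        d = sum(math.dist(a, b) for a, b in zip(c, t)) / len(c)
        if d < best_d:
            best, best_d = name, d
    return best
```

Because the whole pipeline is a few short, dependency-free functions, dropping a recognizer like this into a new project is trivial, which is much of the $-family's appeal.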
