Category: Understanding Gestures
Understanding Gestures Project: Creating an annotation tool to calculate new features
February 27, 2019
Over the past months, I have continued my work on the Understanding Gestures project by developing a set of new articulation features based on how children make touchscreen gestures. Our prior work has shown that children’s gestures are not recognized as well as adults’ gestures, which led us to perform further investigation on […]
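The excerpt does not name the new features, but articulation features of this kind are typically simple measurements computed over the (x, y, t) samples of a stroke. Below is a minimal sketch of three classic examples (path length, duration, average speed); the feature choices and the point format are illustrative assumptions on my part, not the lab’s actual feature set:

```python
import math

def path_length(points):
    """Total Euclidean distance traveled along the stroke (x, y only)."""
    return sum(math.dist(a[:2], b[:2]) for a, b in zip(points, points[1:]))

def duration(points):
    """Elapsed time from first to last sample (timestamps assumed in ms)."""
    return points[-1][2] - points[0][2]

def average_speed(points):
    """Path length divided by duration; 0 for a degenerate stroke."""
    t = duration(points)
    return path_length(points) / t if t else 0.0

# A toy gesture: (x, y, t_ms) samples of a short stroke.
stroke = [(0, 0, 0), (3, 4, 50), (6, 8, 100)]
print(path_length(stroke))    # 10.0
print(average_speed(stroke))  # 0.1 (pixels per ms)
```

An annotation tool like the one described would run functions of this shape over every gesture in a corpus and export the results for analysis.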
Understanding Gestures Project: Implementing the $-family gesture recognizers
August 17, 2018
This summer, I have been working on a project related to the $-family of gesture recognizers: a series of simple, fast, and accurate recognizers designed to be accessible to novice programmers. $1 [1] was created by Wobbrock and colleagues, and INIT Lab director Lisa Anthony contributed to later algorithms, including $N […]
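To give a feel for why the $-family is considered accessible to novice programmers, here is a heavily simplified recognizer in the spirit of $1: resample each stroke to a fixed number of points, normalize scale and position, and pick the stored template with the smallest summed point-to-point distance. This is a sketch under my own assumptions, not Wobbrock et al.’s published algorithm — it omits $1’s rotation normalization and golden-section search, and names like N = 32 and the template dictionary are illustrative.

```python
import math

N = 32  # number of resampled points per stroke (illustrative choice)

def path_len(pts):
    """Total Euclidean length of a polyline."""
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def resample(points, n=N):
    """Resample a stroke into n equidistantly spaced points."""
    pts = [tuple(p) for p in points]
    interval = path_len(pts) / (n - 1)
    out, D, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and D + d >= interval:
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # q starts the next segment
            D = 0.0
        else:
            D += d
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points):
    """Resample, scale to a unit box, and move the centroid to the origin."""
    pts = resample(points)
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w, h = (max(xs) - min(xs)) or 1, (max(ys) - min(ys)) or 1
    pts = [(x / w, y / h) for x, y in pts]
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return [(x - cx, y - cy) for x, y in pts]

def recognize(candidate, templates):
    """Name of the template with the smallest summed point-to-point distance."""
    c = normalize(candidate)
    return min(templates, key=lambda name: sum(
        math.dist(a, b) for a, b in zip(c, normalize(templates[name]))))
```

Usage would look like `recognize(stroke, {"line": [(0, 0), (100, 100)], "vee": [(0, 0), (50, 100), (100, 0)]})`, where each template is a list of (x, y) points.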
INIT Lab PhD Student Alex Shaw wins Best Student Paper at ICMI 2017!
November 27, 2017
In our last post, we shared that we had a paper accepted to the ACM International Conference on Multimodal Interaction (ICMI) 2017, to be held in Glasgow, Scotland, UK. The paper was titled “Comparing Human and Machine Recognition of Children’s Touchscreen Gestures.” We just came back from the conference and are proud to announce that […]
Understanding Gestures Project: Paper on human recognition of children’s gestures accepted to ICMI!
October 30, 2017
In a previous post, we discussed our ongoing work on studying children’s gestures. To get a better idea of the target accuracy for continuing work in gesture recognition, we ran a study comparing human ability to recognize children’s gestures to machine recognition. Our paper, “Comparing Human and Machine Recognition of Children’s Touchscreen Gestures”, quantifies how […]
Alex gets a paper accepted to ICMI DC!
August 24, 2017
In previous posts, we have discussed our ongoing work on improving recognition of children’s touchscreen gestures. My paper, “Human-Centered Recognition of Children’s Touchscreen Gestures”, was accepted to ICMI 2017’s Doctoral Consortium! The paper focused on my future research plans as I continue to work on my doctorate. Here is the abstract: Touchscreen gestures are an important […]
Understanding Gestures Project: Human recognition of children’s gestures
June 12, 2017
We are continuing our work in gesture recognition by studying how well humans can recognize children’s gestures. We will compare human recognition rates to those of the automated recognition algorithms we used in our previous work. That way, […]
Understanding Gestures Project: ICMI paper accepted!
September 23, 2016
In a previous post, we discussed our ongoing work on studying children’s gestures. We studied a corpus of children’s and adults’ gestures and analyzed 22 different articulation features, and we are pleased to announce that this work has been accepted for publication at the 2016 ACM International Conference on Multimodal Interaction (ICMI). Our paper, “Analyzing the Articulation Features […]
Understanding Gestures Project: Late Breaking Work on children’s gestures accepted to CHI!
February 25, 2016
One of the projects our lab has been working on is a qualitative analysis of the children’s gesture data from our MTAGIC project. We submitted an extended abstract about this work to CHI, and it was accepted! In the extended abstract, we detail the tools we have applied thus far to study the children’s […]
Understanding Gestures Project: Gesture recognition experiment suite
July 9, 2015
In my last post, I discussed some work on the $-family of recognizers. Since then, I’ve been working on designing a standalone application for running gesture recognition experiments using the $-family. The application will also allow the user to design custom recognizers. I’ll also be using the $-family to run recognition experiments on the data […]
Understanding Gestures Project: Collecting and analyzing gestures from children
March 13, 2015
The $-family of recognizers is a set of lightweight, easy-to-implement gesture recognizers that allow for quick development of 2-D gesture-based interfaces. These algorithms are short (less than 100 lines of code each), allowing for easy incorporation by developers into new projects. They currently achieve 98-99% accuracy for recognizing gestures made by adults, but only about […]