Alex Shaw, PhD in Computer Engineering, TBD

Alex Shaw is a member of the INIT Lab and is currently working on the $-Family of Gesture Recognizers project. These recognizers are designed to be lightweight and easy to implement, supporting UI prototypes on touchscreens. His work focuses on improving these recognition algorithms to achieve higher accuracy on gestures produced by children. Alex's research interests include computer science education, artificial intelligence, intelligent systems, and human-computer interaction.

Information

Email: alexshaw@ufl.edu
Position: Graduate Research Assistant
Projects: $-Family of Gesture Recognizers

Papers

Shaw, A. and Anthony, L. 2016. Analyzing the articulation features of children’s touchscreen gestures. Proceedings of the ACM International Conference on Multimodal Interaction (ICMI’2016), Tokyo, Japan. [PDF]

Jain, E., Anthony, L., Aloba, A., Castonguay, A., Cuba, I., Shaw, A., and Woodward, J. 2016. Is the motion of a child perceivably different from the motion of an adult? ACM Transactions on Applied Perception, Volume 13, Issue 4, Article 22, July 2016. [PDF]

Shaw, A. and Anthony, L. 2016. Toward a Systematic Understanding of Children’s Touchscreen Gestures. Extended Abstracts of the ACM Conference on Human Factors in Computing Systems (CHI’2016), San Jose, CA, 7 May 2016, pp. 1752-1759. [PDF and Poster]

Woodward, J., Shaw, A., Luc, A., Craig, B., Das, J., Hall Jr., P., Holla, A., Irwin, G., Sikich, D., Brown, Q., and Anthony, L. 2016. Characterizing How Interface Complexity Affects Children’s Touchscreen Interactions. ACM Conference on Human Factors in Computing Systems (CHI’2016), San Jose, CA, 7 May 2016, pp. 1921-1933. [PDF]

Blogposts

Understanding Gestures Project: Creating an annotation tool to calculate new features

Over the past months, I have continued my work on the Understanding Gestures project by developing a set of new articulation features based on how children make touchscreen gestures. Our prior work has shown that children’s gestures are not recognized as well as adults’ gestures, which led us […]


Alex Passes Dissertation Proposal

On May 29, I completed and passed my PhD dissertation proposal defense. The proposal defense process can vary widely among institutions and even among departments in the same institution, so in this post I outline the process I followed in the CISE department at UF. The first step I followed […]


Understanding Gestures Project: Paper on human recognition of children’s gestures accepted to ICMI!

In a previous post, we discussed our ongoing work on studying children’s gestures. To get a better idea of the target accuracy for continuing work in gesture recognition, we ran a study comparing human ability to recognize children’s gestures to machine recognition. Our paper, “Comparing Human and Machine Recognition of […]


Alex gets a paper accepted to ICMI DC!

In previous posts, we have discussed our ongoing work on improving recognition of children’s touchscreen gestures. My paper, “Human-Centered Recognition of Children’s Touchscreen Gestures”, was accepted to ICMI 2017’s Doctoral Consortium! The paper focused on my future research plans as I continue to work on my doctorate. Here is the abstract: […]


Understanding Gestures Project: Human recognition of children’s gestures

We are currently continuing our work in gesture recognition by studying how well humans can recognize children’s gestures. We will compare human recognition rates to the rates of the automated recognition algorithms we used in our previous work. This will help us get an idea of how well humans are able to […]


Understanding Gestures Project: ICMI paper accepted!

In a previous post, we discussed our ongoing work on studying children’s gestures. We analyzed 22 different articulation features in a corpus of children’s and adults’ gestures, and we are pleased to announce that this work has been accepted for publication at the 2016 ACM International Conference on Multimodal Interaction (ICMI). Our […]


Understanding Gestures Project: Late Breaking Work on children’s gestures accepted to CHI!

One of the projects our lab has been working on is a qualitative analysis of the children’s gesture data from our MTAGIC project. We submitted an extended abstract about this work to CHI and it was accepted! In the extended abstract, we detail the tools we have applied thus […]


MTAGIC Project: Gesture analysis

In our last post we discussed how we were working on replicating analyses from previous studies. We have completed these analyses and written and submitted a paper on our findings. When our paper is accepted, we will post the abstract and announce our findings! Since then, we’ve begun exploring the […]


Understanding Gestures Project: Gesture recognition experiment suite

In my last post, I discussed some work on the $-family of recognizers. Since then, I’ve been building a standalone application for running gesture recognition experiments using the $-family. The application will also let users define custom recognizers. I’ll also be using the $-family to run […]


Understanding Gestures Project: Collecting and analyzing gestures from children

The $-family of recognizers is a set of lightweight, easy-to-implement gesture recognizers that allow for quick development of 2-D gesture-based interfaces. These algorithms are short (fewer than 100 lines of code each), allowing developers to incorporate them easily into new projects. These algorithms currently achieve 98-99% accuracy for recognizing gestures made […]
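To give a sense of how compact these recognizers are, here is a minimal Python sketch of the core pipeline used by the original $1 recognizer (Wobbrock et al.): resample each stroke to a fixed number of points, rotate so the indicative angle is zero, scale to a reference square, translate the centroid to the origin, then match against stored templates by average point-to-point distance. This is an illustrative sketch, not the lab's actual code; the constants and function names are my own, and it omits refinements such as the golden-section rotation search.

```python
import math

N = 64        # resample count used in the original $1 paper
SIZE = 250.0  # side of the reference square (any constant works)

def resample(points, n=N):
    """Resample a stroke into n equidistantly spaced points."""
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1)
    pts, new_pts, D, i = list(points), [points[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if D + d >= interval:
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_pts.append(q)
            pts.insert(i, q)  # the new point becomes the next segment start
            D = 0.0
        else:
            D += d
        i += 1
    while len(new_pts) < n:   # float rounding can leave us one point short
        new_pts.append(pts[-1])
    return new_pts[:n]

def centroid(points):
    return (sum(x for x, _ in points) / len(points),
            sum(y for _, y in points) / len(points))

def rotate_by(points, angle):
    """Rotate all points around the centroid by the given angle."""
    cx, cy = centroid(points)
    c, s = math.cos(angle), math.sin(angle)
    return [((x - cx) * c - (y - cy) * s + cx,
             (x - cx) * s + (y - cy) * c + cy) for x, y in points]

def normalize(points):
    """Resample, zero the indicative angle, scale to a square, center at origin."""
    pts = resample(points)
    cx, cy = centroid(pts)
    pts = rotate_by(pts, -math.atan2(cy - pts[0][1], cx - pts[0][0]))
    xs, ys = [x for x, _ in pts], [y for _, y in pts]
    w = (max(xs) - min(xs)) or 1.0  # guard degenerate strokes
    h = (max(ys) - min(ys)) or 1.0
    pts = [(x * SIZE / w, y * SIZE / h) for x, y in pts]
    cx, cy = centroid(pts)
    return [(x - cx, y - cy) for x, y in pts]

def recognize(candidate, templates):
    """templates: dict name -> normalized points. Returns (best name, score)."""
    cand = normalize(candidate)
    best_name, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = sum(math.dist(p, q) for p, q in zip(cand, tmpl)) / len(cand)
        if d < best_d:
            best_name, best_d = name, d
    score = 1.0 - best_d / (0.5 * math.sqrt(2) * SIZE)  # 1.0 = perfect match
    return best_name, score
```

Because the whole recognizer is just resampling plus a handful of affine normalizations and a nearest-template search, it fits comfortably under 100 lines, which is exactly what makes the family attractive for quick UI prototyping.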
