Natural Multimodal Authentication for Smart Environments Project: MMGatorAuth Dataset

If you are interested in acquiring the full MMGatorAuth multimodal authentication dataset, please note that the dataset is nearly 1 TB in size due to the large video files from the Kinect sensor. Interested researchers should be prepared for a lengthy download, or they can ship a USB stick or hard drive to the University of Florida, which we will ship back once the dataset has been copied to the media. To get started, please email the project principal investigator, Dr. Lisa Anthony, for further instructions.

Quantitative Methods in Child-Computer Interaction

Dr. Lisa Anthony and our lab have been applying quantitative methods to the study of child-computer interaction for the last 15 years. Recently, at IDC 2019, Lisa gave a conference course on the basics of quantitative methodology and how to adapt it for working with children. We provide the course notes and sample data from this course here for those who may be interested in the topic.


Course Description: anthony-IDC2019-course.pdf
Course Notes: download link on Dropbox
Sample Data: download link on Dropbox

POSE Project Kinder-Gator Dataset

In the POSE project, we conducted a study with 10 children and 10 adults performing 58 motions while facing forward toward the Kinect. We created a dataset, Kinder-Gator, which contains RGB videos of the participants performing the motions, as well as skeleton data containing the positions of the joints along the x (horizontal), y (vertical), and z (depth) dimensions as each motion is performed. The dataset is now available for download.
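To illustrate how per-frame joint-position data of this kind can be handled, here is a minimal Python sketch. Note that the record layout, field names, and joint labels below (e.g. "HandRight") are assumptions for illustration only, not the documented Kinder-Gator file format; consult the dataset's own documentation for the actual schema.

```python
# Hedged sketch: assumes one record per (frame, joint) with x, y, z
# coordinates as described above. The actual Kinder-Gator layout may differ.
from dataclasses import dataclass

@dataclass
class JointSample:
    frame: int   # frame index within the motion recording
    joint: str   # joint name, e.g. "HandRight" (hypothetical label)
    x: float     # horizontal position
    y: float     # vertical position
    z: float     # depth (distance from the Kinect)

def trajectory(samples, joint):
    """Return the (x, y, z) path of one joint across frames, in frame order."""
    return [(s.x, s.y, s.z)
            for s in sorted(samples, key=lambda s: s.frame)
            if s.joint == joint]

# Example: a few frames of a hypothetical right-hand motion
samples = [
    JointSample(0, "HandRight", 0.10, 0.50, 2.0),
    JointSample(1, "HandRight", 0.15, 0.55, 2.0),
    JointSample(0, "Head",      0.00, 0.90, 2.1),
]
print(trajectory(samples, "HandRight"))
```

A trajectory extracted this way is the usual starting point for motion analysis, e.g. comparing how children's and adults' joint paths differ for the same motion.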


Dataset: Kinder-Gator Dataset and RGB Videos

MTAGIC Children’s Gesture Corpus

Our NSF-funded project on children’s touchscreen interactions, which we call “MTAGIC” (Mobile Touch and Gesture Interaction for Children), was funded from 2012 through 2017. Over the course of this project we collected a large corpus of touchscreen gestures from over 120 children and 60 adults.

Download links coming soon!

Human Recognition Dataset Results

In our ICMI 2017 paper, we conducted a study to examine how well humans (i.e., Amazon Mechanical Turk workers) could classify children’s touchscreen gestures compared to a common machine recognition algorithm. We used a subset of the MTAGIC Children’s Gesture Corpus (see links above) to conduct this test. We collected a dataset of responses from 127 adults who attempted to classify touchscreen gestures from 26 children ages 5 to 10. We make the dataset from our human recognition tests available here for interested researchers.


Dataset: Human-Recognition Dataset

MTAGIC UI Guidelines App

This app demonstrates the findings of the research paper Interaction and Recognition Challenges in Interpreting Children’s Touch and Gesture Input on Mobile Devices. In MTAGIC project lab studies, we found evidence that children have more difficulty successfully acquiring touch targets and making consistent gestures than adults do. We also identified issues related to mobile application design and developed recommendations for better mobile UI design, especially for children. This open-source app is a demonstration of our findings. Here is a demo video of the app. You can download the app to better understand the recommendations. You can also download the source code to directly embed it in your own app!


APK File: mtagic-design-app.apk
Source Code: MTAGIC Design App-BitBucket

Other Downloads

$Q recognizer
$P recognizer
$N and $N-Protractor recognizers
GHoST gesture heatmaps toolkit
GREAT gesture execution toolkit
GECKo gesture clustering toolkit
MMG (mixed multistroke gestures) data corpus download link