Welcome to the INIT Lab!

The INIT Lab (the Intelligent Natural Interaction Technology Laboratory) focuses on advanced interaction technologies such as touch, speech, and gesture, especially for children in the context of educational interfaces. INIT Lab projects pursue human-computer interaction (HCI) research questions about how users want to interact through these natural modalities, and computer science research questions about how to build recognition algorithms that can understand inherently ambiguous user input in these modalities.

Natural user interaction (NUI) focuses on allowing users to interact with technology through the wide range of human abilities, such as touch, voice, vision, and motion. Children are still developing their cognitive and physical capabilities, which creates unique design challenges and opportunities for interaction in these modalities. Research in the INIT Lab both integrates and contributes to research in human-computer interaction, child-computer interaction, multimodal interaction, machine learning and artificial intelligence, cognitive science, and interaction design. We seek to understand children’s expectations and abilities with respect to NUIs, and to use this understanding to design and develop new multimodal NUIs for children in a variety of contexts, including education, healthcare, and serious games.

The INIT Lab is directed by Dr. Lisa Anthony, and is part of the CISE Department at the University of Florida.

————————— Projects —————————

Understanding Gestures (GECKo and GREAT)

We are studying patterns in how diverse users actually make gestures on touchscreen devices, and are developing tools and techniques to reveal these patterns. We have collaborated with multiple people on this line of inquiry: Jacob Wobbrock, Radu-Daniel Vatavu, Leah Findlater, and Quincy Brown.
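Work in this space builds on template-matching gesture recognizers such as the $1 Unistroke Recognizer by Wobbrock and colleagues. As a rough illustration of how such recognizers preprocess and compare strokes, here is a minimal sketch of resampling, normalization, and path-distance comparison; the function names and simplifications (no rotation alignment) are our own for illustration, not code from any lab toolkit:

```python
import math


def resample(points, n=64):
    """Resample a stroke to n roughly equidistant points along its path
    (simplified from the $1 recognizer's preprocessing step)."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    path_len = sum(dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1)
    pts = list(points)
    new_pts = [pts[0]]
    d_accum = 0.0
    i = 1
    while i < len(pts):
        d = dist(pts[i - 1], pts[i])
        if d_accum + d >= interval and d > 0:
            # Interpolate a new point exactly one interval along the path.
            t = (interval - d_accum) / d
            qx = pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0])
            qy = pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1])
            new_pts.append((qx, qy))
            pts.insert(i, (qx, qy))  # continue measuring from the new point
            d_accum = 0.0
        else:
            d_accum += d
        i += 1
    # Guard against floating-point drift: force exactly n points.
    new_pts = new_pts[:n]
    while len(new_pts) < n:
        new_pts.append(points[-1])
    return new_pts


def normalize(points, n=64):
    """Resample, translate the centroid to the origin, and scale to a unit box,
    so strokes can be compared independent of position and size."""
    pts = resample(points, n)
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    pts = [(x - cx, y - cy) for x, y in pts]
    w = max(p[0] for p in pts) - min(p[0] for p in pts)
    h = max(p[1] for p in pts) - min(p[1] for p in pts)
    s = max(w, h) or 1.0
    return [(x / s, y / s) for x, y in pts]


def path_distance(a, b):
    """Mean point-to-point Euclidean distance between two normalized strokes;
    a recognizer picks the template with the smallest distance."""
    return sum(math.hypot(p[0] - q[0], p[1] - q[1]) for p, q in zip(a, b)) / len(a)
```

A recognizer built this way classifies a candidate stroke by normalizing it and returning the stored template with the lowest path distance; studying where children's strokes diverge from such templates is one way articulation patterns can be quantified.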

Touch Interaction for Data Engagement with Science on Spheres (TIDESS)

The purpose of the TIDESS project is to investigate ways that children and adults engage with and learn from large-scale interactive displays (a touch table and a touchscreen spherical display) in a science museum context. We have created a prototype on a Microsoft Surface touch table that displays ocean temperature visualizations for ...

Kids Pose Project

We are looking to discover and characterize differences in how children and adults make natural and prompted body movements when using whole-body interaction systems such as the Microsoft Kinect.

Fun Fit Tech (Kinect Games for Exercise)

The Kids Pose project has been expanded to develop exergames for kids, leveraging their interest in video games to help them lead a more active lifestyle. We are working with local kids who have varying levels of interest and motivation regarding exercise or games to learn how best to appeal to a wide range of potential users. Additionally, we are working with physical education teachers and sports coaches to understand exercise recommendations for kids.

Mobile Touch and Gesture Interaction for Children (MTAGIC)

We have been investigating differences in the ways that children use touch and gesture interactions compared to adults, especially on mobile devices. In lab studies, we have found evidence that children have more difficulty successfully acquiring touch targets and making consistent gestures than adults do. These differences can lead to poorer performance of the interface for children, and we plan to explore ways to adapt interfaces to work better with children in the real world given these differences. We have been working with Quincy Brown at Bowie State University.
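Findings about target acquisition typically come from comparing logged touch points against known target locations. The sketch below shows one simple way such data could be summarized; the function name, log format, and metrics are illustrative assumptions, not the actual MTAGIC analysis pipeline:

```python
import math


def target_metrics(touches, targets, radius):
    """Summarize touch-target trials as (hit rate, mean center offset).

    `touches` and `targets` are parallel lists of (x, y) coordinates in
    pixels: where the user actually touched vs. the target's center.
    `radius` is the target's visual radius; a touch within it counts as
    a successful acquisition. Illustrative sketch only.
    """
    offsets = [math.hypot(tx - cx, ty - cy)
               for (tx, ty), (cx, cy) in zip(touches, targets)]
    hits = sum(1 for d in offsets if d <= radius)
    return hits / len(offsets), sum(offsets) / len(offsets)
```

Computing these summaries separately for child and adult participants would make the kind of group difference described above (lower hit rates, larger offsets for children) directly visible in the logged data.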