INIT Lab collaboration on smart, wearables-based COVID-19 mitigation techniques leads to journal article in Sensors! #laterpub #latertweet

In a previous tweet from 2020, INIT lab director Lisa Anthony shared the news that we were collaborating with UF Health researchers, including Dr. Mamoun Mardini from the UF College of Medicine’s Department of Aging & Geriatric Research, to investigate the use of smartwatches and other wearables to detect problematic face-touching behaviors that could contribute to the spread of COVID-19 and other respiratory illnesses. Since then, we have published two papers on this project: a September 2021 article in the open-access journal Sensors, and a Late-Breaking Work poster at the ACM International Conference on Multimodal Interaction (ICMI 2021) in October 2021.

The two papers reflect the two different approaches that our collaborative team explored for the rapid, accurate, and real-time detection of face-touching behaviors for users wearing a smartwatch on their dominant wrist. Here are the abstracts for each paper, first for the Sensors journal article from 2021 with PhD student Chen Bai as first author:

Frequent spontaneous facial self-touches, predominantly during outbreaks, have the theoretical potential to be a mechanism of contracting and transmitting diseases. Despite the recent advent of vaccines, behavioral approaches remain an integral part of reducing the spread of COVID-19 and other respiratory illnesses. The aim of this study was to utilize the functionality and the spread of smartwatches to develop a smartwatch application to identify motion signatures that are mapped accurately to face touching. Participants (n = 10, five women, aged 20–83) performed 10 physical activities classified into face touching (FT) and non-face touching (NFT) categories in a standardized laboratory setting. We developed a smartwatch application on Samsung Galaxy Watch to collect raw accelerometer data from participants. Data features were extracted from consecutive non-overlapping windows varying from 2 to 16 s. We examined the performance of state-of-the-art machine learning methods on face-touching movement recognition (FT vs. NFT) and individual activity recognition (IAR): logistic regression, support vector machine, decision trees, and random forest. While all machine learning models were accurate in recognizing FT categories, logistic regression achieved the best performance across all metrics (accuracy: 0.93 ± 0.08, recall: 0.89 ± 0.16, precision: 0.93 ± 0.08, F1-score: 0.90 ± 0.11, AUC: 0.95 ± 0.07) at the window size of 5 s. IAR models resulted in lower performance, where the random forest classifier achieved the best performance across all metrics (accuracy: 0.70 ± 0.14, recall: 0.70 ± 0.14, precision: 0.70 ± 0.16, F1-score: 0.67 ± 0.15) at the window size of 9 s. In conclusion, wearable devices, powered by machine learning, are effective in detecting facial touches. This is highly significant during respiratory infection outbreaks as it has the potential to limit face touching as a transmission vector.
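For readers curious how the windowed feature extraction described above works in practice, here is a minimal illustrative sketch. This is not the team's actual pipeline: the sampling rate, the specific statistical features, and the function name `extract_window_features` are all assumptions chosen for illustration, but the core idea matches the abstract, slicing the raw accelerometer stream into consecutive non-overlapping windows (e.g., 5 s each) and computing per-window features that a classifier such as logistic regression could then consume.

```python
import numpy as np

def extract_window_features(acc, fs=25, window_s=5):
    """Segment a (n_samples, 3) accelerometer stream into consecutive
    non-overlapping windows and compute simple per-axis statistics.
    fs (sampling rate in Hz) and the feature set are illustrative assumptions."""
    win = fs * window_s                      # samples per window
    n_windows = len(acc) // win              # drop any trailing partial window
    feats = []
    for i in range(n_windows):
        w = acc[i * win:(i + 1) * win]       # one window of 3-axis samples
        feats.append(np.concatenate([
            w.mean(axis=0),                  # mean per axis
            w.std(axis=0),                   # standard deviation per axis
            w.min(axis=0),                   # minimum per axis
            w.max(axis=0),                   # maximum per axis
        ]))
    return np.array(feats)                   # shape: (n_windows, 12)

# Toy example: 60 s of synthetic 3-axis accelerometer data at 25 Hz
rng = np.random.default_rng(0)
acc = rng.normal(size=(60 * 25, 3))
X = extract_window_features(acc)
print(X.shape)  # (12, 12): twelve 5-second windows, 12 features each
```

Each row of `X` would then be paired with a window-level label (FT or NFT) to train and evaluate the classifiers compared in the paper.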

Second, the ICMI 2021 Late-Breaking Work poster with PhD student Yu-Peng Chen as first author:

Respiratory diseases such as the novel coronavirus (COVID-19) can be transmitted through people’s face-touching behaviors. One of the official recommendations for protecting ourselves from such viruses is to avoid touching our eyes, nose, or mouth with unwashed hands. However, prior work has found that people touch their face 23 times per hour on average without realizing it. Therefore, in this Late-Breaking Work, we explore a possible approach to help users avoid touching their face in daily life by alerting them through a smartwatch application every time a face-touching behavior occurs. We selected 10 everyday activities including several that should be easy to distinguish from face touching and several that should be more challenging. We recruited 10 participants and asked them to perform each activity repeatedly for 3 minutes at their own pace while wearing a Samsung smartwatch. Based on the collected accelerometer data, we used dynamic time warping (DTW) to distinguish between the two groups of activities (i.e., face-touching and non-face-touching), which is a method well-suited for small datasets. Our findings show that the DTW-based classifier is capable of classifying the activities into two groups with high accuracy (i.e., 99.07% for the user-dependent scenario). We demonstrated that smartwatches have the potential to detect face-touching behaviors with the proposed methodology. Future work can explore other classification approaches, collect larger datasets, and consider other sensors to increase the robustness of our results.
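To make the DTW approach in this second abstract concrete, here is a minimal sketch of dynamic time warping used as a nearest-neighbor classifier. This is not the authors' implementation: the template set, labels, and helper names (`dtw_distance`, `classify`) are hypothetical, but the sketch shows the standard DTW recurrence and how a query motion can be labeled by its closest template, which is why the method suits small datasets.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(n*m) dynamic time warping distance between two 1-D series,
    computed with the standard dynamic-programming recurrence."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best alignment ending at (i, j): insertion, deletion, or match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(query, templates):
    """1-nearest-neighbor by DTW: return the label of the closest template.
    templates is a list of (label, series) pairs (hypothetical structure)."""
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]

# Toy templates standing in for face-touching vs. non-face-touching motions
templates = [
    ("FT",  np.array([0.0, 1.0, 2.0, 1.0, 0.0])),   # rise-and-fall gesture
    ("NFT", np.array([0.0, 0.1, 0.0, 0.1, 0.0])),   # near-flat motion
]
query = np.array([0.0, 0.9, 2.1, 1.1, 0.1])          # resembles the FT template
print(classify(query, templates))  # FT
```

In the real study the series would be multi-axis accelerometer windows rather than short 1-D toys, but the nearest-template decision rule is the same.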

Yu-Peng attended the ICMI 2021 conference (virtually) to present this poster. We look forward to future results from this project and collaboration! For more information on these papers, see the PDFs (to be linked soon on our Publications page).