ICMI '21: Proceedings of the 2021 International Conference on Multimodal Interaction

EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices

Andy Kong, Karan Ahuja, Mayank Goel, Chris Harrison

Abstract

As smartphone screens have grown in size, single-handed use has become more cumbersome. Interactive targets that are easily seen can be hard to reach, particularly notifications and upper menu bar items. Users must either adjust their grip to reach distant targets or use their other hand. In this research, we show how gaze estimation using a phone’s user-facing camera can be paired with IMU-tracked motion gestures to enable a new, intuitive, and rapid interaction technique on handheld phones. We describe our proof-of-concept implementation and gesture set, built on state-of-the-art techniques and capable of self-contained execution on a smartphone. In our user study, we found a mean Euclidean gaze error of 1.7 cm and a seven-class motion gesture classification accuracy of 97.3%.
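To make the gaze-plus-IMU pairing concrete, below is a minimal sketch of such a fusion loop in Swift using CoreMotion. This is not the paper's implementation: the gazePoint property (assumed to be fed by a camera-based gaze model elsewhere), the MotionGesture enum, and the threshold-based classifyGesture placeholder are all illustrative assumptions; the paper instead trains a seven-class gesture recognizer and a learned gaze estimator that run on-device.

import CoreMotion
import CoreGraphics

// Illustrative gesture vocabulary (hypothetical; the paper defines its own seven classes).
enum MotionGesture {
    case none, flickLeft, flickRight, flickUp, flickDown, pullToward, pushAway
}

final class GazePlusIMUController {
    private let motion = CMMotionManager()
    private var imuWindow: [CMDeviceMotion] = []  // sliding window of recent IMU samples

    /// Latest on-screen gaze estimate, updated elsewhere by a camera-based
    /// gaze model (assumed; the paper reports ~1.7 cm mean gaze error).
    var gazePoint: CGPoint = .zero

    /// Called when a gesture fires; receives the gaze target and the gesture.
    var onGesture: ((CGPoint, MotionGesture) -> Void)?

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 100.0  // sample the IMU at 100 Hz
        motion.startDeviceMotionUpdates(to: .main) { [weak self] sample, _ in
            guard let self = self, let sample = sample else { return }
            self.imuWindow.append(sample)
            if self.imuWindow.count > 100 { self.imuWindow.removeFirst() }  // keep ~1 s of data

            let gesture = self.classifyGesture(self.imuWindow)
            if gesture != .none {
                // Fusion step: the motion gesture acts on whatever the user is looking at.
                self.onGesture?(self.gazePoint, gesture)
                self.imuWindow.removeAll()  // debounce so one flick fires only once
            }
        }
    }

    /// Placeholder threshold classifier over user acceleration; a real system
    /// would classify a full accelerometer + gyroscope window with a trained model.
    private func classifyGesture(_ window: [CMDeviceMotion]) -> MotionGesture {
        let xAccel = window.map { $0.userAcceleration.x }
        if let peak = xAccel.max(), peak > 1.5 { return .flickRight }
        if let dip = xAccel.min(), dip < -1.5 { return .flickLeft }
        return .none
    }
}

In a full interface, onGesture would route the recognized gesture to the widget under the gaze point, e.g., a leftward flick dismissing the notification the user is looking at, which is the kind of reach-free interaction the abstract describes.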

Citation

Kong, A., Ahuja, K., Goel, M., & Harrison, C. (2021, October). EyeMU interactions: Gaze + IMU gestures on mobile devices. In Proceedings of the 2021 International Conference on Multimodal Interaction (pp. 577-585).

BibTeX

@inproceedings{kong2021eyemu,
  title={{EyeMU} interactions: Gaze + {IMU} gestures on mobile devices},
  author={Kong, Andy and Ahuja, Karan and Goel, Mayank and Harrison, Chris},
  booktitle={Proceedings of the 2021 International Conference on Multimodal Interaction},
  pages={577--585},
  year={2021}
}