RGBDGaze: Gaze Tracking on Smartphones with RGB and Depth Data
Riku Arakawa, Mayank Goel, Chris Harrison, Karan Ahuja
Abstract
Tracking a user's gaze on smartphones offers the potential for accessible and powerful multimodal interactions. However, phones are used in a myriad of contexts, and state-of-the-art gaze models that rely solely on the front-facing RGB camera are too coarse and do not adapt adequately to changes in context. While prior research has showcased the efficacy of depth maps for gaze tracking, it has relied on desktop-grade depth cameras, which are more capable than the thin, low-power sensors found in today's smartphones. In this paper, we present RGBDGaze, a gaze tracking system that combines RGB and depth data from a smartphone's front-facing sensors to adapt to changes in the distance and orientation of the phone relative to the user's face. Our model achieves a mean euclidean error of 1.89 cm in a leave-one-participant-out evaluation without per-user calibration. We also contribute the RGBDGaze dataset: synchronized RGB, depth, and gaze data from 50 participants captured across four postures (sitting, standing, lying down, and walking).
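Illustrative Code Sketch
For readers who want a concrete starting point, below is a minimal PyTorch sketch of the kind of two-branch RGB + depth fusion the abstract describes: a face crop and its aligned depth map are encoded separately, and the concatenated features regress a 2D on-screen gaze point. This is an illustrative assumption, not the architecture from the paper; the class name RGBDGazeNet, the 128x128 input resolution, the layer sizes, and the late-fusion strategy are hypothetical choices made for brevity.

# Hypothetical sketch of RGB + depth fusion for on-screen gaze regression.
# NOT the authors' published architecture: backbone, input size, and fusion
# strategy are assumptions chosen to keep the example short and runnable.
import torch
import torch.nn as nn

class RGBDGazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        # RGB branch: small conv stack over the face crop (3 x 128 x 128).
        self.rgb_branch = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Depth branch: same structure over the aligned depth map (1 x 128 x 128).
        self.depth_branch = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Late fusion: concatenate branch features, then regress a 2D gaze
        # point (x, y) on the screen plane.
        self.head = nn.Sequential(
            nn.Linear(64 + 64, 128), nn.ReLU(),
            nn.Linear(128, 2),
        )

    def forward(self, rgb, depth):
        features = torch.cat([self.rgb_branch(rgb), self.depth_branch(depth)], dim=1)
        return self.head(features)

# Usage: one synthetic RGB face crop and its depth map.
model = RGBDGazeNet()
rgb = torch.randn(1, 3, 128, 128)    # front-camera face crop
depth = torch.randn(1, 1, 128, 128)  # aligned depth map
print(model(rgb, depth).shape)       # -> torch.Size([1, 2])

Late fusion keeps the two modalities decoupled, which makes it easy to swap in a stronger backbone for either branch, or to drop the depth branch entirely on devices without a depth camera.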
Citation
Arakawa, R., Goel, M., Harrison, C., & Ahuja, K. (2022, November). RGBDGaze: Gaze tracking on smartphones with RGB and depth data. In Proceedings of the 2022 International Conference on Multimodal Interaction (pp. 329–336).
BibTeX
@inproceedings{arakawa2022rgbdgaze,
  title     = {{RGBDGaze}: Gaze Tracking on Smartphones with {RGB} and Depth Data},
  author    = {Arakawa, Riku and Goel, Mayank and Harrison, Chris and Ahuja, Karan},
  booktitle = {Proceedings of the 2022 International Conference on Multimodal Interaction},
  pages     = {329--336},
  year      = {2022}
}