UbiComp '19: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

GymCam: Detecting, recognizing, and tracking simultaneous exercises in unconstrained scenes

Rushil Khurana*, Karan Ahuja*, Zac Yu, Jennifer Mankoff, Chris Harrison, Mayank Goel

Abstract

Worn sensors are popular for automatically tracking exercises. However, a wearable is usually attached to one part of the body, tracks only that location, and thus is inadequate for capturing a wide range of exercises, especially when other limbs are involved. Cameras, on the other hand, can fully track a user's body, but suffer from noise and occlusion. We present GymCam, a camera-based system for automatically detecting, recognizing and tracking multiple people and exercises simultaneously in unconstrained environments without any user intervention. We collected data in a varsity gym, correctly segmenting exercises from other activities with an accuracy of 84.6%, recognizing the type of exercise at 93.6% accuracy, and counting the number of repetitions to within ±1.7 on average. GymCam advances the field of real-time exercise tracking by filling some crucial gaps, such as tracking whole body motion, handling occlusion, and enabling single-point sensing for a multitude of users.

Citation

Khurana, R., Ahuja, K., Yu, Z., Mankoff, J., Harrison, C., & Goel, M. (2018). GymCam: Detecting, recognizing and tracking simultaneous exercises in unconstrained scenes. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(4), 1–17.

BibTeX

@article{khurana2018gymcam,
  title={GymCam: Detecting, recognizing and tracking simultaneous exercises in unconstrained scenes},
  author={Khurana, Rushil and Ahuja, Karan and Yu, Zac and Mankoff, Jennifer and Harrison, Chris and Goel, Mayank},
  journal={Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies},
  volume={2},
  number={4},
  pages={1--17},
  year={2018},
  publisher={ACM New York, NY, USA}
}