DOI: 10.1145/3211960.3211972
Research article

Understanding activity segmentation for multi-sport competitions

Published: 10 June 2018

Abstract

Despite advances in activity detection, its applications in the sports domain remain limited. Athletic environments are fast-paced and challenging. Athletes often perform more than one activity in a single workout, especially when training for a multi-sport competition such as a triathlon. These competitions require an athlete to transition quickly from one activity to another. Current logging applications require the user to select the activity they are about to perform and to start and stop a timer for each activity. This can increase the athlete's transition time, and it gives the athlete no estimate of how much time was spent in transition between activities.
This paper explores activity segmentation for multi-sport scenarios. Our goal is to identify the activities and segment a user's workout trace into its constituent activities, including the transition periods. We use an Apple Watch to gather inertial sensor data and validate our system in the context of a triathlon. The system was trained and tested on three activities (running, biking, and swimming), as well as simple actions performed in transition, using data from five participants. Our system achieves 91% accuracy in detecting the activity and can accurately identify the start and stop times for each activity. We also validate our results with data collected from a volunteer at a triathlon.
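The pipeline the abstract describes — windowing a wearable's inertial stream, classifying each window, and merging adjacent labels into timed segments (including transitions) — can be sketched as follows. This is a minimal illustration, not the paper's method: the window length, sampling rate, variance thresholds, and the `classify_window` stand-in are all assumptions made for the sketch.

```python
import numpy as np

def windows(signal, rate_hz, win_s=2.0):
    """Split a 1-D sensor stream into fixed-length, non-overlapping windows.

    win_s (window length in seconds) is an assumed parameter, not the
    paper's reported value.
    """
    n = int(win_s * rate_hz)
    usable = (len(signal) // n) * n  # drop the trailing partial window
    return signal[:usable].reshape(-1, n)

def classify_window(w):
    """Toy stand-in for a trained classifier: label a window by the
    standard deviation of its accelerometer magnitude.

    The thresholds here are illustrative assumptions only.
    """
    s = np.std(w)
    if s < 0.5:
        return "transition"
    elif s < 2.0:
        return "bike"
    else:
        return "run"

def segments(labels, win_s=2.0):
    """Merge consecutive identical window labels into
    (activity, start_seconds, end_seconds) segments."""
    out = []
    start = 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            out.append((labels[start], start * win_s, i * win_s))
            start = i
    return out
```

A per-window label sequence such as `["run", "run", "transition", "bike"]` then collapses into three segments with explicit start and stop times, which is exactly the output the abstract targets: the constituent activities plus how long each transition took.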


Cited By

  • (2024) "Deep similarity segmentation model for sensor-based activity recognition." Multimedia Tools and Applications. DOI: 10.1007/s11042-024-18933-2. Online publication date: 4 May 2024.
  • (2023) "A Method to Estimate Obstacle Presence on Narrow Roads Using Smartwatch-Based Multi-Cycling Data." Proceedings of the 2023 11th International Conference on Information Technology: IoT and Smart City, pages 247–254. DOI: 10.1145/3638985.3639025. Online publication date: 14 December 2023.
  • (2023) "Similarity Segmentation Approach for Sensor-Based Activity Recognition." IEEE Sensors Journal, 23(17):19704–19716. DOI: 10.1109/JSEN.2023.3295778. Online publication date: 1 September 2023.

Published In

WearSys '18: Proceedings of the 4th ACM Workshop on Wearable Systems and Applications
June 2018
64 pages
ISBN: 9781450358422
DOI: 10.1145/3211960

Publisher: Association for Computing Machinery, New York, NY, United States



Conference: MobiSys '18

Overall acceptance rate: 28 of 36 submissions (78%)

