Comparative Analysis of Human Activity Recognition for Karate Skills Using IMU Raw Data and Derived Body Joint Angles

Heba Ashour, Ahmad Salah, Ahmed Fathalla, Esraa Eldesouky, Abdelwahab Said Hassan

Abstract

Human Activity Recognition (HAR) is a machine learning (ML) application that plays a vital role in various fields such as medicine and sports. Recent advancements have made HAR a well-established field. HAR can be performed using imagery or sensory data; this study focuses on sensory data generated by Inertial Measurement Units (IMUs). HAR typically requires data from at least one sensor. However, raw data from two sensors, placed on the two body segments adjacent to a joint, are needed to calculate that joint's three-axis angles. To our knowledge, no study in the literature compares the performance of HAR models trained on raw sensory data versus those trained on body joint angles in sports science. In this study, we aim to bridge this research gap by comparing training on both data types, using karate as a case study. The current work includes two comprehensive datasets of four karate skills: one consisting of raw sensory data and the other comprising body joint angles derived from that raw data. The datasets were collected from professional karate players performing four distinct skills. We propose a normalization algorithm to address the differing numbers of readings per player performing the same karate skill. Next, several ML models were trained on the normalized versions of both datasets to determine which dataset contains more easily predictable patterns. The results indicate that the body joint angles dataset yielded significantly higher accuracy than the raw sensory dataset. Moreover, the proposed normalization algorithm demonstrated promising results across all models and effectively mitigated the overfitting issue in both datasets.
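The abstract does not detail the proposed normalization algorithm, so the sketch below only illustrates one common way to handle the problem it addresses: recordings of the same skill having different numbers of readings per player. It resamples each trial to a fixed number of time steps via linear interpolation; the function name, target length, and channel count are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: resample variable-length IMU trials to a common
# number of readings so every execution of a skill yields features of equal size.
import numpy as np

def resample_trial(trial: np.ndarray, target_len: int = 100) -> np.ndarray:
    """Resample a (num_readings, num_channels) trial to target_len readings.

    Each channel (e.g., an accelerometer axis or a joint angle) is linearly
    interpolated onto a common, evenly spaced time grid.
    """
    num_readings, num_channels = trial.shape
    src_t = np.linspace(0.0, 1.0, num_readings)   # original (normalized) time stamps
    dst_t = np.linspace(0.0, 1.0, target_len)     # shared time grid for all trials
    return np.column_stack(
        [np.interp(dst_t, src_t, trial[:, c]) for c in range(num_channels)]
    )

# Hypothetical example: two players perform the same skill at different speeds,
# producing recordings of different lengths; after resampling, both trials have
# the same shape and can be fed to the same ML model.
fast_trial = np.random.randn(87, 6)    # 6-channel IMU recording, 87 readings
slow_trial = np.random.randn(132, 6)   # same skill, 132 readings
X = np.stack([resample_trial(fast_trial).ravel(), resample_trial(slow_trial).ravel()])
print(X.shape)  # (2, 600)
```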
