Real-Time Prediction of Lower Limb Joint Kinematics, Kinetics, and Ground Reaction Force using Wearable Sensors and Machine Learning
Abstract: Walking is a key movement of interest in biomechanics, yet gold-standard data collection methods are time-consuming and costly. This paper presents a real-time, multimodal, high-sample-rate lower-limb motion capture framework based on wireless wearable sensors and machine learning. Random Forests estimate joint angles from IMU data, ground reaction force (GRF) is predicted from instrumented insoles, and joint moments are predicted from the angles and GRF using a deep network based on the ResNet-16 architecture. All three models achieve accuracy comparable to the literature, and predictions are logged at 1 kHz with a minimal delay of 23 ms for 20 s of input data. The framework relies entirely on wearable sensors, covers all five major lower-limb joints, and provides comprehensive multimodal estimates of GRF, joint angles, and joint moments with delay low enough for biofeedback applications.
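The first stage of the pipeline, as described in the abstract, maps IMU signals to joint angles with a Random Forest. Below is a minimal, hypothetical sketch of that stage using scikit-learn and synthetic placeholder data; the feature layout (flattened IMU windows), joint count, and hyperparameters are illustrative assumptions, not the paper's reported configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Assumed shapes for illustration only: each sample is a flattened window of
# IMU channels (accelerometer + gyroscope axes across sensors), and the target
# is a vector of lower-limb joint angles (e.g., bilateral hip/knee/ankle).
n_samples, n_imu_features, n_joint_angles = 5000, 48, 10

rng = np.random.default_rng(0)
X_imu = rng.normal(size=(n_samples, n_imu_features))     # placeholder IMU features
y_angles = rng.normal(size=(n_samples, n_joint_angles))  # placeholder joint angles (deg)

# Multi-output Random Forest regressor mapping IMU features to joint angles;
# these hyperparameters are placeholders, not the paper's.
model = RandomForestRegressor(n_estimators=100, max_depth=20, n_jobs=-1)
model.fit(X_imu, y_angles)

# Single-sample inference, mimicking one step of a real-time loop.
angles_pred = model.predict(X_imu[:1])
print(angles_pred.shape)  # (1, n_joint_angles)
```

In a real-time deployment, the `fit` step would run offline on motion-capture-labeled training data, and only the per-sample `predict` call would execute in the streaming loop.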