A-MAL: Automatic Movement Assessment Learning from Properly Performed Movements in 3D Skeleton Videos
Abstract: The task of assessing movement quality has recently seen growing demand across a variety of domains. The ability to automatically assess subject movement in videos captured by affordable devices, such as Kinect cameras, is essential for monitoring clinical rehabilitation processes, improving motor skills, and supporting movement-learning tasks. The need to attend to low-level details while accurately tracking the movement stages makes this task very challenging. In this work, we introduce A-MAL, an automatic, strong movement assessment learning algorithm that learns only from properly-performed movement videos without further annotations. It is powered by a deviation time-segmentation algorithm, a parameter relevance detection algorithm, a novel time-warping algorithm based on automatic detection of common temporal points-of-interest, and a textual-feedback generation mechanism. We demonstrate our method on movements from the Fugl-Meyer Assessment (FMA) test, which is typically administered by occupational therapists to monitor patients' recovery after strokes.
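The abstract describes aligning a performed movement against properly-performed references via time warping. As a hedged illustration of the general technique (this is classic dynamic time warping on 1D signals, not the paper's point-of-interest-based warping, whose details are not given in the abstract), a minimal sketch:

```python
def dtw_align(a, b):
    """Cumulative dynamic-time-warping cost between two 1D sequences.

    Standard DTW recurrence: each cell holds the cheapest alignment cost
    of the prefixes a[:i] and b[:j]. Illustrative only -- A-MAL's actual
    warping is based on detected temporal points-of-interest.
    """
    n, m = len(a), len(b)
    # (n+1) x (m+1) cost table, initialized to infinity except the origin.
    cost = [[float("inf")] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local distance between frames
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match step
    return cost[n][m]

# A time-stretched copy of a reference motion aligns with zero cost,
# while an identical copy trivially does as well.
reference = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
performed = [0.0, 0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
print(dtw_align(reference, reference))  # -> 0.0
print(dtw_align(performed, reference))  # -> 0.0 (extra frame absorbed by warping)
```

In a skeleton-video setting, the scalar sequences would be replaced by per-frame joint-parameter values, with the same recurrence applied per parameter.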