
Graph Edge Convolutional Neural Networks for Skeleton Based Action Recognition

Published 16 May 2018 in cs.CV (arXiv:1805.06184v2)

Abstract: This paper investigates body bones from skeleton data for skeleton-based action recognition. Body joints, as the direct output of mature pose estimation technologies, have long been the focus of traditional action recognition methods. However, rather than joints, humans naturally perceive how the body moves from the shapes, lengths, and positions of bones, which are more distinct and stable to observe. Hence, given graphs generated from skeleton data, we propose convolutions over the graph edges that correspond to bones in the human skeleton. We describe an edge by integrating its spatially neighboring edges, to exploit the cooperation between different bones, as well as its temporally neighboring edges, to capture the consistency of movements within an action. A graph edge convolutional neural network is then designed for skeleton-based action recognition. Considering the complementarity between graph node convolution and graph edge convolution, we additionally construct two hybrid neural networks that combine the graph node convolutional neural network and the graph edge convolutional neural network through shared intermediate layers. Experimental results on the Kinetics and NTU-RGB+D datasets demonstrate that our graph edge convolution effectively captures the characteristics of actions and that our graph edge convolutional neural network significantly outperforms existing state-of-the-art skeleton-based action recognition methods. The hybrid networks yield further performance improvements.
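The edge convolution the abstract describes, aggregating a bone's feature with its spatially neighboring bones and then relating it across neighboring frames, can be illustrated with a short sketch. The following PyTorch code is a minimal sketch under stated assumptions, not the authors' implementation: the class name GraphEdgeConv, the edge-adjacency matrix edge_adj, the temporal kernel size, and the bone-vector input features are all illustrative choices.

```python
# Minimal sketch of a spatio-temporal graph edge convolution.
# Assumption: PyTorch; names and shapes are illustrative, not from the paper.
import torch
import torch.nn as nn

class GraphEdgeConv(nn.Module):
    """Input x: (N, C, T, E) = batch, channels, frames, edges (bones).

    Spatial step: each edge is mixed with the edges it shares a joint with,
    via a fixed row-normalized edge adjacency A, where A[e, f] > 0 iff
    edges e and f are spatial neighbors. Temporal step: a convolution
    along T relates the same edge across neighboring frames.
    """
    def __init__(self, in_channels, out_channels, edge_adj, t_kernel=9):
        super().__init__()
        self.register_buffer("edge_adj", edge_adj)  # (E, E), fixed
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(t_kernel, 1),
                                  padding=(t_kernel // 2, 0))

    def forward(self, x):
        # Spatial aggregation: x[n,c,t,e] <- sum_f A[e,f] * x[n,c,t,f]
        x = torch.einsum("nctf,ef->ncte", x, self.edge_adj)
        x = torch.relu(self.spatial(x))
        return self.temporal(x)

# Toy usage: 3-channel bone vectors, 16 frames, a 24-bone skeleton.
E = 24
A = torch.eye(E)  # placeholder; a real A links bones sharing a joint
layer = GraphEdgeConv(3, 64, A)
out = layer(torch.randn(8, 3, 16, E))  # -> (8, 64, 16, 24)
```

A hybrid model in the spirit of the paper would run such an edge stream alongside a standard node (joint) stream and share intermediate layers between them; that combination is omitted from this sketch.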

Citations (151)
