Decoding Human Activities: Analyzing Wearable Accelerometer and Gyroscope Data for Activity Recognition
Abstract: A person's movement or relative positioning can be captured by various sensors, and the resulting signals can be exploited to classify different human activities. This letter proposes an effective scheme for human activity recognition that introduces two key ideas within a multi-structural architecture named FusionActNet. First, two dedicated residual networks capture the static and dynamic characteristics of a given action; second, a guidance module steers the final decision-making process. Training proceeds in two stages: the residual networks are first pre-trained separately on static data (where the human body is immobile) and dynamic data (involving movement of the human body), and the guidance module is then trained together with the pre-trained static and dynamic models on the given sensor data. The guidance module learns to emphasize the more relevant of the two prediction vectors, which enables effective classification of different human activities. The proposed scheme is evaluated on two benchmark datasets and compared with state-of-the-art methods. The results demonstrate that it outperforms existing approaches in accuracy, precision, recall, and F1 score, achieving 97.35% and 95.35% accuracy on the UCI HAR and Motion-Sense datasets, respectively, which highlights both the effectiveness and the stability of the proposed scheme.
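The abstract's decision-making step, where the guidance module emphasizes the more relevant of the static and dynamic prediction vectors, can be sketched as a weighted fusion of the two branch outputs. This is a minimal illustration, not the paper's implementation: the function name `guided_fusion` and the two-way guidance score are hypothetical, and the exact fusion rule used in FusionActNet may differ.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def guided_fusion(static_logits, dynamic_logits, guidance_scores):
    """Fuse the prediction vectors of the pre-trained static and dynamic
    branches. `guidance_scores` stands in for the guidance module's raw
    two-way output (static vs. dynamic) for one input window.
    """
    w = softmax(np.asarray(guidance_scores, dtype=float))
    fused = (w[0] * softmax(np.asarray(static_logits, dtype=float))
             + w[1] * softmax(np.asarray(dynamic_logits, dtype=float)))
    return fused  # final class-probability vector

# Example: a window the guidance module judges to be mostly dynamic,
# so the dynamic branch's prediction dominates the fused result.
probs = guided_fusion([2.0, 0.1, 0.1], [0.1, 0.1, 3.0], [-1.0, 1.0])
```

Because the guidance weights and both branch outputs are convex combinations of probability vectors, the fused output remains a valid probability distribution over the activity classes.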