A Novel Motion Recognition Method Based on Improved Two-stream Convolutional Neural Network and Sparse Feature Fusion
- Sports Institute, Henan University of Technology
Zhengzhou City, 470001 China
byoungholee@qq.com
Abstract
Motion recognition is an active and challenging topic in computer vision. Recognition performance depends closely on the network input, the network structure, and the way features are fused. Because video is noisy, traditional methods struggle to extract reliable feature information, which leads to inaccurate recognition; moreover, feature selection directly affects recognition efficiency, and many problems remain in multi-level feature fusion. In this paper, we propose a motion recognition method based on an improved two-stream convolutional neural network and sparse feature fusion. Since sparse features in the low-rank space effectively capture the moving objects in a video, we use them to supplement the network input; to address the lack of information interaction within the network, we introduce an attention mechanism that fuses high-level semantic information with low-level detail information for recognition, which further strengthens the two-stream convolutional neural network. Experimental results on the UCF101 and HMDB51 data sets show that the proposed method effectively improves motion recognition performance.
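To make the abstract's two core ideas concrete, the sketch below (not the authors' code; all layer sizes, names, and the rank-1 background assumption are illustrative) shows (1) a sparse motion residual obtained by subtracting a low-rank background estimate from stacked frames, and (2) a toy two-stream network whose appearance and motion features are fused through a simple attention gate before classification.

```python
# Minimal sketch, assuming PyTorch. Illustrates the two ideas named in the
# abstract: a sparse foreground residual from a low-rank background estimate,
# and attention-gated fusion of a two-stream CNN's features.
import torch
import torch.nn as nn


def sparse_foreground(frames: torch.Tensor, rank: int = 1) -> torch.Tensor:
    """Approximate the low-rank background of a clip with a truncated SVD
    and return the sparse residual, which highlights moving objects.
    frames: (T, H, W) grayscale clip."""
    t, h, w = frames.shape
    mat = frames.reshape(t, h * w)                    # each row is one frame
    u, s, vh = torch.linalg.svd(mat, full_matrices=False)
    background = (u[:, :rank] * s[:rank]) @ vh[:rank]  # rank-k reconstruction
    return (mat - background).reshape(t, h, w)         # sparse motion residual


class AttentionFusionTwoStream(nn.Module):
    """Toy two-stream CNN: an appearance (RGB) stream and a motion stream,
    with a channel-attention gate re-weighting the concatenated high-level
    appearance and low-level motion descriptors before classification."""

    def __init__(self, num_classes: int = 101):
        super().__init__()
        self.rgb_stream = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.motion_stream = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.attention = nn.Sequential(nn.Linear(96, 96), nn.Sigmoid())
        self.classifier = nn.Linear(96, num_classes)

    def forward(self, rgb: torch.Tensor, motion: torch.Tensor) -> torch.Tensor:
        a = self.rgb_stream(rgb).flatten(1)        # high-level appearance cue
        m = self.motion_stream(motion).flatten(1)  # low-level motion cue
        fused = torch.cat([a, m], dim=1)
        fused = fused * self.attention(fused)      # attention re-weighting
        return self.classifier(fused)


if __name__ == "__main__":
    clip = torch.rand(8, 64, 64)                   # 8 grayscale frames
    residual = sparse_foreground(clip)
    model = AttentionFusionTwoStream()
    rgb = torch.rand(2, 3, 64, 64)                 # batch of RGB frames
    motion = residual[:2].unsqueeze(1)             # sparse maps as motion input
    print(model(rgb, motion).shape)                # torch.Size([2, 101])
```

In the paper itself, the motion stream, the sparse decomposition, and the attention design are more elaborate; this sketch only fixes the overall data flow described in the abstract.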
Key words
motion recognition, two-stream convolutional neural network, attention mechanism, sparse feature fusion, low-rank space
Digital Object Identifier (DOI)
https://doi.org/10.2298/CSIS220105043C
Publication information
Volume 19, Issue 3 (September 2022)
Year of Publication: 2022
ISSN: 2406-1018 (Online)
Publisher: ComSIS Consortium
Full text
Available in PDF
How to cite
Chen, C.: A Novel Motion Recognition Method Based on Improved Two-stream Convolutional Neural Network and Sparse Feature Fusion. Computer Science and Information Systems, Vol. 19, No. 3, 1329-1348. (2022), https://doi.org/10.2298/CSIS220105043C