Human action recognition based on spatiotemporal features from videos
Abstract
Currently, there is high demand for new techniques for automatic pattern recognition in videos, for example for the automatic recognition of human actions. This demand is motivated by advances in the technologies for producing, storing, transmitting, and sharing videos; such advances have triggered the production of a huge volume of videos that must be processed automatically to be useful. Among the main applications, we can highlight surveillance in public places, detection of falls of the elderly in their homes, automation of no-checkout-required stores, detection of pedestrian actions by self-driving cars, and detection of inappropriate content posted on the Internet, such as violence or pornography. The automatic recognition of actions in videos is a challenging task because, in order to obtain good classification rates, it is necessary to work with both spatial information (for example, shapes found in a single frame of the video) and temporal information (for example, movement patterns found across the frames of the video). In this thesis, new methods are proposed for the automatic recognition of human actions based on spatiotemporal features extracted from videos. Initially, different architectures of 3D Convolutional Neural Networks (CNNs) were evaluated in the context of detecting pornography in videos. Afterwards, new methods were proposed for the recognition of human actions based on spatiotemporal information extracted from 2D poses. The use of 2D poses proved to be a promising strategy, as it requires a lower computational cost than techniques that use deep learning. Besides, by using 2D poses instead of raw images, one can preserve the privacy of the people and places where the video cameras are installed. The proposed method achieved accuracy rates compatible with the state of the art on the public databases on which the experiments were carried out.
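The abstract does not specify how spatial and temporal information are combined from 2D poses; as a minimal illustrative sketch (not the thesis method), the function `pose_features` below builds a toy descriptor from a sequence of 2D joint positions, using centred joint coordinates as the spatial part and frame-to-frame displacements as the temporal part:

```python
import numpy as np

def pose_features(poses):
    """Toy spatiotemporal descriptor from a sequence of 2D poses.

    poses: array of shape (T, J, 2) -- T frames, J joints, (x, y) each.
    Spatial part: joint coordinates centred on the per-frame mean joint.
    Temporal part: frame-to-frame joint displacements (motion).
    Returns a single flat feature vector, e.g. for a downstream classifier.
    """
    poses = np.asarray(poses, dtype=float)
    centred = poses - poses.mean(axis=1, keepdims=True)  # spatial: body shape per frame
    motion = np.diff(poses, axis=0)                      # temporal: joint movement
    return np.concatenate([centred.ravel(), motion.ravel()])

# Example: 3 frames, 2 joints
seq = [[[0, 0], [1, 1]],
       [[0, 1], [1, 2]],
       [[1, 1], [2, 2]]]
feat = pose_features(seq)
# length = 3*2*2 (spatial) + 2*2*2 (temporal) = 20
```

In practice the poses would come from a 2D pose estimator, and the descriptor would feed a classifier; this sketch only illustrates the idea of fusing per-frame shape with inter-frame motion.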