Paper
17 March 2017
Predicting human activities in sequences of actions in RGB-D videos
David Jardim, Luís Nunes, Miguel Dias
Proceedings Volume 10341, Ninth International Conference on Machine Vision (ICMV 2016); 103410C (2017) https://doi.org/10.1117/12.2268524
Event: Ninth International Conference on Machine Vision, 2016, Nice, France
Abstract
In our daily activities, we perform prediction or anticipation when interacting with other humans or with objects. Prediction of human activity by computers has several potential applications: surveillance systems, human-computer interfaces, sports video analysis, human-robot collaboration, games, and healthcare. We propose a system capable of recognizing and predicting human actions using supervised classifiers trained with automatically labeled data. The system is evaluated on our human-activity RGB-D dataset (recorded with a Kinect sensor) and extracts features using only the positions of the main skeleton joints. Conditional random fields (CRFs) have been used before to model the sequential nature of actions, but where other approaches try to predict an outcome or to anticipate seconds ahead in time, we try to predict the subject's next action. Our results show an activity-prediction accuracy of 89.9% using an automatically labeled dataset.
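To make the pipeline described in the abstract concrete, the following is a minimal sketch of skeleton-based feature extraction followed by linear-chain CRF sequence labeling. It is not the authors' implementation: the joint names, features, action labels, and the use of the sklearn-crfsuite Python library are all illustrative assumptions, since the paper does not specify its feature set or toolkit here.

    # A minimal sketch, NOT the authors' implementation. Joint names,
    # features, labels, and sklearn-crfsuite are illustrative assumptions.
    import numpy as np
    import sklearn_crfsuite

    JOINTS = ("head", "hand_left", "hand_right")  # hypothetical subset

    def segment_features(skeleton):
        """skeleton: dict mapping joint name -> (x, y, z) as read from a
        Kinect sensor. Returns joint positions relative to the torso as
        the per-item feature dict that sklearn-crfsuite expects."""
        torso = np.asarray(skeleton["torso"])
        feats = {}
        for name in JOINTS:
            for axis, value in zip("xyz", np.asarray(skeleton[name]) - torso):
                feats[name + "_" + axis] = float(value)
        return feats

    # Two toy action segments forming one sequence; the labels stand in
    # for the automatically obtained action labels from the abstract.
    reach = {"torso": (0, 1.0, 2.5), "head": (0, 1.6, 2.5),
             "hand_left": (-0.3, 1.0, 2.4), "hand_right": (0.4, 1.2, 2.2)}
    drink = {"torso": (0, 1.0, 2.5), "head": (0, 1.6, 2.5),
             "hand_left": (-0.3, 1.0, 2.4), "hand_right": (0.1, 1.6, 2.4)}
    X_train = [[segment_features(reach), segment_features(drink)]]
    y_train = [["reach", "drink"]]

    # Linear-chain CRF over the action sequence; hyperparameters are arbitrary.
    crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                               max_iterations=50)
    crf.fit(X_train, y_train)

    # Labeling a partial sequence; in this toy setup the label assigned to
    # the latest segment is the model's guess about the ongoing action.
    print(crf.predict([[segment_features(reach)]]))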
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
David Jardim, Luís Nunes, and Miguel Dias "Predicting human activities in sequences of actions in RGB-D videos", Proc. SPIE 10341, Ninth International Conference on Machine Vision (ICMV 2016), 103410C (17 March 2017); https://doi.org/10.1117/12.2268524
CITATIONS
Cited by 1 scholarly publication.
RIGHTS & PERMISSIONS
Get copyright permission on Copyright Marketplace
KEYWORDS: Video, Feature extraction, Binary data, Video surveillance, Computing systems, Data modeling, Sensors