A Limb-based Approach for Body Pose Recognition Using a Predefined Set of Poses

Mattia Guidolin; Marco Carraro; Stefano Ghidoni; Emanuele Menegatti
2018

Abstract

This paper proposes a novel approach for the body pose recognition of multiple persons. Our system takes as input the 3D joint locations of each person's skeleton representation, as estimated by OpenPTrack, an open-source project for RGB-D people tracking. The poses of the upper and lower limbs are computed separately by comparing them to the ones stored in a pre-recorded database. The two partial poses are then combined to obtain the full pose of each person in the scene. The system provides real-time outcomes, is markerless, and makes no assumptions about the orientation, initial position, or number of persons in the scene. It can be used as a basis for more complex action recognition algorithms, for intelligent surveillance and security devices, or in human-computer interaction.
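The limb-based matching described in the abstract can be sketched as a nearest-neighbour lookup: upper- and lower-limb joint sets are matched independently against a pre-recorded pose database, and the two partial results are then combined into a full-body pose. The sketch below is a minimal illustration under assumed names and an assumed Euclidean distance on normalized joints; it is not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch of limb-based pose matching (all function names,
# the normalization, and the distance metric are assumptions, not the
# method published in the paper).

def normalize(joints):
    """Center a set of 3D joints on its centroid and scale to unit norm."""
    joints = np.asarray(joints, dtype=float)
    centered = joints - joints.mean(axis=0)
    scale = np.linalg.norm(centered)
    return centered / scale if scale > 0 else centered

def match_limb(joints, database):
    """Return the database pose label closest to the observed limb joints."""
    obs = normalize(joints)
    best_label, best_dist = None, float("inf")
    for label, ref in database.items():
        dist = np.linalg.norm(obs - normalize(ref))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

def recognize_pose(upper_joints, lower_joints, upper_db, lower_db):
    """Combine the two partial matches into one full-body pose."""
    return match_limb(upper_joints, upper_db), match_limb(lower_joints, lower_db)
```

Matching each limb group separately keeps the database small: N upper poses and M lower poses cover N×M full-body combinations without storing each one explicitly.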
Workshop Proceedings of the 15th International Conference on Intelligent Autonomous Systems
978-3-00-059946-0

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3276600