Real-time Person Orientation Estimation using Colored Pointclouds
Tim Wengefeld, Benjamin Lewandowski, Daniel Seichter, Lennard Pfennig, and Horst-Michael Gross
Technische Universität Ilmenau, Germany
Robustly estimating the orientation of a person is a crucial precondition for a wide range of applications. Especially for autonomous systems operating in populated environments, a person's orientation provides valuable information that can increase their acceptance. Given people's orientations, mobile systems can apply navigation strategies that take people's proxemics into account, or approach them in a human-like manner to perform human-robot interaction (HRI) tasks. In this paper, we present an approach for person orientation estimation based on performant features extracted from colored point clouds, formerly used for two-class person attribute classification. The classification approach has been extended to the continuous domain in order to treat the problem of orientation estimation in real time. We compare the performance of orientation estimation formulated both as a multi-class classification problem and as a regression problem. Our fast approach achieves a mean angular error (MAE) of 15.4$^\circ$ at 14.3\,ms execution time and can be tuned to 12.2$^\circ$ MAE at 79.8\,ms execution time. This is competitive with state-of-the-art accuracies, including those of deep learning skeleton estimation approaches, while retaining real-time capability on a standard CPU.