Abstract—In this work, we present an accurate 3D human pose recognition (HPR) method based on multi-sensor fusion. Recently, 3D HPR has been widely performed with a depth imaging sensor, but this approach has two limitations: 1) the orientations of body parts cannot be recognized accurately and 2) it suffers from occlusion. To achieve accurate and stable recognition of human poses in real time, we propose to use inertial measurement units (IMUs) to estimate the orientation of the body limbs and to resolve the occlusion problem. By fusing the depth and IMU sensors, we demonstrate significantly improved 3D human pose reconstruction: the twist and location of the arms are recognized accurately even under occlusion. The presented approach could be critical if 3D HPR is to be used for medical applications such as musculoskeletal analysis in 3D, as demonstrated in this study.
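The fusion idea can be pictured with a minimal sketch (Python/NumPy); this is not the authors' implementation but an illustration under stated assumptions: joint positions come from a depth-based tracker, a forearm IMU supplies an orientation quaternion, and when the depth estimate is occluded or unreliable the limb direction is taken from the IMU instead. All function names, the rest-pose direction, and the confidence threshold below are hypothetical.

import numpy as np

def rotate_by_quaternion(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, xyz = q[0], np.asarray(q[1:], dtype=float)
    t = 2.0 * np.cross(xyz, v)
    return v + w * t + np.cross(xyz, t)

def fuse_forearm(elbow_depth, wrist_depth, imu_quat,
                 depth_confidence, forearm_length,
                 rest_direction=np.array([0.0, -1.0, 0.0]),
                 conf_threshold=0.5):
    """Estimate the wrist position by fusing depth and IMU cues (illustrative).

    elbow_depth, wrist_depth : 3D joint positions from the depth-based tracker
    imu_quat                 : forearm IMU orientation as (w, x, y, z)
    depth_confidence         : tracker confidence in [0, 1] (hypothetical input)
    """
    # Limb direction implied by the IMU orientation; assumes the sensor frame
    # has been aligned with the body frame by a prior calibration step.
    imu_dir = rotate_by_quaternion(imu_quat, rest_direction)

    if depth_confidence < conf_threshold:
        # Occluded or unreliable depth: trust the IMU direction and keep the
        # last reliable elbow position from the depth tracker.
        return elbow_depth + forearm_length * imu_dir

    # Otherwise blend the depth- and IMU-derived directions by confidence.
    depth_dir = wrist_depth - elbow_depth
    depth_dir = depth_dir / np.linalg.norm(depth_dir)
    fused = depth_confidence * depth_dir + (1.0 - depth_confidence) * imu_dir
    fused = fused / np.linalg.norm(fused)
    return elbow_depth + forearm_length * fused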
Index Terms—Human pose recognition, depth sensors,
inertial measurement units, sensor fusion, musculoskeletal
analysis.
S. B. Nam, S. U. Park, J. H. Park, and T. S. Kim are with the Department
of Biomedical Engineering, Kyung Hee University, Seocheon-Dong,
Giheung-Gu, Yongin-Si, Gyeonggi-Do, 446-701, Republic of Korea (e-mail:
sbnam@khu.ac.kr, supark@khu.ac.kr, psmt2655@khu.ac.kr,
tskim@khu.ac.kr).
M. D. Zia Uddin is with the Department of Computer Education,
Sungkyunkwan University, Jongno-Gu, Seoul, Republic of Korea (e-mail:
ziauddin@skku.edu).
Cite: S. B. Nam, S. U. Park, J. H. Park, M. D. Zia Uddin, and T. S. Kim, "Accurate 3D Human Pose Recognition via Fusion of Depth and Motion Sensors," International Journal of Future Computer and Communication, vol. 4, no. 5, pp. 336-340, 2015.