Posture Reconstruction Using Kinect with a Probabilistic Model

Liuyang Zhou, Zhiguang Liu, Howard Leung, Hubert P. H. Shum

Research output: Contribution to conference › Other › peer-review



Recent work has shown that depth-image-based 3D posture estimation hardware such as Kinect has made interactive applications more popular. However, it is still challenging to accurately recognize postures from a single depth camera due to the inherently noisy data derived from depth images and the self-occluding actions performed by the user. While previous research has shown that data-driven methods can reconstruct correct postures, such methods usually require a large posture database, which greatly limits their usability on systems with constrained hardware such as game consoles. To solve this problem, we present a new probabilistic framework to enhance the accuracy of postures captured live by Kinect. We adopt the Gaussian Process model as a prior to leverage position data obtained from Kinect and a marker-based motion capture system. We also incorporate a temporal consistency term into the optimization framework to constrain the velocity variations between successive frames. To ensure that the reconstructed posture resembles the observed input data from Kinect when its tracking result is good, we embed joint reliability into the optimization framework. Experimental results demonstrate that our system can generate high-quality postures even under severe self-occlusion, which is beneficial for real-time posture-based applications such as motion-based gaming and sports training.
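The abstract describes a per-frame optimization combining three ingredients: a data term weighted by per-joint reliability, a temporal consistency term penalizing velocity change between successive frames, and a posture prior. The sketch below illustrates one plausible form of such an objective; all function names, weights, and the simple quadratic stand-in for the Gaussian Process prior are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def posture_objective(x, kinect_obs, reliability, prev1, prev2, prior_mean,
                      w_data=1.0, w_temporal=0.5, w_prior=0.1):
    """Hypothetical per-frame energy for a posture x (flattened joint positions).

    x, kinect_obs, prior_mean, prev1, prev2: arrays of shape (J*3,)
    reliability: per-coordinate weights of shape (J*3,), high where Kinect
    tracking is trusted (repeat each joint's reliability over its x/y/z).
    """
    # Data term: pull each coordinate toward the Kinect observation,
    # scaled by how reliable that joint's tracking currently is.
    data = np.sum(reliability * (x - kinect_obs) ** 2)
    # Temporal consistency: penalize change in velocity between frames.
    velocity = x - prev1
    prev_velocity = prev1 - prev2
    temporal = np.sum((velocity - prev_velocity) ** 2)
    # Prior term: a simple quadratic stand-in for the Gaussian Process
    # prior learned from marker-based motion capture data.
    prior = np.sum((x - prior_mean) ** 2)
    return w_data * data + w_temporal * temporal + w_prior * prior
```

In practice such an objective would be minimized per frame (e.g. with a gradient-based solver), so that when reliability is high the solution stays close to the Kinect observation, and when tracking degrades under self-occlusion the prior and temporal terms dominate.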
Original language: English
Publication status: Published - Nov 2014
Event: Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology - Edinburgh, UK
Duration: 1 Nov 2014 → …




