Inverse dynamics based on occlusion-resistant Kinect data: Is it usable for ergonomics?

Pierre Plantard, Antoine Muller, Charles Pontonnier, Georges Dumont, Hubert P. H. Shum, Franck Multon

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)
6 Downloads (Pure)

Abstract

Joint torques and forces are relevant quantities for estimating the biomechanical constraints of working tasks in ergonomics. However, inverse dynamics requires accurate motion capture data, which are generally not available in real manufacturing plants. Markerless and calibrationless measurement systems based on depth cameras, such as the Microsoft Kinect, are promising means of measuring 3D poses in real time. Recent works have proposed methods to obtain reliable, continuous skeleton data in cluttered environments with occlusions and inappropriate sensor placement. In this paper, we evaluate the reliability of an inverse dynamics method based on such corrected skeleton data and its potential use for estimating joint torques and forces in cluttered environments. To this end, we compared the computed joint torques with those obtained from a reference inverse dynamics method based on an optoelectronic motion capture system. Results show that the Kinect skeleton data enabled the inverse dynamics process to deliver reliable joint torques in both occlusion-free (r = 0.99 for left shoulder elevation) and occluded (r = 0.91 for left shoulder elevation) environments, although differences remain between the two joint torque estimates. Such reliable joint torques open appealing perspectives for new fatigue or solicitation indexes based on internal efforts measured on site.
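To make the comparison described above concrete, the following Python sketch runs a one-degree-of-freedom inverse dynamics computation (the upper arm modelled as a single rigid link rotating about the shoulder) on a smooth reference elevation trajectory and on a noisy "Kinect-like" copy of it, then reports the Pearson correlation between the two torque estimates. This is a minimal illustration only: the segment parameters, the synthetic trajectories, and the Savitzky-Golay smoothing step are assumptions standing in for the whole-body pipeline and skeleton-correction method evaluated in the paper.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import pearsonr

# Hypothetical parameters (not from the paper): one rigid link
# rotating about the shoulder in a vertical plane.
MASS = 2.0                 # kg, segment mass
COM = 0.13                 # m, centre-of-mass distance from the joint
INERTIA = MASS * COM**2    # kg.m^2, point-mass approximation
G = 9.81                   # m/s^2
DT = 1.0 / 30.0            # s, a typical Kinect frame period

def shoulder_torque(theta):
    """1-DoF inverse dynamics: tau = I*theta'' + m*g*c*sin(theta),
    with theta the elevation angle from the vertical (rad). The second
    derivative is taken with a Savitzky-Golay filter, standing in for
    the smoothing/correction applied to the skeleton data."""
    theta_dd = savgol_filter(theta, window_length=21, polyorder=3,
                             deriv=2, delta=DT)
    return INERTIA * theta_dd + MASS * G * COM * np.sin(theta)

# Synthetic elevation trajectories: a smooth reference movement and a
# noisier "Kinect-like" measurement of the same movement.
t = np.arange(0.0, 5.0, DT)
theta_ref = 0.5 * np.sin(2.0 * np.pi * 0.4 * t)
rng = np.random.default_rng(0)
theta_kinect = theta_ref + rng.normal(0.0, 0.005, t.size)

tau_ref = shoulder_torque(theta_ref)
tau_kinect = shoulder_torque(theta_kinect)

r, _ = pearsonr(tau_ref, tau_kinect)
print(f"Pearson r between torque estimates: {r:.2f}")
```

The smoothing step matters because numerical double differentiation amplifies high-frequency pose noise, which is one reason raw, occluded Kinect skeletons are problematic inputs for inverse dynamics and why the corrected skeleton data are needed.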
Original language: English
Pages (from-to): 71-80
Journal: International Journal of Industrial Ergonomics
Volume: 61
Early online date: 5 Jun 2017
Publication status: Published - Sep 2017
