Dressing virtual humans from 3D scanned data

Hui Yu*, Shengfeng Qin, David Wright

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a new approach to dressing three-dimensional virtual human models from scanned data. As the first step in this process, an algorithm is developed to obtain key body features and a reference skeleton. Because body shape varies among people, especially across ethnicities, we propose a two-step method for finding body feature points. First, body feature regions are located using pre-defined parameters based on the proportion of head height to body height; we adopt the convention of describing the body as eight heads tall. Second, body features are extracted within their corresponding regions. In the second step, garment patterns are constructed from the body features and the reference skeleton described above using two methods: one is a radial offset method, the other a swept surface method.
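The abstract names two concrete ingredients: locating feature regions via the eight-heads-tall body proportion, and radially offsetting body cross-sections to form loose garment contours. The Python sketch below illustrates both ideas under stated assumptions; it is a minimal illustration, not the authors' implementation. The function names, the region bands, and the ease parameter are all hypothetical, and a y-up point cloud of scan vertices is assumed.

```python
import numpy as np

def locate_feature_regions(points):
    """Partition a body scan into vertical feature regions using the
    classical eight-heads-tall proportion (hypothetical bands; the
    paper's actual parameters are not given in the abstract).

    points: (N, 3) array of scan vertices, y-up.
    Returns a dict mapping region names to (y_min, y_max) height bands.
    """
    y = points[:, 1]
    top, bottom = y.max(), y.min()
    head_h = (top - bottom) / 8.0  # one "head" = 1/8 of body height

    # Bands in head units measured down from the top of the head.
    bands = {
        "head": (0, 1),
        "neck_to_chest": (1, 2),
        "chest_to_waist": (2, 3),
        "waist_to_hip": (3, 4),
        "upper_leg": (4, 6),
        "lower_leg": (6, 8),
    }
    return {name: (top - b * head_h, top - a * head_h)
            for name, (a, b) in bands.items()}

def radial_offset_slice(slice_pts, center, ease):
    """Offset one horizontal body cross-section radially outward from
    its center to form a garment contour (a sketch of the radial-offset
    idea; 'ease' is a hypothetical loose-fit distance in metres).

    slice_pts: (N, 2) contour points of one cross-section.
    center: (2,) centroid of the cross-section.
    """
    d = slice_pts - center
    r = np.linalg.norm(d, axis=1, keepdims=True)
    # Scale each point so its new radius is r + ease.
    return center + d * (r + ease) / np.maximum(r, 1e-9)

if __name__ == "__main__":
    pts = np.random.rand(1000, 3) * [0.5, 1.8, 0.3]  # fake 1.8 m scan
    print(locate_feature_regions(pts)["waist_to_hip"])
```

Feature search can then be restricted to each band (e.g. the waist within its band), and stacking offset slices along the reference skeleton yields a first garment surface; the swept surface method mentioned in the abstract would instead sweep a profile curve along that skeleton.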

Original language: English
Pages (from-to): 316-321
Number of pages: 6
Journal: Biomedical Sciences Instrumentation
Volume: 44
Publication status: Published - 8 May 2008
Externally published: Yes

Keywords

  • 3D scanned data
  • Dressing virtual humans
  • Garment generation
  • Semantic human model
