Generating 3D architectural models based on hand motion and gesture

Xiao Yi, Sheng-feng Qin, Jinsheng Kang

Research output: Contribution to journal › Article › peer-review

14 Citations (Scopus)

Abstract

This paper presents a novel method for rapidly generating 3D architectural models from hand motion and design gestures captured by a motion capture system. A set of sign language-based gestures, architectural hand signs (AHS), has been developed. AHS is performed with the left hand to define various "components of architecture", while "location, size and shape" information is defined by the motion of a marker pen held in the right hand. The hand gestures and motions are recognized by the system and then converted into the corresponding 3D curves and surfaces. The paper demonstrates the hand gesture-aided architectural modelling method with several case studies.
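As a rough illustration of the pipeline the abstract describes (and not the authors' implementation), the sketch below assumes a hypothetical mapping from a recognized left-hand sign to a component type, and treats the right-hand marker-pen trajectory as a base curve that is swept into a simple surface. All names, parameters and default heights are illustrative assumptions.

```python
# Hypothetical sketch of the gesture-to-geometry pipeline: a recognized
# left-hand sign names the component, while the right-hand marker-pen
# trajectory supplies location, size and shape. Values are illustrative,
# not taken from the paper.

from dataclasses import dataclass
from typing import List, Tuple

Point3 = Tuple[float, float, float]

# Assumed mapping from architectural hand signs (AHS) to component types
# and default extrusion heights in metres (hypothetical values).
AHS_COMPONENTS = {
    "wall": 3.0,
    "roof": 0.5,
    "floor": 0.2,
}

@dataclass
class Component:
    kind: str
    base_curve: List[Point3]  # polyline traced by the marker pen
    surface_quads: List[Tuple[Point3, Point3, Point3, Point3]]

def extrude_curve(curve: List[Point3], height: float):
    """Sweep the traced base curve vertically to form a simple surface."""
    quads = []
    for (x0, y0, z0), (x1, y1, z1) in zip(curve, curve[1:]):
        quads.append((
            (x0, y0, z0), (x1, y1, z1),
            (x1, y1, z1 + height), (x0, y0, z0 + height),
        ))
    return quads

def build_component(sign: str, marker_trajectory: List[Point3]) -> Component:
    """Combine a recognized left-hand sign with the right-hand pen motion."""
    height = AHS_COMPONENTS[sign]
    return Component(sign, marker_trajectory,
                     extrude_curve(marker_trajectory, height))

if __name__ == "__main__":
    # Toy right-hand trajectory: an L-shaped wall footprint on the ground plane.
    trajectory = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 3.0, 0.0)]
    wall = build_component("wall", trajectory)
    print(f"{wall.kind}: {len(wall.surface_quads)} surface quads")
```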
Original language: English
Pages (from-to): 677-685
Journal: Computers in Industry
Volume: 60
Issue number: 9
DOIs
Publication status: Published - 2009

Keywords

  • Motion capture
  • Architecture model
  • Hand gesture
  • Sign language
  • Conceptual design
