Generating 3D architectural models based on hand motion and gesture

Xiao Yi, Sheng-feng Qin, Jinsheng Kang

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)


This paper presents a novel method for rapidly generating 3D architectural models from hand motion and design gestures captured by a motion capture system. A set of sign language-based gestures, architectural hand signs (AHS), has been developed. AHS is performed with the left hand to specify various "components of architecture", while "location, size and shape" information is defined by the motion of a Marker-Pen held in the right hand. The hand gestures and motions are recognized by the system and then translated into 3D curves and surfaces, respectively. The paper demonstrates this hand gesture-aided architectural modeling method with several case studies.
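The two-handed scheme in the abstract can be sketched as follows. This is a minimal illustrative sketch only, not the authors' implementation: the sign names, the component vocabulary, and the bounding-box geometry below are all hypothetical stand-ins for the paper's actual AHS vocabulary and curve/surface generation.

```python
# Hypothetical sketch of the two-handed mapping described in the abstract:
# a recognized left-hand sign (AHS) selects a component type, while the
# right-hand Marker-Pen trajectory supplies location and size.

# Assumed sign-to-component vocabulary (illustrative only).
AHS_COMPONENTS = {
    "flat_palm": "wall",
    "arch_sign": "doorway",
    "fist": "column",
}

def build_component(left_sign, marker_path):
    """Map a left-hand sign to a component type and derive its
    location/extent from the right-hand Marker-Pen trajectory."""
    component = AHS_COMPONENTS.get(left_sign, "unknown")
    # Reduce the captured 3D trajectory to an axis-aligned bounding box,
    # standing in for the paper's curve/surface construction.
    xs = [p[0] for p in marker_path]
    ys = [p[1] for p in marker_path]
    zs = [p[2] for p in marker_path]
    bbox = ((min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs)))
    return {"type": component, "bbox": bbox}

# Example: a roughly rectangular stroke defines a wall's extent.
path = [(0, 0, 0), (4, 0, 0), (4, 0, 3), (0, 0, 3)]
print(build_component("flat_palm", path))
```

In this toy form, the left hand chooses *what* is built and the right hand chooses *where and how large*, mirroring the division of labor the abstract describes.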
Original language: English
Pages (from-to): 677-685
Journal: Computers in Industry
Issue number: 9
Publication status: Published - 2009