Abstract
Gestures are both natural and intuitive for Human-Computer Interaction (HCI), and one-shot learning is one of the most common real-world settings for gesture recognition. In this demo, we present a hand gesture recognition system using the Kinect sensor that addresses one-shot learning gesture recognition with a user-defined training and testing procedure. Such a system can behave like a remote control: the user can assign a specific function to a preferred gesture by performing it only once. Within the gesture recognition framework, the system first automatically segments an action sequence into atomic tokens and then adopts the Extended Motion History Image (Extended-MHI) for motion feature representation. We evaluate the performance of our system quantitatively on the ChaLearn Gesture Challenge and apply it to a virtual one-shot learning gesture recognition system.
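For readers unfamiliar with motion history images, the sketch below illustrates the standard MHI update rule in Python/NumPy: each pixel stores the timestamp of its most recent motion, and stale entries outside a fixed time window are cleared. This is only a minimal sketch; the Extended-MHI used in the system adds further components beyond this baseline, and the function and parameter names (`update_mhi`, `mhi_feature`, `diff_thresh`, etc.) are illustrative rather than taken from the paper.

```python
import numpy as np

def update_mhi(mhi, prev_frame, curr_frame, timestamp, duration=1.0, diff_thresh=30):
    """Update a Motion History Image with the motion between two consecutive frames.

    mhi        : float array holding, per pixel, the timestamp of its most recent motion
    prev_frame : previous grayscale/depth frame (uint8)
    curr_frame : current grayscale/depth frame (uint8)
    timestamp  : time of curr_frame, in the same units as `duration`
    """
    # Binary motion mask from simple frame differencing
    motion = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16)) > diff_thresh
    # Pixels that moved are stamped with the current time ...
    mhi[motion] = timestamp
    # ... and stale pixels older than the history window are cleared
    mhi[~motion & (mhi < timestamp - duration)] = 0.0
    return mhi

def mhi_feature(frames, fps=30.0, duration=1.0):
    """Build an MHI over a segmented gesture token and normalize it to [0, 1]."""
    mhi = np.zeros(frames[0].shape, dtype=np.float64)
    for i in range(1, len(frames)):
        mhi = update_mhi(mhi, frames[i - 1], frames[i], timestamp=i / fps, duration=duration)
    t_end = (len(frames) - 1) / fps
    # Map timestamps to intensities: recent motion bright, older motion darker
    scaled = np.clip((mhi - (t_end - duration)) / duration, 0.0, 1.0)
    scaled[mhi == 0] = 0.0  # pixels that never moved stay at zero
    return scaled
```

In a one-shot setting, one such normalized motion map per automatically segmented token could serve as the template against which test tokens are matched.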
| Original language | English |
| --- | --- |
| DOIs | |
| Publication status | Published - Nov 2012 |
| Event | ACMMM 2012 - 20th Anniversary ACM Multimedia Conference, Nara, Japan. Duration: 1 Nov 2012 → … |
Conference
| Conference | ACMMM 2012 - 20th Anniversary ACM Multimedia Conference |
| --- | --- |
| Period | 1/11/12 → … |
Keywords
- One Shot Learning
- Hand Gesture Recognition
- Human-Computer-Interaction
- RGBD Camera