User-defined gesture sets using a mobile device for people with communication difficulties

Haoyun Xue, Sheng-feng Qin

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

1 Citation (Scopus)

Abstract

Modern smartphones contain sensors that monitor the three-dimensional movement of the device and users' behaviours, allowing mobile devices to recognise motion gestures. However, only a few gesture sets have been created, and little is known about best practices in motion-gesture design. Moreover, the existing gesture sets were generated by people who rarely need motion gestures in their daily lives, not by people with communication difficulties. To address this issue, we designed user-defined gesture sets with a focus group of people with dyslexia and other specific learning difficulties. This paper presents the results of a study eliciting the focus group's gestures for invoking commands on a smartphone, and demonstrates how the gesture sets were designed and finalised through complementary research activities such as observation and interviews. Finally, we suggest that our results could help people with communication impairments interact with others conveniently, intuitively, and in a socially acceptable manner.
Original language: English
Title of host publication: Proceedings of 2011 17th International Conference on Automation and Computing (ICAC)
Place of publication: Piscataway, NJ
Publisher: IEEE
Pages: 34-39
ISBN (Print): 9781467300001
Publication status: Published - 10 Sep 2011
Event: 2011 17th International Conference on Automation and Computing (ICAC) - Huddersfield
Duration: 10 Sep 2011 → …

Conference

Conference: 2011 17th International Conference on Automation and Computing (ICAC)
Period: 10/09/11 → …

