Modern smartphones contain high-precision sensors that monitor the device's three-dimensional movement and the user's behaviour. These sensors allow mobile devices to recognise motion gestures. However, only a few gesture sets have been created, and little is known about best practices in motion-gesture design. Moreover, existing gesture sets were elicited from people who rarely rely on motion gestures in their daily lives, rather than from people with communication difficulties. To address this gap, we designed user-defined gesture sets with a focus group of people with dyslexia and other specific learning difficulties. This paper presents the results of a study that elicited the focus group's gestures for invoking commands on a smartphone. It demonstrates how the gesture sets were designed and finalised through complementary research activities, such as observation and interviews. Finally, we suggest that our results could help people with communication impairments interact with others conveniently, intuitively, and in a socially acceptable manner.
|Title of host publication||Proceedings of 2011 17th International Conference on Automation and Computing (ICAC)|
|Place of Publication||Piscataway, NJ|
|Publication status||Published - 10 Sep 2011|
|Event||2011 17th International Conference on Automation and Computing (ICAC) - Huddersfield|
Duration: 10 Sep 2011 → …