ISign: An architecture for humanoid assisted sign language tutoring

Hatice Kose, Neziha Akalin, Rabia Yorganci, Bekir S. Ertugrul, Hasan Kivrak, Semih Kavak, Ahmet Ozkul, Cemal Gurpinar, Pinar Uluer, Gökhan Ince

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)


This paper investigates the role of interaction and communication kinesics in human-robot interaction. It is based on a project on Sign Language (SL) tutoring through interaction games with humanoid robots. The aim of the study is to design a computational framework that motivates children with communication problems (i.e., ASD and hearing impairments) to understand and imitate the signs performed by the robot, using basic upper-torso gestures and sound in a turn-taking manner. This framework consists of modular computational components that endow the robot with the capability of perceiving the actions of the children, carrying out a game or storytelling task, and tutoring the children in any desired mode, i.e., supervised or semi-supervised. Visual (colored cards), vocal (storytelling, music), touch (tactile sensors on the robot used to communicate), and motion (recognition and performance of gestures, including signs) cues are proposed for multimodal communication between the robot, the child, and the therapist/parent. We present an empirical and exploratory study investigating the effect of basic non-verbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the child gives relevant feedback in SL or visually to the robot, according to the context of the game.
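The turn-taking tutoring loop described above (robot performs a sign, perceives the child's multimodal feedback, and responds) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation; all class and method names, and the stubbed perception logic, are hypothetical.

```python
"""Hypothetical sketch of a turn-taking sign-tutoring round, assuming a
supervised mode in which the robot performs one sign per round and then
perceives the child's feedback (gesture, colored card, or touch)."""

from dataclasses import dataclass
import random

@dataclass
class Feedback:
    modality: str   # "sign", "card", or "touch" (the cues named in the abstract)
    correct: bool

class SignTutor:
    """Drives one tutoring round per sign in a fixed lesson plan."""

    def __init__(self, signs):
        self.signs = list(signs)

    def perform_sign(self, sign: str) -> None:
        # Placeholder for the robot's upper-torso gesture and sound output.
        print(f"[robot] performing sign: {sign}")

    def perceive(self) -> Feedback:
        # Placeholder for the perception modules (gesture recognition,
        # colored-card detection, tactile sensors); randomized here.
        modality = random.choice(["sign", "card", "touch"])
        return Feedback(modality=modality, correct=random.random() > 0.3)

    def run_round(self, sign: str) -> bool:
        self.perform_sign(sign)
        fb = self.perceive()
        if fb.correct:
            print(f"[robot] praise: correct {fb.modality} response")
        else:
            print(f"[robot] encouragement: repeating '{sign}'")
        return fb.correct

tutor = SignTutor(["hello", "thank you", "play"])
results = [tutor.run_round(s) for s in tutor.signs]
```

In a semi-supervised mode, the therapist/parent could override `perceive()` with a manual judgment; the modular split between performance, perception, and response is the point of the sketch.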

Original language: English
Pages (from-to): 157-184
Number of pages: 28
Journal: Springer Tracts in Advanced Robotics
Publication status: Published - 2015
Externally published: Yes
