Enhanced Robotic Hand-eye Coordination inspired from Human-like Behavioral Patterns

Fei Chao, Zuyuan Zhu, Chih-Min Lin, Huosheng Hu, Longzhi Yang, Changjing Shang, Changle Zhou

Research output: Contribution to journal › Article › peer-review


Abstract

Robotic hand-eye coordination is recognized as an important skill for dealing with complex real-world environments. Conventional robotic hand-eye coordination methods merely transfer stimulus signals from robotic visual space to hand actuator space. This paper introduces the reverse approach: a second channel is built that transfers stimulus signals from robotic hand space to visual space. Based on this reverse channel, a human-like behavioral pattern, “Stop-to-Fixate”, is imparted to the robot, giving it an enhanced reaching ability. A visual processing system inspired by the structure of the human retina compresses visual information so as to reduce the robot’s learning complexity. In addition, two constructive neural networks establish the two sensory delivery channels. The experimental results demonstrate that the robotic system gradually acquires a reaching ability; in particular, when the robotic hand touches an unseen object, the reverse channel successfully drives the visual system to notice the unseen object.
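The two-channel idea in the abstract can be sketched in miniature. The paper itself uses constructive neural networks and retina-inspired visual compression; as a hedged illustration only, the sketch below stands in for both channels with least-squares affine fits learned from random "motor babbling" data. All names (`hand_to_retina_true`, `fit_affine`, the workspace map `A`, `b`) are illustrative assumptions, not the authors' implementation. The final lines mimic the "Stop-to-Fixate" behavior: after the hand touches an object outside the current view, the reverse channel (hand space → visual space) predicts where the eye should fixate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth affine map from hand (workspace) coordinates
# to retina (visual) coordinates -- assumed here only for illustration.
A = np.array([[0.8, -0.1],
              [0.2,  0.9]])
b = np.array([0.05, -0.03])

def hand_to_retina_true(h):
    """The unknown world relation both channels try to approximate."""
    return h @ A.T + b

# "Motor babbling": random hand positions paired with observed retina coords.
H = rng.uniform(-1.0, 1.0, size=(200, 2))
R = hand_to_retina_true(H)

def fit_affine(X, Y):
    """Least-squares affine fit Y ≈ [X, 1] W (stand-in for the paper's
    constructive networks)."""
    Xa = np.hstack([X, np.ones((len(X), 1))])
    W, *_ = np.linalg.lstsq(Xa, Y, rcond=None)
    return W

W_fwd = fit_affine(R, H)  # forward channel: visual space -> hand space
W_rev = fit_affine(H, R)  # reverse channel: hand space -> visual space

def predict(W, x):
    return np.append(x, 1.0) @ W

# "Stop-to-Fixate": the hand touches an unseen object at h_touch; the
# reverse channel drives the visual system toward the predicted location.
h_touch = np.array([0.3, -0.5])
fixation = predict(W_rev, h_touch)
```

Because the toy world relation is exactly affine, the fitted reverse channel recovers it almost perfectly; in the paper, the same role is played by a learned network mapping that remains useful even when the touched object is outside the visual field.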
Original language: English
Pages (from-to): 384-396
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 10
Issue number: 2
Early online date: 21 Oct 2016
DOIs
Publication status: Published - Jun 2018

