Visual-guided Robotic Object Grasping using Dual Neural Network Controllers

Wubing Fang, Fei Chao*, Chih-Min Lin, Dajun Zhou, Longzhi Yang, Xiang Chang, Qiang Shen, Changjing Shang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

16 Citations (Scopus)
52 Downloads (Pure)


Accurately reaching and grasping objects remains a challenging task for a robotic arm and has drawn much research attention. This article proposes a robotic hand–eye coordination system that simulates the human behavior pattern to achieve fast and robust reaching. This is accomplished with two neural-network-based controllers: a rough reaching movement controller implemented by a pretrained radial basis function (RBF) network, and a correction movement controller built from a specifically designed brain emotional nesting network (BENN) for smooth correction movements. In particular, the proposed BENN is designed with high nonlinear mapping ability, and its adaptive laws are derived from the Lyapunov stability theorem; from this, robust tracking performance and thus the stability of the proposed control system are guaranteed through the H∞ control approach. The proposed BENN is validated and evaluated by a chaos synchronization simulation, and the overall control system by object grasping tasks performed with a physical robotic arm in a real-world environment. The experimental results demonstrate the superiority of the proposed control system over counterparts based on single neural networks.
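The two-stage scheme in the abstract can be illustrated with a minimal sketch: a pretrained RBF network produces a one-shot rough reach toward the visual target, and a closed-loop correction controller then iteratively shrinks the residual hand–target error. All parameter values here are illustrative, and the correction stage uses a simple proportional rule as a stand-in for the paper's BENN, whose structure is not given in this abstract.

```python
import numpy as np

def rbf_rough_controller(target, centers, widths, weights):
    """Rough reaching stage: a pretrained RBF network maps the visual
    target position to an approximate hand position (open-loop)."""
    # Gaussian activations of the hidden units over their centers
    phi = np.exp(-np.sum((centers - target) ** 2, axis=1) / (2 * widths ** 2))
    return weights.T @ phi  # linear output layer

def correction_step(error, gain=0.5):
    """Correction stage: proportional stand-in for the BENN, applied
    iteratively until the residual error is negligible."""
    return gain * error

# Toy 2-D workspace with random "pretrained" RBF parameters (illustrative)
rng = np.random.default_rng(0)
centers = rng.uniform(-1.0, 1.0, size=(10, 2))
widths = np.full(10, 0.5)
weights = rng.normal(size=(10, 2))

target = np.array([0.3, -0.2])
hand = rbf_rough_controller(target, centers, widths, weights)  # rough reach
for _ in range(50):                        # closed-loop correction phase
    hand = hand + correction_step(target - hand)

print(np.linalg.norm(target - hand))  # residual error after correction
```

The split mirrors the human pattern the paper emulates: a fast, coarse ballistic movement followed by slower visually guided corrections.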
Original language: English
Article number: 9095225
Pages (from-to): 2282-2291
Number of pages: 10
Journal: IEEE Transactions on Industrial Informatics
Issue number: 3
Early online date: 18 May 2020
Publication status: Published - Mar 2021

