TY - JOUR
T1 - Visual-guided Robotic Object Grasping using Dual Neural Network Controllers
AU - Fang, Wubing
AU - Chao, Fei
AU - Lin, Chih-Min
AU - Zhou, Dajun
AU - Yang, Longzhi
AU - Chang, Xiang
AU - Shen, Qiang
AU - Shang, Changjing
N1 - Funding Information:
Manuscript received January 30, 2020; revised April 15, 2020; accepted May 10, 2020. Date of publication May 18, 2020; date of current version November 20, 2020. This work was supported in part by the Fundamental Research Funds for the Central Universities under Grant 20720190142, in part by the National Natural Science Foundation of China under Grant 61673322, Grant 61673326, and Grant 91746103, and in part by the European Union's Horizon 2020 Research and Innovation Programme under Marie Skłodowska-Curie Grant 663830. Paper no. TII-20-0448. (Corresponding author: Fei Chao.) Wubing Fang is with the Department of Artificial Intelligence, School of Informatics, Xiamen University, Xiamen 361005, China (e-mail: [email protected]).
PY - 2021/3
Y1 - 2021/3
N2 - Accurately reaching and grasping objects remains a challenging task for robotic arms, and it has drawn much research attention. This article proposes a robotic hand–eye coordination system that simulates the human behavior pattern to achieve fast and robust reaching. This is achieved by two neural-network-based controllers: a rough reaching movement controller implemented by a pretrained radial basis function network, and a correction movement controller built from a specifically designed brain emotional nesting network (BENN) for smooth correction movements. In particular, the proposed BENN is designed with high nonlinear mapping ability, and its adaptive laws are derived from the Lyapunov stability theorem; from this, the robust tracking performance, and accordingly the stability of the proposed control system, are guaranteed through the H∞ control approach. The proposed BENN is validated and evaluated by a chaos synchronization simulation, and the overall control system by object-grasping tasks performed with a physical robotic arm in a real-world environment. The experimental results demonstrate the superiority of the proposed control system in comparison with those using single neural networks.
AB - Accurately reaching and grasping objects remains a challenging task for robotic arms, and it has drawn much research attention. This article proposes a robotic hand–eye coordination system that simulates the human behavior pattern to achieve fast and robust reaching. This is achieved by two neural-network-based controllers: a rough reaching movement controller implemented by a pretrained radial basis function network, and a correction movement controller built from a specifically designed brain emotional nesting network (BENN) for smooth correction movements. In particular, the proposed BENN is designed with high nonlinear mapping ability, and its adaptive laws are derived from the Lyapunov stability theorem; from this, the robust tracking performance, and accordingly the stability of the proposed control system, are guaranteed through the H∞ control approach. The proposed BENN is validated and evaluated by a chaos synchronization simulation, and the overall control system by object-grasping tasks performed with a physical robotic arm in a real-world environment. The experimental results demonstrate the superiority of the proposed control system in comparison with those using single neural networks.
KW - Neural-network-based controller
KW - robotic hand-eye coordination
KW - robotic reaching movement
UR - http://www.scopus.com/inward/record.url?scp=85097710486&partnerID=8YFLogxK
U2 - 10.1109/tii.2020.2995142
DO - 10.1109/tii.2020.2995142
M3 - Article
AN - SCOPUS:85097710486
SN - 1551-3203
VL - 17
SP - 2282
EP - 2291
JO - IEEE Transactions on Industrial Informatics
JF - IEEE Transactions on Industrial Informatics
IS - 3
M1 - 9095225
ER -