A recurrent emotional CMAC neural network controller for vision-based mobile robots

Wubing Fang, Fei Chao, Longzhi Yang, Chih-Min Lin, Changjing Shang, Changle Zhou, Qiang Shen

Research output: Contribution to journal › Article › peer-review


Abstract

Vision-based mobile robots often suffer from highly nonlinear dynamics and demanding positioning-precision requirements, which call for more powerful nonlinear approximation in the control and monitoring of such robots. This paper proposes a recurrent emotional cerebellar model articulation controller (RECMAC) neural network to meet this demand. In particular, the proposed network integrates a recurrent loop and an emotional learning mechanism into a cerebellar model articulation controller (CMAC), which is implemented as the main component of the controller module of a vision-based mobile robot. Briefly, the controller module consists of a sliding surface, the RECMAC, and a compensator controller. Incorporating the recurrent structure into the sliding-mode neural network controller retains the previous states of the robot, improving its dynamic mapping ability. The convergence of the proposed system is guaranteed by Lyapunov stability analysis. The proposed system was validated and evaluated in both simulation and a practical moving-target tracking task. The experiments demonstrated that the proposed system outperforms other popular neural network-based control systems, and it is therefore superior in approximating the highly nonlinear dynamics involved in controlling vision-based mobile robots.
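The two ideas at the core of the abstract can be illustrated concretely: a CMAC approximates a nonlinear mapping by table lookup over several overlapping tilings of the input space, and the recurrent loop feeds the previous output back into the input so that past states shape the current response. The sketch below is a minimal, hypothetical illustration of these two mechanisms only; the class name, tiling scheme, and LMS-style learning rule are simplifications for exposition, not the paper's RECMAC, which additionally includes the emotional learning mechanism, the sliding surface, and the compensator controller.

```python
import numpy as np

class RecurrentCMAC:
    """Minimal CMAC with a recurrent input loop (illustrative sketch only)."""

    def __init__(self, n_tilings=4, n_bins=16, lr=0.2, recurrent_gain=0.3):
        self.n_tilings = n_tilings            # number of overlapping tilings
        self.n_bins = n_bins                  # cells per tiling
        self.lr = lr                          # learning rate
        self.recurrent_gain = recurrent_gain  # weight of the fed-back output
        self.w = np.zeros((n_tilings, n_bins))
        self.prev_out = 0.0                   # retained previous output (recurrence)

    def _cells(self, x):
        # Each tiling is shifted slightly so their cells overlap, giving the
        # coarse-coded, locally generalising lookup characteristic of a CMAC.
        cells = []
        for t in range(self.n_tilings):
            shifted = (x + t / (self.n_tilings * self.n_bins)) % 1.0
            cells.append(int(shifted * self.n_bins) % self.n_bins)
        return cells

    def step(self, x, target=None):
        # Recurrent loop: fold the previous output back into the input,
        # so the network's response depends on its past state.
        xr = (x + self.recurrent_gain * self.prev_out) % 1.0
        cells = self._cells(xr)
        out = sum(self.w[t, c] for t, c in enumerate(cells))
        if target is not None:
            # LMS-style update: spread the error equally over the active cells.
            delta = self.lr * (target - out) / self.n_tilings
            for t, c in enumerate(cells):
                self.w[t, c] += delta
            out = sum(self.w[t, c] for t, c in enumerate(cells))
        self.prev_out = float(out)
        return float(out)
```

With `recurrent_gain=0.0` this degenerates to a plain CMAC; a nonzero gain makes successive outputs depend on the trajectory of past outputs, which is the dynamic-mapping property the abstract attributes to the recurrent structure.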
Original language: English
Pages (from-to): 227-238
Number of pages: 12
Journal: Neurocomputing
Volume: 334
Early online date: 22 Jan 2019
DOIs
Publication status: Published - 21 Mar 2019

