Automated cross-modal mapping in robotic eye/hand systems using plastic radial basis function networks

Qinggang Meng, Mark Lee

Research output: Contribution to journal › Article › peer-review

18 Citations (SciVal)


Advanced autonomous artificial systems will need incremental learning and adaptive abilities similar to those seen in humans. Knowledge from biology, psychology and neuroscience is now inspiring new approaches for systems that have sensory-motor capabilities and operate in complex environments. Eye/hand coordination is an important cross-modal cognitive function, and is typical of many other coordinations involved in the control and operation of embodied intelligent systems. This paper examines a biologically inspired approach for incrementally constructing compact mapping networks for eye/hand coordination. We present a simplified node-decoupled extended Kalman filter for radial basis function networks, and compare it with other learning algorithms. An experimental system consisting of a robot arm and a pan-and-tilt head with a colour camera is used to produce results and test the algorithms. We also present three approaches for adapting to structural changes during eye/hand coordination tasks, and investigate the robustness of the algorithms under noise. The learning and adaptation approaches in this paper have similarities with current ideas about neural growth in the brains of humans and animals during tool use, and in infants during early cognitive development.
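The abstract names a node-decoupled extended Kalman filter (NDEKF) for training radial basis function networks. As a rough illustration of the decoupling idea only, and not a reconstruction of the paper's actual algorithm, the sketch below trains a small RBF network on an invented 1-D sine-fitting task: each hidden node keeps its own 3×3 covariance block over its weight, centre and width, instead of the single full covariance matrix a global EKF would carry. All parameter values and the task itself are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class NDEKFRBF:
    """RBF network trained with a node-decoupled extended Kalman filter:
    each hidden node has its own small covariance block, so the cost per
    update grows linearly in the number of nodes."""

    def __init__(self, n_nodes, x_lo, x_hi, r=0.05, q=1e-6):
        self.n = n_nodes
        # Per-node parameter vector: [weight w, centre c, width s].
        self.theta = [np.array([0.0, c, 0.5])
                      for c in np.linspace(x_lo, x_hi, n_nodes)]
        # Per-node 3x3 covariance blocks (the "decoupling").
        self.P = [np.eye(3) for _ in range(n_nodes)]
        self.r = r  # assumed measurement-noise variance
        self.q = q  # small process noise keeps the filter plastic

    def _phi(self, x):
        # Gaussian basis function activations for scalar input x.
        return np.array([np.exp(-(x - th[1]) ** 2 / (2.0 * th[2] ** 2))
                         for th in self.theta])

    def predict(self, x):
        return sum(th[0] * p for th, p in zip(self.theta, self._phi(x)))

    def update(self, x, d):
        phi = self._phi(x)
        e = d - sum(th[0] * p for th, p in zip(self.theta, phi))
        # Per-node Jacobians of the output w.r.t. [w, c, s].
        H, S = [], self.r
        for i, (th, p) in enumerate(zip(self.theta, phi)):
            w, c, s = th
            h = np.array([p,
                          w * p * (x - c) / s ** 2,
                          w * p * (x - c) ** 2 / s ** 3])
            H.append(h)
            S += h @ self.P[i] @ h  # accumulate innovation variance
        for i in range(self.n):
            K = self.P[i] @ H[i] / S            # decoupled Kalman gain
            self.theta[i] = self.theta[i] + K * e
            self.theta[i][2] = max(self.theta[i][2], 1e-2)  # keep width positive
            self.P[i] = (self.P[i] - np.outer(K, H[i]) @ self.P[i]
                         + self.q * np.eye(3))
        return e

# Invented demo task: learn y = sin(x) on [0, 2*pi] from random samples.
net = NDEKFRBF(n_nodes=8, x_lo=0.0, x_hi=2 * np.pi)
for x in rng.uniform(0.0, 2 * np.pi, 500):
    net.update(x, np.sin(x))

test_x = np.linspace(0.0, 2 * np.pi, 50)
mse = np.mean([(net.predict(x) - np.sin(x)) ** 2 for x in test_x])
print(round(float(mse), 4))
```

The per-node covariance blocks are what make the filter "node-decoupled": cross-covariances between different hidden nodes are dropped, trading some optimality for a much cheaper update, which is the usual motivation for NDEKF-style training of online networks.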
Original language: English
Pages (from-to): 25-52
Number of pages: 28
Journal: Connection Science
Issue number: 1
Publication status: Published - 2007


  • Biologically inspired robot learning
  • Extended Kalman filter
  • Plasticity in radial-basis function networks
  • Robotics

