Gaze modulated visual search integrating spatial and feature data: Embodied visual memory for robotic systems II

Martin Hülse, Sebastian McBride, Mark Lee

Research output: Conference contribution › Poster › peer-reviewed

Abstract

Substantial evidence supports the role of the lateral intraparietal area (LIP) of the brain as the central processing point where bottom-up visual information is modulated by top-down task information from higher cortical structures. LIP also maintains a global egocentric, as opposed to a local retinotopic, mapping and is therefore considered critical for accumulating a coherent view of the surrounding environment from an ever-changing visual scene. We have developed an active vision system architecture based on the LIP structure as its central element. This architecture, an extension of that previously presented, now incorporates feature data and can modulate visual search according to specific object properties. The architecture is discussed in terms of its ability to generate visual search for active robotic vision systems.
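The core mechanism described above, bottom-up visual saliency weighted by top-down task relevance to select a gaze target, can be illustrated with a minimal sketch. This is a generic toy model, not the authors' implementation; the function name, map sizes, and weight values are hypothetical.

```python
import numpy as np

def modulated_search(feature_maps, top_down_weights):
    """Hypothetical sketch: weight each bottom-up feature map by its
    top-down task relevance, sum into a single salience map, and return
    the coordinates of the most salient location as the gaze target."""
    salience = sum(w * m for w, m in zip(top_down_weights, feature_maps))
    return np.unravel_index(np.argmax(salience), salience.shape)

# Two toy 4x4 bottom-up feature maps (e.g. colour and orientation
# conspicuity), each with a single salient location.
colour = np.zeros((4, 4)); colour[1, 2] = 1.0
orient = np.zeros((4, 4)); orient[3, 0] = 1.0

# A task favouring colour biases the search toward the colour-salient
# location; reversing the weights would shift gaze to the other target.
print(modulated_search([colour, orient], [0.9, 0.1]))  # -> (1, 2)
```

Swapping the weight vector to `[0.1, 0.9]` redirects the selected gaze target to the orientation-salient location, which is the sense in which object properties modulate the search.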
Original language: English
Pages: 167-168
Number of pages: 2
Status: Published - Nov 2010
