Image processing and gesture recognition in human action-outcome learning experiments

Olivier Mangin, Stefano Toxiri, Luca Lonini

    Research output: Contribution to conference › Paper


    Abstract

    The Microsoft Kinect provides an off-the-shelf sensor that can reliably capture information from body movements in real time. We implemented an on-line gesture recognition system on top of the Kinect's hand-tracking capabilities. The system performs real-time classification of the user's hand gestures by comparing the current movement to a set of nine predefined template gestures. Gestures are detected when the moving hand exceeds a threshold speed for a minimum duration. The ultimate goal of this work is to study action-outcome learning in humans: how does a person figure out which actions have an effect on the environment, and how does he or she shape a gesture to produce that outcome? To this end, we improved the recognition algorithm by allowing the dictionary of template gestures to adapt to the way the user performs them. This allows a shared representation of each gesture to emerge between the human and the computer as the user interacts with the system. The approach opens new perspectives for designing and studying interactions between humans and machines, as well as for studies of how motor-impaired patients interact with such systems.
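    The pipeline the abstract describes — segment a gesture when hand speed stays above a threshold for a minimum duration, classify it against a template dictionary, and adapt the templates toward the user's performance — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold, frame rate, resampling length, distance measure (mean Euclidean distance after resampling), and the moving-average adaptation rule are all assumptions chosen for clarity.

    ```python
    import numpy as np

    SPEED_THRESHOLD = 0.5   # hypothetical units/s; the paper does not state values
    MIN_DURATION = 5        # minimum number of above-threshold frames per gesture

    def segment_gesture(positions, dt=1 / 30):
        """Return (start, end) frame indices of the first span where hand speed
        exceeds SPEED_THRESHOLD for at least MIN_DURATION frames, else None."""
        speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
        fast = speeds > SPEED_THRESHOLD
        start = None
        for i, f in enumerate(fast):
            if f and start is None:
                start = i
            elif not f and start is not None:
                if i - start >= MIN_DURATION:
                    return start, i
                start = None
        if start is not None and len(fast) - start >= MIN_DURATION:
            return start, len(fast)
        return None

    def resample(path, n=20):
        """Resample a trajectory to n points by linear interpolation,
        so trajectories of different lengths become comparable."""
        path = np.asarray(path, float)
        t = np.linspace(0, 1, len(path))
        tn = np.linspace(0, 1, n)
        return np.column_stack([np.interp(tn, t, path[:, d])
                                for d in range(path.shape[1])])

    def classify(gesture, templates):
        """Nearest-template label by mean point-wise Euclidean distance."""
        g = resample(gesture)
        dists = {label: np.mean(np.linalg.norm(g - resample(tpl), axis=1))
                 for label, tpl in templates.items()}
        return min(dists, key=dists.get)

    def adapt_template(template, performed, alpha=0.1):
        """Move a stored template a small step toward the user's performance —
        a simple stand-in for the paper's adaptive gesture dictionary."""
        return (1 - alpha) * resample(template) + alpha * resample(performed)
    ```

    A dictionary mapping labels to recorded trajectories plays the role of the nine predefined templates; calling `adapt_template` after each recognized gesture lets the stored shapes drift toward the user's habitual movements.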
    Original language: English
    Number of pages: 7
    Publication status: Published - 26 Sept 2011
    Event: Capo Caccia Cognitive Neuromorphic Engineering Workshop - Aberystwyth, United Kingdom of Great Britain and Northern Ireland
    Duration: 01 May 2011 - 07 May 2011

    Conference

    Conference: Capo Caccia Cognitive Neuromorphic Engineering Workshop
    Country/Territory: United Kingdom of Great Britain and Northern Ireland
    City: Aberystwyth
    Period: 01 May 2011 - 07 May 2011
