Multisensory representation of situations

Main investigator:
Prof. Dr. Giorgio Metta
Fondazione Istituto Italiano Di Tecnologia

Collaboration partners:

  • Universität Hamburg
  • Fraunhofer-Institut für Produktionstechnik und Automatisierung IPA

Competence Area: Situation


Objectives

To develop a rich body representation for the iCub robot that integrates visual, tactile, force, and motor signals. This representation will enable control of the robot in the presence of multiple potential or actual contacts: for example, it will determine that certain configurations are convenient for supporting the robot during task execution, and the robot will then actively seek contact with the environment rather than avoid it. The model will also be compared with an existing neuroscience model of the so-called peripersonal space representation.


Expected Results

The main result will be a set of software libraries containing learning methods (e.g. probability density estimation, regression) and sensory-processing algorithms (e.g. optical flow estimation, stereo vision). These algorithms will jointly contribute to the acquisition of the body and peripersonal-space representations.
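As an illustration of the kind of learning method mentioned above, the following sketch (an assumption for this summary, not project code) builds a toy peripersonal-space representation: a Gaussian kernel density estimate over recorded contact locations, so that regions where contacts have frequently occurred receive high density. The contact coordinates and bandwidth are invented for the example.

```python
import math
import random

def gaussian_kde(samples, bandwidth):
    """Return a density estimator over 3-D points (one Gaussian kernel per sample)."""
    d = 3
    norm = (2 * math.pi * bandwidth ** 2) ** (-d / 2) / len(samples)

    def density(x):
        total = 0.0
        for s in samples:
            sq = sum((xi - si) ** 2 for xi, si in zip(x, s))
            total += math.exp(-sq / (2 * bandwidth ** 2))
        return norm * total

    return density

# Hypothetical contact events clustered near one body part (coordinates in metres).
random.seed(0)
contacts = [(0.30 + random.gauss(0, 0.02),
             0.10 + random.gauss(0, 0.02),
             0.05 + random.gauss(0, 0.02)) for _ in range(200)]

density = gaussian_kde(contacts, bandwidth=0.03)

# The estimator assigns high density near the contact cluster and low
# density far away, giving a crude spatial prior on likely contacts.
near = density((0.30, 0.10, 0.05))
far = density((0.80, 0.50, 0.40))
print(near > far)
```

In the actual project, the contact locations would come from the robot's tactile skin and forward kinematics rather than being simulated, and the density would be one ingredient of the learned representation rather than the representation itself.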