Francois Foerster

Neurophysiological Assessment of Object-Based Actions in Humans

Principal Supervisor:
Dr. Jeremy Goslin
University of Plymouth

Collaboration partners:

  • Fraunhofer-Institut für Produktionstechnik und Automatisierung IPA
  • Telerobot Labs

Competence Area: Situation

Objectives

To conduct a neurocognitive examination of the processes underlying object-based action affordances, in order to inform interactive models of human-robot object manipulation. In embodied models of cognition, our representations of objects are formed around the motor programs used to manipulate them. This means that not only do we automatically prepare relevant actions when viewing objects, but also that our actions modulate our perception of our environment and of those interacting within it. Robots with a similar embodied architecture should also benefit from more seamless sensory-motor integration.

Current work

To study how humans process objects and prepare motor interactions, I propose an original paradigm combining electroencephalography (EEG) and virtual reality (VR). On the one hand, EEG allows me to compare the brain activity associated with how objects are processed (e.g., different shapes, sizes, or locations in space) and how different actions are performed (e.g., grasping and moving an object, using a tool). On the other hand, VR allows me to create the objects to be manipulated (as Blender 3D models), giving me control over their novelty. Finally, it also provides an easy way to reproduce the experiments and to record the kinematics of the controller in space during these object-based actions.
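
As a hedged illustration of the EEG side of this paradigm, the sketch below compares event-related potentials between two hypothetical object conditions (a graspable object versus a tool) using MNE-Python on synthetic epoched data; the channel layout, event codes, and condition names are placeholders rather than the actual experimental design.

```python
# Minimal sketch (assumption: analysis in MNE-Python; channels, event codes
# and the two object conditions are illustrative placeholders).
import numpy as np
import mne

sfreq = 500  # Hz
n_channels, n_epochs, n_times = 32, 40, int(0.8 * sfreq)
info = mne.create_info([f"EEG{i:02d}" for i in range(n_channels)],
                       sfreq, ch_types="eeg")

# Synthetic epoched data standing in for recordings time-locked to object onset.
rng = np.random.default_rng(0)
data = rng.normal(scale=1e-6, size=(n_epochs, n_channels, n_times))

# Event codes: 1 = graspable object, 2 = tool (hypothetical conditions).
events = np.column_stack([(np.arange(n_epochs) + 1) * n_times,
                          np.zeros(n_epochs, int),
                          np.repeat([1, 2], n_epochs // 2)])
epochs = mne.EpochsArray(data, info, events=events, tmin=-0.2,
                         event_id={"grasp_object": 1, "tool": 2})

# Compare event-related potentials between the two object conditions.
evoked_grasp = epochs["grasp_object"].average()
evoked_tool = epochs["tool"].average()
mne.viz.plot_compare_evokeds({"grasp_object": evoked_grasp,
                              "tool": evoked_tool}, picks="eeg")
```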

A first study

How does the human brain implement different object-based actions, such as reaching, grasping and moving an object, or using a tool?
https://www.youtube.com/watch?v=Y20SEX14Az4

Expected Results

To provide a new understanding of interactive object manipulation by examining how action affordances are modulated when objects are shared and manipulated by multiple agents, both human and robot. Electrophysiological experiments will examine the neural activity associated with action affordances during interactive object use, which will inform software models for robot object manipulation and human-robot collaboration.