Francois FOERSTER is a Marie-Curie PhD fellow at the University of Plymouth (UK), where he developed an original research paradigm combining EEG recordings and Virtual Reality, simulating our everyday interaction with objects.
He captures in real time the neural activity responsible for our ability to grasp and use objects.
His research focuses on the Cognitive Neuroscience of Perception and Action: how assemblies of neurons synchronise during object recognition and motor control.
He holds a BSc in Psychology from the University of Strasbourg (France) and an MSc in Cognitive Science from the Grenoble Institute of Technology (France).
His previous research focused on spatial memory in humans and rats (LPNC, Grenoble / LNCA, Strasbourg).
He also worked on human-robot interaction (HRI) for non-verbal communication using the iCub robot (GIPSA-lab, Grenoble).
Paper (Humanoids 2015) available: https://hal.archives-ouvertes.fr/hal-01228887/document
I conduct human neurocognitive examinations (EEG) of the processes involved in effector-object interaction, with the aim of enhancing cognitive architectures for object manipulation in robots.
In embodied models of cognition, our representations of objects are formed around the motor programs used to manipulate them. This means that when viewing objects, we not only automatically prepare relevant actions, but also automatically recollect object knowledge derived from our everyday experience.
The next generation of robots should also benefit from such cognitive capacities, to solve common issues in object manipulation and demonstrate human-like sensorimotor skills.
More generally, my work offers new perspectives on how, and through which neuronal mechanisms, we achieve our fast and efficient ability to manipulate objects.
To study how humans process objects and prepare motor interactions, I propose an original paradigm combining electroencephalography (EEG) and Virtual Reality (VR). On the one hand, EEG allows me to compare the brain activities reflecting how objects are processed (e.g. different shapes, sizes, locations in space) and how different actions are performed (e.g. grasping and moving an object, using a tool). On the other hand, VR allows me to create the objects to be manipulated (as Blender 3D models), in order to control their novelty. Finally, VR also provides an easy way to reproduce the experiments and to record the kinematics of the controller in space during these object-based actions.
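As a rough illustration of how such EEG comparisons work (a minimal sketch, not the actual analysis pipeline used in these studies), the sketch below cuts a continuous recording into epochs around object-onset events and averages them into an event-related potential; the helper `epoch_eeg` and all data here are hypothetical toy examples.

```python
import numpy as np

def epoch_eeg(eeg, events, sfreq, tmin=-0.2, tmax=0.8):
    """Cut continuous EEG (channels x samples) into trial epochs around event samples.

    Hypothetical helper for illustration: tmin/tmax are seconds relative to the event.
    """
    start = int(tmin * sfreq)   # e.g. -50 samples before the event
    stop = int(tmax * sfreq)    # e.g. 200 samples after the event
    epochs = np.stack([eeg[:, e + start:e + stop] for e in events])
    # Baseline-correct each epoch using the pre-stimulus interval.
    baseline = epochs[:, :, :-start].mean(axis=2, keepdims=True)
    return epochs - baseline

# Toy data: 4 channels, 10 s at 250 Hz, two event markers (e.g. object onsets).
sfreq = 250
eeg = np.random.randn(4, 10 * sfreq)
events = np.array([2 * sfreq, 6 * sfreq])

epochs = epoch_eeg(eeg, events, sfreq)
erp = epochs.mean(axis=0)  # average over trials -> event-related potential
print(epochs.shape, erp.shape)  # (2, 4, 250) (4, 250)
```

Comparing such trial-averaged responses (or their time-frequency transforms) between conditions, e.g. grasping versus tool use, is the basic logic behind the contrasts described above.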
First study: Object knowledge is extracted during visual processing, not motor planning.
How does the human brain implement different object-based actions, such as reaching, grasping and moving, or using a tool?
Our EEG results suggest that semantic properties of objects (e.g. the function of a tool) are accessed during object viewing rather than motor preparation.
This also means that the brain recollects object knowledge by default, independently of the motor task.
Study presented in:
BACN 2017: National Conference of the British Association of Cognitive Neuroscience, 7-8 September 2017, Plymouth University, UK.
CuttingEEG 2017: 3rd Symposium on cutting-edge methods for EEG research, June 19-22, Glasgow, UK.
Second study: The role of the mu and beta neural rhythms in the extraction of motor and semantic knowledge about objects
Our EEG analyses suggest that the mu and beta frequency bands play different roles in the extraction of motor and semantic information about objects for action.
Whereas the mu rhythm (8-13 Hz) appears to process motor information ("How" to manipulate an object), the beta rhythm (14-30 Hz) seems to process only the semantic properties of the object ("What" is its common purpose).
In the video, we depict our experiment, in which participants learnt "How" to use a novel tool (performing a key-like manipulation) and "What" the function of this novel tool is (opening a box).
Study presented in:
Movement & Cognition 2018: International Conference Movement & Cognition (Brain, Body, Cognition), 27-29 July 2018, Harvard Medical School, Boston, USA
Third study: The role of neuronal rhythms in object perception and the selection of object-based actions
How does our neuronal activity reflect our everyday ability to light a cigarette with a lighter or open a door with a key?
Our current experiment evaluates the neuronal mechanisms responsible for selecting between two tool uses (lighting a candle and opening a chest) associated with novel tools.
Our time-frequency analyses of the EEG recordings suggest that motor and semantic content during object perception and action selection rely on distinct frequency bands.
Whereas learnt motor and semantic information about objects is extracted through the faster mu (8-13 Hz) and beta (14-30 Hz) bands, selecting which of the actions associated with these objects to perform relies on the slower delta (1-4 Hz) and theta (4-8 Hz) bands.
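To make the band distinction concrete, here is a minimal sketch (not the actual analysis code) of how power in the delta, theta, mu and beta bands can be quantified from a single EEG signal, by integrating a Welch power spectral density over each band; the toy signal and the helper `band_power` are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

# Frequency bands as described above.
BANDS = {"delta": (1, 4), "theta": (4, 8), "mu": (8, 13), "beta": (14, 30)}

def band_power(signal, sfreq, band):
    """Integrate the Welch power spectral density over one frequency band."""
    freqs, psd = welch(signal, fs=sfreq, nperseg=sfreq * 2)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[mask], freqs[mask])

# Toy signal: a 10 Hz (mu-band) oscillation plus noise, 20 s at 250 Hz.
sfreq = 250
t = np.arange(0, 20, 1 / sfreq)
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)

powers = {name: band_power(signal, sfreq, b) for name, b in BANDS.items()}
# The mu band dominates here, since the injected oscillation sits at 10 Hz.
```

Comparing such band-power estimates between task stages (object viewing versus action selection) is the kind of contrast that can separate fast mu/beta extraction from slow delta/theta selection processes.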
My next goal is to provide a new understanding of interactive object manipulation, by examining how action affordances are modulated when objects are shared and manipulated between multiple agents, both human and robot. Electrophysiological experiments will examine the neural activity associated with action affordances during interactive object use, which will inform software models for robot object manipulation and human-robot collaboration.