PRIN2022 - MulWalk

Multisensory integration of locomotion-related visual and somatomotor signals


Multisensory integration is essential for moving around in an adaptive manner. Specifically, during walking, proprioceptive signals from the lower limbs need to be merged with optic flow to visually guide locomotion through the environment. Multisensory integration is conceptually linked to the ‘magnitude’ of sensory stimuli, i.e. the perception of their extension along the dimensions of space and time. The magnitude of stimuli powerfully shapes our everyday predictions, which in turn shape our behavior. However, although multisensory integration has been extensively investigated in neuroscience, where it occurs and how it is orchestrated in the brain as a whole remain largely unexplored.
MulWalk aims to fill this gap by studying the multisensory integration of space and time from a completely new perspective: looking at the interactions between different parameters within a broader concept of moving that includes real walking and hand actions within the environment. To this end, we will develop a multidisciplinary approach that combines human and macaque data with cutting-edge methods, including psychophysics, brain imaging, neural recordings, virtual reality stimulation, and modern computational approaches such as deep learning-based models.

Functional MRI analyses, neural recordings and fMRI-adaptation paradigms will investigate the neural bases of spatial and temporal multisensory integration in the human and macaque brain and identify cortical regions able to integrate visual ego-motion and proprioceptive signals during locomotion. Psychophysical experiments on human subjects will assess how the perception of visual parameters, such as size and distance, interacts with walking, hand and foot actions. Deep learning approaches will be used to build predictive models that infer perceptual and motor features from neural activity and translate measures of neural activity in humans and macaques into a common feature space, enabling a better comparison of the neural multisensory representations across the two species and of their sensory and behavioral predictive power. This multidisciplinary approach will allow us to identify the neural, behavioral, and computational effects of multisensory integration at different levels of processing during a real, ecological whole-body action such as walking.

Given the dramatic effects of degenerative diseases on daily activities that involve self-navigation and visuomotor interactions with objects and other people, a further aim of MulWalk is to explore potential clinical and technological applications. In line with the Strategic Objectives of the PNR 21-27, the knowledge gathered from behavioral and neural observations in healthy participants will guide the use of virtual reality as a tool for gait rehabilitation in brain-damaged patients with attentional or motor (gait) impairments.
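As a purely illustrative sketch of the common-feature-space idea, the toy example below maps simulated human and macaque activity patterns onto the same set of stimulus features using simple ridge-regression decoders. All data, variable names, and model choices here are hypothetical placeholders and do not describe the project's actual deep learning pipeline or datasets.

    # Illustrative only: toy cross-species "common feature space" analysis.
    # Simulated data and ridge-regression decoders stand in for the real
    # fMRI/neural recordings and deep learning-based models of the project.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)

    # Hypothetical stimulus features (e.g., optic-flow speed, heading, size, distance, duration)
    n_stimuli, n_features = 200, 5
    features = rng.normal(size=(n_stimuli, n_features))

    def simulate_activity(n_units):
        """Simulated neural responses: a linear mixture of stimulus features plus noise."""
        weights = rng.normal(size=(n_features, n_units))
        return features @ weights + 0.5 * rng.normal(size=(n_stimuli, n_units))

    human_fmri = simulate_activity(n_units=400)      # hypothetical voxel responses
    macaque_spikes = simulate_activity(n_units=120)  # hypothetical unit firing rates

    def decode_to_feature_space(activity):
        """Fit a ridge decoder from neural activity to the shared stimulus-feature space."""
        X_train, X_test, y_train, y_test = train_test_split(
            activity, features, test_size=0.25, random_state=0)
        model = Ridge(alpha=1.0).fit(X_train, y_train)
        return model.predict(X_test), y_test

    human_pred, human_true = decode_to_feature_space(human_fmri)
    macaque_pred, macaque_true = decode_to_feature_space(macaque_spikes)

    # Predictive power of each species' neural data, expressed in the same feature
    # space, so the two representations can be compared directly.
    print("human decoding R^2:  ", r2_score(human_true, human_pred))
    print("macaque decoding R^2:", r2_score(macaque_true, macaque_pred))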


Scientific coordinator: Prof. Patrizia Fattori (Università di Bologna)

PI of Unit: Prof. Sabrina Pitzalis (Università degli Studi di Roma "Foro Italico")