Language and gesture neural correlates: A meta-analysis of functional magnetic resonance imaging studies

Salvalaggio S.
2023

Abstract

Background
Humans often use co-speech gestures to promote effective communication, and increasing attention has been paid to the cortical areas engaged in processing these gestures.

Aims
To investigate the neural network underpinning the processing of co-speech gestures and to determine whether there is a relationship between the areas involved in language processing and those involved in gesture processing.

Methods & Procedures
We planned to include studies with neurotypical and/or stroke participants who underwent both a bimodal task (i.e., processing co-speech gestures together with the accompanying speech) and a unimodal task (i.e., speech or gesture alone) during a functional magnetic resonance imaging (fMRI) session. After a database search, abstract and full-text screening were conducted. Qualitative and quantitative data were extracted, and a meta-analysis was performed with the software GingerALE 3.0.2, contrasting uni- and bimodal tasks.

Main Contribution
The database search produced 1024 records. After the screening process, 27 studies were included in the review; data from 15 of these were quantitatively analysed through meta-analysis. The meta-analysis identified three clusters of significant activation: the left middle frontal gyrus and inferior frontal gyrus, and the bilateral middle occipital gyrus and inferior temporal gyrus.

Conclusions
The semantic processing of auditory and visual information during communication is closely linked at the neural level. These findings support integrating co-speech gestures into aphasia treatment as a strategy to foster effective communication for people with aphasia.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3501753