Frequency-specific directed interactions between whole-brain regions during sentence processing using multimodal stimulus

  • Changfu Pei
  • Xunan Huang
  • Yuan Qiu
  • Yueheng Peng
  • Shan Gao
  • Bharat Biswal
  • Dezhong Yao
  • Qiang Liu
  • Fali Li
  • Peng Xu

Research output: Contribution to journal › Article › peer-review

Abstract

Neural oscillations subserve a broad range of speech processing and language comprehension functions. Using electroencephalography (EEG), we investigated the frequency-specific directed interactions between whole-brain regions while participants processed Chinese sentences presented in different stimulus modalities (i.e., auditory, visual, and audio-visual). The results indicate that low-frequency responses correspond to the aggregation of information flow into the primary sensory cortices of the different modalities. Information flow dominated by high-frequency responses exhibited characteristics of bottom-up flow from the left posterior temporal to the left frontal regions. The network pattern of top-down information flowing out of the left frontal lobe was supported jointly by low- and high-frequency rhythms. Overall, our results suggest that the brain may be modality-independent when processing higher-order language information.
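The directed interactions described in the abstract are estimated with Granger causality (see the Keywords below). As a hedged illustration only, and not the authors' pipeline, the following sketch computes pairwise time-domain Granger causality on synthetic signals: signal `x` is said to Granger-cause `y` if adding `x`'s past to an autoregressive model of `y` reduces the residual variance. All names (`granger_causality`, the AR order, the synthetic coupling coefficients) are assumptions for the example; the paper itself works with frequency-specific measures on multichannel EEG.

```python
import numpy as np

def granger_causality(x, y, order=2):
    """Granger causality from x to y: log ratio of the residual variance
    of the restricted AR model (y's own past only) to that of the full
    model (y's past plus x's past). Larger values mean stronger x -> y
    directed influence. Simplified, time-domain, pairwise illustration."""
    n = len(y)
    Y = y[order:]
    # Lagged regressors: columns are y[t-1..t-order] and x[t-1..t-order]
    own = np.column_stack([y[order - k:n - k] for k in range(1, order + 1)])
    cross = np.column_stack([x[order - k:n - k] for k in range(1, order + 1)])
    # Restricted model: predict y from its own past
    beta_r, *_ = np.linalg.lstsq(own, Y, rcond=None)
    var_r = np.var(Y - own @ beta_r)
    # Full model: predict y from its own past and x's past
    full = np.hstack([own, cross])
    beta_f, *_ = np.linalg.lstsq(full, Y, rcond=None)
    var_f = np.var(Y - full @ beta_f)
    return np.log(var_r / var_f)

# Synthetic data with a known direction of influence: x drives y
rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

gc_xy = granger_causality(x, y)  # large: x's past helps predict y
gc_yx = granger_causality(y, x)  # near zero: y's past does not help predict x
print(gc_xy, gc_yx)
```

A frequency-specific variant, as used in the paper, would first decompose the signals into bands (e.g., via bandpass filtering or spectral factorization) before estimating directed influence per band.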

Original language: English (US)
Article number: 137409
Journal: Neuroscience Letters
Volume: 812
DOIs
State: Published - Aug 24 2023

All Science Journal Classification (ASJC) codes

  • General Neuroscience

Keywords

  • Audio-visual integration
  • Brain network
  • EEG
  • Granger causality
  • Neural oscillations

