Spatio-temporal patterns of brain activation reflect embodied action-word semantics
Neuroscientific theories of embodied semantics make strong predictions about the brain activation patterns evoked by words. If understanding a word that refers to an object or action is grounded in perceptual or motor experience, then the corresponding visual or motor areas of the brain should be activated in the process. More specifically, different cortical motor areas should be activated depending on the effector of the action (e.g. hand, leg or mouth). We start with a selective review of functional magnetic resonance imaging (fMRI) studies that have confirmed these predictions. For example, stimuli referring to different bodily effectors (e.g. arm-, leg- or face-related words such as “pick”, “kick” or “lick”) produced somatotopic activation patterns in motor cortex. These findings have been extended to more specific categories of action-words: for example, uni-manual action-words (“throw”) produced more strongly lateralized activation in motor cortex than bi-manual action-words (“clap”).

fMRI integrates brain activity over several seconds and therefore cannot distinguish early lexico-semantic retrieval from late post-lexical processes. To address this problem, we showed that activation specific to action-words or object-words correlates negatively with word frequency, suggesting that this activation reflects retrieval of lexico-semantic information rather than mental imagery. A more direct test, however, is to use electro- and magnetoencephalography (EEG, MEG), which measure brain activation with sufficient temporal resolution to distinguish early from late processes. In an ERP study, we found that visually presented arm-, leg- and face-related words could be distinguished as early as 210-230 ms after word onset, and that the putative generators of these signals showed a somatotopic pattern.
This demonstrates that action semantics affects early brain processes, but the evidence is still correlational: reading action-words affects motor areas, but does activity in motor areas also affect action-word processing? In a transcranial magnetic stimulation (TMS) study, stimulation of hand and leg motor cortex at 150 ms facilitated the processing of arm- and leg-words, respectively, but only in the left hemisphere. In a combined EEG/MEG study, participants initiated trials by pressing a button with either a finger or a foot; words were presented while the button was still pressed, i.e. while motor cortex was active. We found effects of word-effector congruency (e.g. a finger button press followed by an arm-word is congruent, followed by a leg-word incongruent) around 150 ms, both in hand motor cortex and in posterior superior temporal gyrus, a classical perisylvian language area. This demonstrates that motor cortex activation specifically affects action-word processing at early latencies. Finally, neuropsychological studies (e.g. in patients with semantic dementia) have shown that brain impairment affects different action-word categories to different degrees.
We conclude that converging evidence from metabolic and electrophysiological neuroimaging techniques, as well as from neuropsychology, supports rapid access to embodied semantic representations of action-words during language comprehension.