Vol. 17, No. 2
Meaning in gestures: What event-related potentials reveal about processes underlying the comprehension of iconic gestures
Department of Cognitive Science, University of California, San Diego
In conversation, communication transpires through both talk and gesture. While describing a platter, a speaker might trace the outline of its shape in the air. Does understanding such iconic gestures engage processes similar to those recruited in the comprehension of pictures, photographs, and other image-based representations of the visual world? Previous research using event-related potentials (ERPs) has demonstrated that the second member of an unrelated picture pair elicits enhanced negative-going deflections of the ERP waveform around 300 ms (N300) and 400 ms (N400) post-stimulus onset, relative to responses elicited by related picture probes. To test whether the semantic analysis of gestures elicits similar effects, we recorded the electroencephalogram (EEG) from 16 healthy adults as they viewed spontaneously produced iconic co-speech gestures preceded by congruous and incongruous contexts. Gestures were presented either dynamically, as short soundless video clips, or statically, as freeze frames extracted from the gesture videos. N400-like effects were observed in response to both gesture types. Static gesture stills also elicited an N300 effect. These findings demonstrate that understanding gestures involves processes similar to those underlying the comprehension of other meaningful representations, including words and pictures.