Gestures Help Listeners Predict Speech

Summary: Humans use hand gestures during speech, but until now it was not clear whether listeners actively use these movements to predict upcoming speech. A new study using virtual avatars shows that listeners do use iconic gestures, such as miming typing, to predict what the speaker will say next.

EEG recordings revealed that gestures triggered brain activity associated with anticipation and eased language processing once the word was spoken. The results suggest that meaningful gestures, even from avatars, improve communication and should be integrated into human-like artificial agents.

Key Facts:

  • Gesture Predictive Power: Listeners predicted upcoming words more often when avatars produced related gestures.
  • Brain Response Evidence: EEG data showed that gestures affected anticipation and reduced processing difficulty (a reduced N400 effect).
  • Avatar Communication: Even gestures produced by a virtual avatar facilitated language processing, suggesting that artificial agents should gesture as well as speak.

Source: Max Planck Institute

In face-to-face conversation, speakers use hand gestures to signal meaning. But do listeners actually use these gestures to predict what someone might say next?

In a study using virtual avatars, researchers from the Max Planck Institute for Psycholinguistics and Radboud University in Nijmegen demonstrate that listeners use the avatar’s gestures to predict upcoming speech.

As expected, participants predicted the target word (for instance, “type”) more often when they had seen the corresponding gesture. Credit: Neuroscience News

Both behavioural and EEG data indicated that hand gestures facilitate language processing, illustrating the multimodal nature of human communication.

People might wiggle their fingers when they talk about typing, depicting a ‘typing’ movement. Seeing meaningful hand movements, also called iconic gestures, helps listeners process spoken language.

“We already know that questions produced with iconic gestures get faster responses in conversation,” says first author Marlijn ter Bekke.

Hand movements might speed up language processing because they help to predict what is coming up.

“Gestures typically start before their corresponding speech (such as the word ‘typing’), so they already show some information about what the speaker might say next,” explains Ter Bekke.

To investigate whether listeners use hand gestures to predict upcoming speech, the researchers ran two experiments using virtual avatars.

“We used virtual avatars because we can control precisely what they say and how they move, which is good for drawing conclusions from experiments. At the same time, they look natural.”

Predicting the target word

In the first experiment, participants listened to questions asked by the avatar, such as “How old were you when you learned to … type?”, with a pause before the target word (“type”).

The avatar either made a typing gesture, a meaningless control movement (such as an arm scratch), or no movement. Participants heard the question up to the target word and were asked to guess how it would continue.

As expected, participants predicted the target word (for instance, “type”) more often when they had seen the corresponding gesture.

Brain waves

In the second experiment, a different set of participants simply listened to the questions played in full. Their brain activity was recorded with electroencephalography (EEG).

During the silent pause before the target word, gestures affected brain waves that are typically associated with anticipation. After the target word, gestures affected brain responses that indicate how difficult it is to understand a word (a reduced N400 effect). After seeing gestures, people found it easier to process the meaning of the upcoming word.

“These results show that even when participants are just listening, they use gestures to predict what someone might say next,” concludes Ter Bekke.

Robots and virtual avatars

“Our study shows that even gestures produced by a virtual avatar facilitate language processing. If we want artificial agents (like robots or virtual avatars) to be readily understood, and in a human-like way, they should not only communicate with speech, but also with meaningful hand gestures.”

About this language and communication research news

Author: Anniek Corporaal
Source: Max Planck Institute
Contact: Anniek Corporaal – Max Planck Institute
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Co-Speech Hand Gestures Are Used to Predict Upcoming Meaning” by Marlijn ter Bekke et al. Psychological Science


Abstract

Co-Speech Hand Gestures Are Used to Predict Upcoming Meaning

In face-to-face conversation, people use speech and gesture to convey meaning. Seeing gestures alongside speech facilitates comprehenders’ language processing, but crucially, the mechanisms underlying this facilitation remain unclear.

We investigated whether comprehenders use the semantic information in gestures, typically preceding related speech, to predict upcoming meaning.

Dutch adults listened to questions asked by a virtual avatar.

Questions were accompanied by an iconic gesture (e.g., typing) or meaningless control movement (e.g., arm scratch), followed by a short pause and target word (e.g., “type”). A Cloze experiment showed that gestures improved explicit predictions of upcoming target words.

Moreover, an EEG experiment showed that gestures reduced alpha and beta power during the pause, indicating anticipation, and reduced N400 amplitudes, demonstrating facilitated semantic processing. Thus, comprehenders use iconic gestures to predict upcoming meaning.

Theories of linguistic prediction should incorporate communicative bodily signals as predictive cues to capture how language is processed in face-to-face interaction.
