Interpreting nonverbal cues to deception in real time

Bibliographic Details
Main Author: Corley, Martin (Author)
Other Authors: Rohde, Hannah; Loy, Jia E.; King, Josiah P. J.
Document Type: Electronic Article
Language: English
Published: 2020
In: PLOS ONE
Year: 2020
Online Access: Full text (free)
Check availability: HBZ Gateway
Description
Summary: When questioning the veracity of an utterance, we perceive certain non-linguistic behaviours to indicate that a speaker is being deceptive. Recent work has highlighted that listeners' associations between speech disfluency and dishonesty are detectable at the earliest stages of reference comprehension, suggesting that the manner of spoken delivery influences pragmatic judgements concurrently with the processing of lexical information. Here, we investigate the integration of a speaker's gestures into judgements of deception, and ask if and when associations between nonverbal cues and deception emerge. Participants saw and heard a video of a potentially dishonest speaker describe treasure hidden behind an object, while also viewing images of both the named object and a distractor object. Their task was to click on the object behind which they believed the treasure to actually be hidden. Eye and mouse movements were recorded. Experiment 1 investigated listeners' associations between visual cues and deception, using a variety of static and dynamic cues. Experiment 2 focused on adaptor gestures. We show that a speaker's nonverbal behaviour can have a rapid and direct influence on listeners' pragmatic judgements, supporting the idea that communication is fundamentally multimodal.
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0229486