PLoS By Category | Recent PLoS Articles



Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces
Published: Monday, January 30, 2012
Authors: Simon Rigoulot, Marc D. Pell

Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior toward facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") spoken in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 ms of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of the faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (an emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
