In a natural environment, affective information is perceived via multiple senses, mostly audition and vision. However, the impact of multisensory information on affect remains relatively unexplored. In this study, we investigated whether the combined auditory–visual presentation of aversive stimuli influences the experience of fear. We exploited the advantages of virtual reality to manipulate multisensory presentation and to display potentially fearful dog stimuli embedded in a natural context. We manipulated the affective reactions evoked by the dog stimuli by recruiting two groups of participants: dog-fearful and non-fearful. Sensitivity to dog fear was assessed psychometrically by questionnaire and, at the behavioral and subjective levels, with a Behavioral Avoidance Test (BAT). Participants navigated virtual environments in which they encountered virtual dog stimuli presented through the auditory channel, the visual channel, or both, and reported their fear using Subjective Units of Distress. We compared the fear reported for unimodal (visual or auditory) and bimodal (auditory–visual) dog stimuli. Both dog-fearful and non-fearful participants reported more fear in response to bimodal audiovisual presentation than to unimodal presentation of dog stimuli. These results suggest that fear is more intense when affective information is processed via multiple sensory pathways, possibly owing to cross-modal potentiation. Our findings have implications for virtual reality-based therapy of phobias: therapies could be refined and improved by manipulating the multisensory presentation of feared situations.