Audiovisual Integration Varies With Target and Environment Richness in Immersive Virtual Reality

in Multisensory Research

Abstract

We are continually bombarded by information arriving at each of our senses, yet the brain seems to integrate this separate information effortlessly into a unified percept. Although multisensory integration has been researched extensively using simple computer-based tasks and stimuli, much less is known about how it functions in real-world contexts. Additionally, several recent studies have demonstrated that multisensory integration varies tremendously across naturalistic stimuli. Virtual reality can be used to study multisensory integration in realistic settings because it combines realism with precise control over the environment and stimulus presentation. In the current study, we investigated whether multisensory integration, as measured by the redundant signals effect (RSE), is observable in naturalistic environments using virtual reality, and whether it differs as a function of target and/or environment cue-richness. Participants detected auditory, visual, and audiovisual targets that varied in cue-richness within three distinct virtual worlds that also varied in cue-richness. We demonstrated integrative effects in each environment-by-target pairing and further showed a modest effect of target cue-richness on multisensory integration, but only in the cue-rich environment. Our study is the first to show definitively that minimal and more naturalistic tasks elicit comparable redundant signals effects. Our results also suggest that multisensory integration may function differently depending on the features of the environment. These results have important implications for the design of the virtual multisensory environments currently used for training, educational, and entertainment purposes.
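The RSE analysis referenced in the abstract is conventionally tested against Miller's race model inequality, F_AV(t) ≤ F_A(t) + F_V(t): if the audiovisual response-time CDF exceeds the summed unisensory CDFs at any latency, a race between independent channels cannot explain the redundant-target speedup. The sketch below illustrates that computation in NumPy; the function names, time grid, and simulated RT distributions are illustrative assumptions, not the study's actual data or analysis code.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical CDF of response times evaluated on a time grid (ms)."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

def rmi_violation(rt_a, rt_v, rt_av, t_grid):
    """Miller's race model inequality: F_AV(t) <= F_A(t) + F_V(t).

    Returns F_AV(t) minus the race-model bound; positive values indicate
    a violation, i.e., facilitation beyond what a race between
    independent unisensory channels can produce.
    """
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

# Toy data: audiovisual RTs faster than either unisensory condition.
rng = np.random.default_rng(0)
rt_a = rng.normal(420, 50, 200)   # auditory-only RTs
rt_v = rng.normal(400, 50, 200)   # visual-only RTs
rt_av = rng.normal(330, 40, 200)  # audiovisual RTs

t = np.linspace(200, 600, 81)
viol = rmi_violation(rt_a, rt_v, rt_av, t)

# Positive area under the violation curve (pAUC), as in the figures:
# integrate only the positive part of the violation over the time grid.
pauc = np.sum(np.clip(viol, 0.0, None)) * (t[1] - t[0])
```

With these toy distributions the violation peaks early in the RT range (where the fast audiovisual CDF outruns the summed unisensory CDFs), and the pAUC summarizes its overall magnitude, mirroring the per-condition comparisons reported in the figures.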



Figures

  • Design of the audiovisual speeded detection task.

  • Targets and environments. Participants detected visual, auditory, and simultaneous audiovisual targets varying in cue-richness in virtual environments that also varied in their cue-richness. Auditory waveforms for targets of the corresponding cue-richness are shown below the virtual environments.

  • Response times separated by environment and target cue-richness. Audiovisual response times were significantly shorter than unisensory response times for all target-by-environment combinations. Error bars represent the SEM; * indicates significant differences compared to audiovisual trials at p < 0.001 (Bonferroni-corrected alpha level).

  • Mean response times in milliseconds separated by target, environment, and modality. Standard deviations for each condition are listed in parentheses.

  • RMI violation grouped by environment cue-richness. All target and environment cue-richness combinations resulted in significant violations of the race model, as defined by at least one time point with a positive RMI violation at the p < 0.001 level. Target cue-richness significantly modulated RMI violation in the cue-rich environment only. # indicates a significant main effect of target cue-richness at p < 0.05; * indicates a significant main effect of target cue-richness at p < 0.003 (Bonferroni-corrected alpha level).

  • Results of paired-sample t-tests between the race model CDF and the multisensory CDF.

  • Total positive area under the RMI violation curve separated by target and environment cue-richness. Target cue-richness modulated the pAUC in the cue-rich environment only. Error bars represent the SEM; * indicates p < 0.008 (Bonferroni-corrected alpha level).
