Statistically Optimal Multisensory Cue Integration: A Practical Tutorial

in Multisensory Research

Humans combine redundant multisensory estimates into a coherent multimodal percept. Experiments in cue integration have shown, for many modality pairs and perceptual tasks, that multisensory information is fused in a statistically optimal manner: observers take the unimodal sensory reliability into consideration when performing perceptual judgments. They combine the senses according to the rules of Maximum Likelihood Estimation to maximize overall perceptual precision. This tutorial explains in an accessible manner how to design optimal cue integration experiments and how to analyse the results from these experiments to test whether humans follow the predictions of the optimal cue integration model. The tutorial is meant for novices in multisensory integration and requires very little training in formal models and psychophysical methods. For each step in the experimental design and analysis, rules of thumb and practical examples are provided. We also publish online Matlab code for an example experiment on cue integration and a Matlab toolbox for data analysis that accompany the tutorial. This way, readers can learn about the techniques by trying them out themselves. We hope to provide readers with the tools necessary to design their own experiments on optimal cue integration and enable them to take part in explaining when, why and how humans combine multisensory information optimally.
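To make the combination rule concrete, here is a minimal sketch in Matlab of reliability-weighted (maximum-likelihood) fusion of two unimodal Gaussian estimates. The numbers are illustrative and the variable names are ours; this is not the code published with the tutorial:

    % Reliability-weighted (maximum-likelihood) fusion of two unimodal estimates.
    % E_* are unimodal estimates, sigma_* their standard deviations (made-up values).
    E_V = 55;  sigma_V = 4;    % visual size estimate (e.g., mm)
    E_H = 50;  sigma_H = 2;    % haptic size estimate

    w_V = (1/sigma_V^2) / (1/sigma_V^2 + 1/sigma_H^2);   % weight of the visual cue
    w_H = 1 - w_V;                                        % weight of the haptic cue

    E_bi     = w_V*E_V + w_H*E_H;                         % combined (bimodal) estimate
    sigma_bi = sqrt((sigma_V^2 * sigma_H^2) / (sigma_V^2 + sigma_H^2));  % combined uncertainty

    fprintf('w_V = %.2f, E_bi = %.1f, sigma_bi = %.2f\n', w_V, E_bi, sigma_bi);

With these numbers the less reliable visual cue receives a weight of 0.2, the combined estimate lands closer to the haptic estimate (E_bi = 51), and sigma_bi (about 1.79) is smaller than either unimodal sigma, which is the hallmark of optimal integration that the tutorial tests for.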


References

Ackermann, J. F. and Landy, M. S. (2013). Choice of saccade endpoint under risk. J. Vis. 13(3), 27. DOI:10.1167/13.3.27.

Adams, W. J., Graf, E. W. and Ernst, M. O. (2004). Experience can change the “light-from-above” prior. Nat. Neurosci. 7, 1057–1058.

Alais, D. and Burr, D. (2004). The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol. 14, 257–262.

Bresciani, J.-P., Ernst, M. O., Drewing, K., Bouyer, G., Maury, V. and Kheddar, A. (2005). Feeling what you hear: auditory signals can modulate tactile tap perception. Exp. Brain Res. 162, 172–180.

Burge, J., Ernst, M. O. and Banks, M. S. (2008). The statistical determinants of adaptation rate in human reaching. J. Vis. 8(4), 20, 1–19. DOI:10.1167/8.4.20.

Burge, J., Girshick, A. R. and Banks, M. S. (2010). Visual–haptic adaptation is determined by relative reliability. J. Neurosci. 30, 7714–7721.

Clark, J. J. and Yuille, A. L. (1990). Data Fusion for Sensory Information Processing Systems. Kluwer Academic, Boston, MA, USA.

Diedrichsen, J. (2007). Optimal task-dependent changes of bimanual feedback control and adaptation. Curr. Biol. 17, 1675–1679.

Ernst, M. O. (2006). A Bayesian view on multimodal cue integration, in: Human Body Perception from the Inside Out, Knoblich, G. (Ed.), pp. 105–131. Oxford University Press, New York, NY, USA.

Ernst, M. O. (2007). Learning to integrate arbitrary signals from vision and touch. J. Vis. 7(5), 7, 1–14. DOI:10.1167/7.5.7.

Ernst, M. O. (2012). Optimal multisensory integration: assumptions and limits, in: The New Handbook of Multisensory Processes, Stein, B. E. (Ed.), pp. 1084–1124. MIT Press, Cambridge, MA, USA.

Ernst, M. O. and Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415(6870), 429–433.

Ernst, M. O. and Bülthoff, H. H. (2004). Merging the senses into a robust percept. Trends Cogn. Sci. 8, 162–169.

Fetsch, C. R., DeAngelis, G. C. and Angelaki, D. E. (2010). Visual–vestibular cue integration for heading perception: applications of optimal cue integration theory. Eur. J. Neurosci. 31, 1721–1729.

Gepshtein, S., Burge, J., Ernst, M. O. and Banks, M. S. (2005). The combination of vision and touch depends on spatial proximity. J. Vis. 5(11), 7, 1013–1023. DOI:10.1167/5.11.7.

Gescheider, G. (1997). Psychophysics: The Fundamentals, 3rd edn. Lawrence Erlbaum Associates, Mahwah, NJ, USA.

Hartcher-O’Brien, J., Di Luca, M. and Ernst, M. O. (2014). The duration of uncertain times: audiovisual information about intervals is integrated in a statistically optimal fashion. PLoS One 9, e89339. DOI:10.1371/journal.pone.0089339.

Helbig, H. B. and Ernst, M. O. (2007). Knowledge about a common source can promote visual–haptic integration. Perception 36, 1523–1533.

Hillis, J. M., Ernst, M. O., Banks, M. S. and Landy, M. S. (2002). Combining sensory information: mandatory fusion within, but not between, senses. Science 298(5598), 1627–1630.

Hillis, J. M., Watt, S. J., Landy, M. S. and Banks, M. S. (2004). Slant from texture and disparity cues: optimal cue combination. J. Vis. 4, 967–992.

Kleiner, M., Brainard, D., Pelli, D., Ingling, A., Murray, R. and Broussard, C. (2007). What’s new in Psychtoolbox-3? Perception 36, ECVP Abstract Supplement.

Knill, D. C. (1998). Discrimination of planar surface slant from texture: human and ideal observers compared. Vis. Res. 38, 1683–1711.

Knill, D. C. and Saunders, J. A. (2003). Do humans optimally integrate stereo and texture information for judgments of surface slant? Vis. Res. 43, 2539–2558.

Körding, K. P., Beierholm, U., Ma, W. J., Quartz, S., Tenenbaum, J. B. and Shams, L. (2007). Causal inference in multisensory perception. PLoS One 2, e943. DOI:10.1371/journal.pone.0000943.

Landy, M. S., Maloney, L. T., Johnston, E. B. and Young, M. J. (1995). Measurement and modeling of depth cue combination: in defense of weak fusion. Vis. Res. 35, 389–412.

Landy, M. S., Banks, M. S. and Knill, D. C. (2011). Ideal-observer models of cue integration, in: Sensory Cue Integration, Trommershäuser, J., Körding, K. and Landy, M. S. (Eds), pp. 5–29. Oxford University Press, New York, NY, USA.

Moscatelli, A., Mezzetti, M. and Lacquaniti, F. (2012). Modeling psychophysical data at the population-level: the generalized linear mixed model. J. Vis. 12(11), 26. DOI:10.1167/12.11.26.

Najemnik, J. and Geisler, W. S. (2005). Optimal eye movement strategies in visual search. Nature 434(7031), 387–391.

Najemnik, J. and Geisler, W. S. (2009). Simple summation rule for optimal fixation selection in visual search. Vis. Res. 49, 1286–1294.

Newell, F. N., Ernst, M. O., Tjan, B. S. and Bülthoff, H. H. (2001). Viewpoint dependence in visual and haptic object recognition. Psychol. Sci. 12, 37–42.

Oruç, I., Maloney, L. T. and Landy, M. S. (2003). Weighted linear cue combination with possibly correlated error. Vis. Res. 43, 2451–2468.

Parise, C. V., Spence, C. and Ernst, M. O. (2012). When correlation implies causation in multisensory integration. Curr. Biol. 22, 46–49.

Parise, C. V., Knorre, K. and Ernst, M. O. (2014). Natural auditory scene statistics shapes human spatial hearing. Proc. Natl Acad. Sci. USA 111, 6104–6108.

Plaisier, M. A., Van Dam, L. C. J., Glowania, C. and Ernst, M. O. (2014). Exploration mode affects visuohaptic integration of surface orientation. J. Vis. 14(13), 22, 1–12. DOI:10.1167/14.13.22.

Roach, N. W., Heron, J. and McGraw, P. V. (2006). Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration. Proc. Biol. Sci. 273(1598), 2159–2168.

Rock, I. and Victor, J. (1964). Vision and touch: an experimentally created conflict between the two senses. Science 143(3606), 594–596.

Shams, L., Kamitani, Y. and Shimojo, S. (2002). Visual illusion induced by sound. Cogn. Brain Res. 14, 147–152.

Todorov, E. and Jordan, M. I. (2002). Optimal feedback control as a theory of motor coordination. Nat. Neurosci. 5, 1226–1235.

Trommershäuser, J., Maloney, L. T. and Landy, M. S. (2003). Statistical decision theory and the selection of rapid, goal-directed movements. J. Opt. Soc. Am. A 20, 1419–1433.

Van Beers, R. J., Sittig, A. C. and Van der Gon, J. J. D. (1999). Integration of proprioceptive and visual position-information: an experimentally supported model. J. Neurophysiol. 81, 1355–1364.

Van Dam, L. C. J. and Ernst, M. O. (2013). Knowing each random error of our ways, but hardly correcting for it: an instance of optimal performance. PLoS One 8, e78757. DOI:10.1371/journal.pone.0078757.

Van Dam, L. C. J. and Rohde, M. (2015). Maximum Likelihood Multisensory Integration Toolbox (http://www.mathworks.com/matlabcentral/fileexchange/50514-maximum-likelihood-multisensory-integration-toolbox). Matlab Central File Exchange, retrieved 18.4.2015.

Van Dam, L. C. J., Parise, C. V. and Ernst, M. O. (2014). Modeling multisensory integration, in: Sensory Integration and the Unity of Consciousness, Bennett, D. J. and Hill, C. S. (Eds), pp. 209–229. MIT Press, Cambridge, MA, USA.

Weiss, Y., Simoncelli, E. P. and Adelson, E. H. (2002). Motion illusions as optimal percepts. Nat. Neurosci. 5, 598–604.

Wichmann, F. A. and Hill, N. J. (2001). The psychometric function: I. Fitting, sampling, and goodness of fit. Percept. Psychophys. 63, 1293–1313.

Yuille, A. L. and Bülthoff, H. H. (1996). Bayesian decision theory and psychophysics, in: Perception as Bayesian Inference, Knill, D. C. and Richards, W. (Eds), pp. 123–161. Cambridge University Press, Cambridge, UK.

Figures

Figure 1. Optimal integration of visual and haptic size cues. (A) Changes in bias and precision of the bimodal Likelihood Function (grey curve) depend on the relationship between uncertainty in the visual Likelihood Function (width of right black curve) and uncertainty in the haptic Likelihood Function (width of left black curve). As the visual uncertainty σV² increases (top to bottom), the centre of the bimodal Likelihood Function moves closer to the haptic estimate EH (equation (1)). A hallmark of optimal integration is that the bimodal Likelihood Function is always narrower and higher (i.e., more precise, smaller σ) than both of the unimodal Likelihood Functions (black curves), as described in equation (2). (B) Illustration of visuohaptic size estimation with a crossmodal conflict Δ: the grasped object looks larger than it feels. (C) The changes in bimodal perception (grey Gaussians in panel A) modulate human discrimination performance (grey sigmoidal discrimination curves in panel C), which can be measured in psychophysical experiments (cf. Section 3).
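The caption refers to equations (1) and (2) of the tutorial, which are not reproduced on this page. In the standard maximum-likelihood formulation (cf. Ernst and Banks, 2002; notation follows the captions, with E for estimates, w for weights and σ for uncertainties), these relations can be written as

    E_{bi} = w_V E_V + w_H E_H, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \quad w_H = 1 - w_V    \tag{1}

    \sigma_{bi}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min\!\left(\sigma_V^2, \sigma_H^2\right)    \tag{2}

so that the centre of the bimodal Likelihood Function is pulled towards the more reliable estimate (equation (1)) and its variance is never larger than that of either unimodal estimate (equation (2)).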

Figure 2. Psychometric functions. (A) JND and PSE in a psychometric function. (B) and (C) Example perceptual responses (black dots), psychometric curves (solid lines) and parameter estimates (JND, PSE) for slant discrimination based on a binocular disparity cue. The two participants LD (B) and MR (C) differ substantially in their JNDs. The results were recorded using the example experiment on slant discrimination (Van Dam and Rohde, 2015).
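For readers who want to see how a JND and PSE of this kind are obtained, the sketch below fits a cumulative Gaussian to binary responses by maximum likelihood. The data, the starting values and the convention JND = sigma of the fit (the 84% point relative to the PSE) are illustrative assumptions, not the toolbox implementation (Van Dam and Rohde, 2015):

    % Hypothetical forced-choice data: comparison values (relative to the standard)
    % and counts of 'comparison judged larger' responses out of nTot trials each.
    cmp   = [-6 -4 -2  0  2  4  6];
    nResp = [ 1  3  7 12 17 19 20];
    nTot  = 20*ones(size(cmp));

    % Cumulative Gaussian psychometric function: p(1) = PSE, p(2) = sigma.
    pf    = @(p,x) 0.5*(1 + erf((x - p(1)) ./ (sqrt(2)*p(2))));
    % Negative log-likelihood of the binary responses under the model.
    negLL = @(p) -sum(nResp.*log(pf(p,cmp) + eps) + (nTot - nResp).*log(1 - pf(p,cmp) + eps));

    pHat = fminsearch(negLL, [0 2]);   % crude starting guess for [PSE sigma]
    PSE  = pHat(1);                    % point of subjective equality
    JND  = pHat(2);                    % here: sigma of the fit (84% point re. the PSE)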

Figure 3. Example results from an experiment on optimal cue integration. Depicted are results that were collected using the example experiment Bimodal2BaseSlants.m (cf. toolbox; Van Dam and Rohde, 2015). This experiment investigates whether a linear perspective cue C1 and a binocular disparity cue C2 are integrated optimally in the visual perception of a surface slant angle (which of the two surfaces was turned more towards the right? Replication of Hillis et al., 2004). In this experiment, two levels of conflict of opposite sign (Δa = +15 and Δb = −15, with E1 and E2 each offset from the base slant by 7.5 in opposite directions) and two levels of noise (implemented as different base slants; cf. Knill and Saunders, 2003) are tested. Panels A, B and C show example results from an individual participant; panel D shows the population results. (A) Unimodal results for the perspective (black, C1) and disparity (grey, C2) cues for the two noise levels (left, right). (B) Bimodal results (solid lines, full circles) and MLE predictions (dotted lines, empty circles) for the two levels of conflict (grey, black) and the two noise levels (left, right). As the texture uncertainty σ1 is lower for the first noise level (steeper black curve in panel A, left side), the bimodal Ebi are pulled towards E1 (left panel). For the second noise level, texture cues are less precise (shallower black curve in panel A, right side) and the bimodal Ebi are pulled towards E2 (right panel). (C) Summary of the weight w1 (left) and uncertainty σ (right) results (empirical estimates and MLE model predictions) for the example participant. (D) Summary of the weight w1 (left) and uncertainty σ (right) results for the whole test population (n = 8) with 95% confidence intervals on parameter estimates and predictions. Even though the cue weighting follows the predicted optimal weights w1,PR and the bimodal uncertainty follows the predicted uncertainty σbi,PR, the results are ambiguous: perceptual uncertainty with both cues is not significantly lower than the perceptual uncertainty with the better unimodal cue (cf. Section 3.5.2).
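As a pointer to how predictions of the kind shown in panels C and D are derived, the following sketch computes the MLE-predicted weight and bimodal uncertainty from unimodal fits, and recovers an empirical weight from a bimodal PSE measured under a cue conflict. All numerical values are assumed for illustration (they are not the recorded data), and the single-conflict weight recovery is a simplification:

    % Assumed unimodal JNDs (sigma) for the perspective cue C1 and disparity cue C2.
    sigma1 = 3.0;
    sigma2 = 5.0;

    % MLE predictions from the unimodal measurements.
    w1_pred      = (1/sigma1^2) / (1/sigma1^2 + 1/sigma2^2);          % predicted weight of C1
    sigmaBi_pred = sqrt((sigma1^2*sigma2^2) / (sigma1^2 + sigma2^2)); % predicted bimodal JND

    % Empirical weight from a conflict condition: with the cues placed at E1 and E2
    % (conflict Delta = E1 - E2), the bimodal percept satisfies
    % PSE_bi = w1*E1 + (1 - w1)*E2, so w1 follows from the measured bimodal PSE.
    E1 = 7.5;  E2 = -7.5;       % assumed cue values relative to the base slant
    PSE_bi = 4.0;               % assumed measured bimodal PSE
    w1_emp = (PSE_bi - E2) / (E1 - E2);

Comparing w1_emp against w1_pred, and the measured bimodal JND against sigmaBi_pred, is the core test of optimal integration summarised in panels C and D.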

Figure 4. Illustration of all possible results for the bimodal uncertainty σbi. (A) If cue integration is optimal, σbi (light grey) is significantly lower than both unimodal uncertainties σ1 (black) and σ2 (dark grey), and does not differ significantly from the predicted optimal σbi,PR (white). Other possible results are: σbi is suboptimal (B, Section 3.5.1), ambiguous (C, Section 3.5.2), near-optimal (D, Section 3.5.3), or supraoptimal (E, Section 3.5.4).
