Feature Integration across Multimodal Perception and Action: A Review

in Multisensory Research

The human brain faces a continuous stream of stimulus information that is delivered by multiple modalities and sensory channels and processed in distinct cortical regions. We discuss recent empirical and theoretical developments addressing the question of how this distributed information is integrated into coherent representations (the so-called binding problem), with an emphasis on the principles and constraints underlying the integration of multiple (rather than redundant) features across different sensory modalities and across perception and action planning.


References

Barlow, H. B. (1972). Single units and sensation: a neuron doctrine for perceptual psychology? Perception 1, 371–394.

Bertelson, P., Vroomen, J., de Gelder, B., Driver, J. (2000). The ventriloquist effect does not depend on the direction of deliberate visual attention. Percept. Psychophys. 62, 321–332.

Cinel, C., Humphreys, G. W., Poli, R. (2002). Cross-modal illusory conjunctions between vision and touch. J. Exper. Psychol.: Human Percept. Perform. 28, 1243–1266.

Colzato, L. S., Hommel, B. (2008). Cannabis, cocaine, and visuomotor integration: evidence for a role of dopamine D1 receptors in binding perception and action. Neuropsychologia 46, 1570–1575.

Colzato, L. S., Raffone, A., Hommel, B. (2006a). What do we learn from binding features? Evidence for multilevel feature integration. J. Exper. Psychol.: Human Percept. Perform. 32, 705–716.

Colzato, L. S., van Wouwe, N. C., Lavender, T. J., Hommel, B. (2006b). Intelligence and cognitive flexibility: fluid intelligence correlates with feature ‘unbinding’ across perception and action. Psychonom. Bull. Rev. 13, 1043–1048.

Colzato, L. S., van Wouwe, N. C., Hommel, B. (2007). Feature binding and affect: emotional modulation of visuo-motor integration. Neuropsychologia 45, 440–446.

Corbett, B. A., Constantine, L. J., Hendren, R., Rocke, D., Ozonoff, S. (2009). Examining executive functioning in children with autism spectrum disorder, attention deficit hyperactivity disorder and typical development. Psychiatry Res. 166, 210–222.

deCharms, R. C., Merzenich, M. M. (1996). Primary cortical representation of sounds by the coordination of action-potential timing. Nature 381, 610–613.

Duncan, J., Seitz, R. J., Kolodny, J., Bor, D., Herzog, H., Ahmed, A., Newell, F. N., Emslie, H. (2000). A neural basis for general intelligence. Science 289, 457–460.

Dutzi, I. B., Hommel, B. (2009). The microgenesis of action-effect binding. Psycholog. Res. 73, 425–435.

Engel, A. K., Singer, W. (2001). Temporal binding and the neural correlates of sensory awareness. Trends Cognit. Sci. 5, 16–25.

Engel, A. K., Konig, P., Singer, W. (1991). Direct physiological evidence for scene segmentation by temporal coding. Proc. Nat. Acad. Sci. USA 88, 9136–9140.

Ernst, M. O., Bülthoff, H. H. (2004). Merging the senses into a robust percept. Trends Cognit. Sci. 8, 162–169.

Evans, K. K., Treisman, A. (2010). Natural cross-modal mappings between visual and auditory features. J. Vision 10, 1–12.

Found, A., Müller, H. J. (1996). Searching for unknown feature targets on more than one dimension: investigating a ‘dimension weighting’ account. Percept. Psychophys. 58, 88–101.

Gao, T., Scholl, B. J. (2010). Are objects required for object files? Roles of segmentation and spatiotemporal continuity in computing object persistence. Visual Cognition 18, 82–109.

Hall, M. D., Pastore, R. E., Acker, B. E., Huang, W. (2000). Evidence for auditory feature integration with spatially distributed items. Percept. Psychophys. 62, 1243–1257.

Hill, E. L. (2004). Evaluating the theory of executive dysfunction in autism. Developmental Review 24, 189–233.

Hommel, B. (1998). Event files: evidence for automatic integration of stimulus–response episodes. Visual Cognition 5, 183–216.

Hommel, B. (2004). Event files: feature binding in and across perception and action. Trends Cognit. Sci. 8, 494–500.

Hommel, B. (2005). How much attention does an event file need? J. Exper. Psychol.: Human Percept. Perform. 31, 1067–1082.

Hommel, B. (2007). Feature integration across perception and action: event files affect response choice. Psycholog. Res. 71, 42–63.

Hommel, B. (2009). Action control according to TEC (theory of event coding). Psycholog. Res. 73, 512–526.

Hommel, B., Colzato, L. S. (2004). Visual attention and the temporal dynamics of feature integration. Visual Cognition 11, 483–521.

Hommel, B., Colzato, L. S. (2009). When an object is more than a binding of its features: evidence for two mechanisms of visual feature integration. Visual Cognition 17, 120–140.

Hommel, B., Müsseler, J., Aschersleben, G., Prinz, W. (2001). The Theory of Event Coding (TEC): a framework for perception and action planning. Behav. Brain Sci. 24, 849–937.

Hommel, B., Kray, J., Lindenberger, U. (2011). Feature integration across the lifespan: stickier stimulus–response bindings in children and older adults. Frontiers Psychol. 2, 268.

Hötting, K., Röder, B. (2004). Hearing cheats touch, but less in congenitally blind than in sighted individuals. Psycholog. Sci. 15, 60–64.

Jeannerod, M. (1997). The Cognitive Neuroscience of Action. Blackwell Publishers, Oxford, UK.

Joliot, M., Ribary, U., Llinás, R. (1994). Human oscillatory brain activity near 40 Hz coexists with cognitive temporal binding. Proc. Nat. Acad. Sci. USA 91, 11748–11751.

Jordan, K. E., Clark, K., Mitroff, S. R. (2010). See an object, hear an object file: object correspondence transcends sensory modality. Visual Cognition 18, 492–503.

Kahneman, D., Treisman, A., Gibbs, B. J. (1992). The reviewing of object files: object-specific integration of information. Cognit. Psychol. 24, 175–219.

Keele, S. W., Cohen, A., Ivry, R. (1990). Motor programs: concepts and issues, in: Attention and Performance: Motor Representation and Control, Jeannerod, M. (Ed.), Vol. 13, pp. 77–110. Erlbaum, Hillsdale, NJ, USA.

Keizer, A. W., Nieuwenhuis, S., Colzato, L. S., Theeuwisse, W., Rombouts, S. A. R. B., Hommel, B. (2008). When moving faces activate the house area: an fMRI study of object file retrieval. Behav. Brain Functions 4, 50.

Keizer, A. W., Verment, R., Hommel, B. (2010). Enhancing cognitive control through neurofeedback: a role of gamma-band activity in managing episodic retrieval. Neuroimage 49, 3404–3413.

Kühn, S., Keizer, A., Colzato, L. S., Rombouts, S. A. R. B., Hommel, B. (2011). The neural underpinnings of event-file management: evidence for stimulus-induced activation of, and competition among stimulus-response bindings. J. Cognit. Neurosci. 23, 896–904.

Kunde, W., Kiesel, A. (2006). See what you’ve done! Active touch affects the number of perceived visual objects. Psychonom. Bull. Rev. 13, 304–309.

Lee, C. C., Winer, J. A. (2005). Principles governing auditory cortex connections. Cerebral Cortex 15, 1804–1814.

Lewald, J., Ehrenstein, W. H., Guski, R. (2001). Spatio-temporal constraints for auditory–visual integration. Behav. Brain Res. 121, 69–79.

Lewkowicz, D. J. (1996). Perception of auditory–visual temporal synchrony in human infants. J. Exper. Psychol.: Human Percept. Perform. 22, 1094–1106.

McGinnis, E. M., Keil, A. (2011). Selective processing of multiple features in the human brain: effects of feature type and salience. PLoS ONE 6, e16824. DOI:10.1371/journal.pone.0016824.

McGurk, H., MacDonald, J. (1976). Hearing lips and seeing voices. Nature 264, 746–748.

Mitroff, S. R., Alvarez, G. A. (2007). Space and time, not surface features, guide object persistence. Psychonom. Bull. Rev. 14, 1199–1204.

Memelink, J., Hommel, B. (in press). Intentional weighting: a basic principle in cognitive control. Psycholog. Res.; published online in 2012: DOI: 10.1007/s00426-012-0435-y.

Mondor, T. A., Hurlburt, J., Thorne, L. (2003). Categorizing sounds by pitch: effects of stimulus similarity and response repetition. Percept. Psychophys. 65, 107–114.

Murthy, V. N., Fetz, E. E. (1992). Coherent 25- to 35-Hz oscillations in the sensorimotor cortex of awake behaving monkeys. Proc. Nat. Acad. Sci. USA 89, 5670–5674.

Murthy, V. N., Fetz, E. E. (1996). Oscillatory activity in sensorimotor cortex of awake monkeys: synchronization of local field potentials and relation to behavior. J. Neurophysiol. 76, 3949–3967.

Nicolelis, M. A., Baccala, L. A., Lin, R. C., Chapin, J. K. (1995). Sensorimotor encoding by synchronous neural ensemble activity at multiple levels of the somatosensory system. Science 268, 1353–1358.

Noë, A. (2004). Action in Perception. MIT Press, Cambridge, MA, USA.

Noles, N. S., Scholl, B. J., Mitroff, S. R. (2005). The persistence of object file representations. Percept. Psychophys. 67, 324–334.

Raffone, A., Wolters, G. (2001). A cortical mechanism for binding in visual working memory. J. Cognit. Neurosci. 13, 766–785.

Roelfsema, P. R., Engel, A. K., Konig, P., Singer, W. (1997). Visuomotor integration is associated with zero time-lag synchronization among cortical areas. Nature 385, 157–161.

Shams, L., Kamitani, Y., Shimojo, S. (2000). What you see is what you hear. Nature 408, 788.

Singer, W. (1994). A new job for the thalamus. Nature 369, 444–445.

Spapé, M., Hommel, B. (2010). Actions travel with their objects: evidence for dynamic event files. Psycholog. Res. 74, 50–58.

Takegata, R., Brattico, E., Tervaniemi, M., Varyagina, O., Näätänen, R., Winkler, I. (2005). Preattentive representation of feature conjunctions for concurrent spatially distributed auditory objects. Cognit. Brain Res. 25, 169–179.

Tallon-Baudry, C., Bertrand, O. (1999). Oscillatory gamma activity in humans and its role in object representation. Trends Cognit. Sci. 3, 151–162.

Treisman, A. (1996). The binding problem. Curr. Opinion Neurobiol. 6, 171–178.

Treisman, A. M., Gelade, G. (1980). A feature-integration theory of attention. Cognit. Psychol. 12, 97–136.

van Dam, W. O., Hommel, B. (2010). How object-specific are object files? Evidence for integration by location. J. Exper. Psychol.: Human Percept. Perform. 36, 1184–1192.

VanRullen, R. (2009). Binding hardwired versus on-demand feature conjunctions. Visual Cognition 17, 103–119.

van Steenbergen, H., Band, G. P. H., Hommel, B. (2009). Reward counteracts conflict adaptation: evidence for a role of affect in executive control. Psycholog. Sci. 20, 1473–1477.

van Wassenhove, V., Grant, K. W., Poeppel, D. (2007). Temporal window of integration in bimodal speech. Neuropsychologia 45, 598–607.

Violentyev, A., Shimojo, S., Shams, L. (2005). Touch-induced visual illusion. NeuroReport 16, 1107–1110.

von der Malsburg, C. (1999). The what and why of binding: the modeler’s perspective. Neuron 24, 95–104.

von Stein, A., Rappelsberger, P., Sarnthein, J., Petsche, H. (1999). Synchronization between temporal and parietal cortex during multimodal object processing in man. Cerebral Cortex 9, 137–150.

Vroomen, J., Bertelson, P., de Gelder, B. (2001). The ventriloquist effect does not depend on the direction of automatic visual attention. Percept. Psychophys. 63, 651–659.

Zeki, S. (1993). A Vision of the Brain. Blackwell Scientific, Oxford, UK.

Zmigrod, S., Hommel, B. (2009). Auditory event files: integrating auditory perception and action planning. Attention Percept. Psychophys. 71, 352–362.

Zmigrod, S., Hommel, B. (2010). Temporal dynamics of unimodal and multimodal feature binding. Attention Percept. Psychophys. 72, 142–152.

Zmigrod, S., Hommel, B. (2011). The relationship between feature binding and consciousness: evidence from asynchronous multi-modal stimuli. Conscious. Cognit. 20, 586–593.

Zmigrod, S., Spapé, M., Hommel, B. (2009). Intermodal event files: integrating features across vision, audition, taction, and action. Psycholog. Res. 73, 674–684.

Zmigrod, S., de Sonneville, L. M., Colzato, L. S., Swaab, H., Hommel, B. (in press). Cognitive control of feature bindings: evidence from children with autistic spectrum disorder. Psycholog. Res.; published online in 2011: DOI: 10.1007/s00426-011-0399-3.

Figures

Schematic illustration of the three basic paradigms. (A) Object-reviewing paradigm (upper row): the first stimulus S1 is a combination of two features, identity and location. The second stimulus S2 is either a complete repetition of S1 with regard to identity and location, a complete alternation of S1, or a partial repetition of either the identity or the location. S2 signals the response, a speeded left or right response according to the task (either to the identity or to the location). (B) Event-files paradigm: a visual response cue signals a left or right response (R1) that should be delayed until presentation of the first stimulus S1, which is again a combination of identity and location features (S1 serves as the detection signal for R1). The second stimulus S2, also a combination of identity and location features (a complete repetition or alternation of S1’s features, or a partial repetition), appears after the response to S1. S2 signals R2, a speeded left or right response according to the task (either to the identity or to the location). Because R1 is independent of the features of S1, this design allows response repetition (R1 being the same as, or different from, R2) to be varied independently of stimulus-feature repetition. (C) Intermodal event-files paradigm: similar to (B), but the stimuli are combinations of auditory and visual features (such as pitch and color) that can be presented either synchronously or asynchronously. This figure is published in colour in the online version.
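To make the logic of the event-files design (panel B) concrete, here is a minimal, purely illustrative Python sketch: the feature values, task-to-key mapping, and function names are hypothetical and not taken from the reviewed studies. It only shows how stimulus-feature repetition (identity, location) can be crossed orthogonally with response repetition when R1 is precued independently of S1.

```python
import random

# Hypothetical feature values and response mapping; the actual stimuli and
# responses differ across the reviewed studies.
IDENTITIES = ["X", "O"]
LOCATIONS = ["top", "bottom"]
RESPONSES = ["left", "right"]

def make_trial(task="identity"):
    """Build one illustrative trial of the event-files design (panel B).

    R1 is precued and independent of S1's features; R2 is a speeded response
    to the task-relevant feature of S2 (mapped to keys arbitrarily here).
    """
    s1 = {"identity": random.choice(IDENTITIES), "location": random.choice(LOCATIONS)}
    s2 = {"identity": random.choice(IDENTITIES), "location": random.choice(LOCATIONS)}
    r1 = random.choice(RESPONSES)
    feature_values = IDENTITIES if task == "identity" else LOCATIONS
    mapping = dict(zip(feature_values, RESPONSES))
    r2 = mapping[s2[task]]
    return s1, s2, r1, r2

def classify(s1, s2, r1, r2):
    """Label each feature as repeated or alternated between the S1/R1 and S2/R2 episodes."""
    return {
        "identity": "repeated" if s1["identity"] == s2["identity"] else "alternated",
        "location": "repeated" if s1["location"] == s2["location"] else "alternated",
        "response": "repeated" if r1 == r2 else "alternated",
    }

if __name__ == "__main__":
    trial = make_trial(task="identity")
    print(*trial, classify(*trial))
```

Because R1 is drawn independently of S1, repeated sampling fills all combinations of repeated/alternated identity, location, and response, which is what allows partial-repetition effects to be measured.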

(A) Partial repetition costs (see Note 1) of unimodal (loudness–pitch) and multimodal (color–pitch) feature integration as a function of the response–stimulus interval (RSI). (B) Partial repetition costs of feature integration across perception and action (loudness–response and color–response) (Zmigrod and Hommel, 2010).
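Partial repetition costs of the kind plotted here are commonly quantified in this literature as the reaction-time disadvantage of trials in which only one member of a feature pair repeats, relative to trials in which both members repeat or both alternate. Since Note 1 is not reproduced on this page, the exact definition used in the figure may differ; the snippet below is a minimal sketch under that common assumption, using fabricated condition means.

```python
# Illustrative sketch, not the authors' analysis code: one common way to quantify
# the partial repetition cost for a single feature pair (e.g., pitch and response).
# Keys are (first-feature, second-feature) repetition labels; values are mean RTs in ms.

def partial_repetition_cost(mean_rt):
    """Mean RT when exactly one feature repeats minus mean RT when both repeat or both alternate."""
    partial = (mean_rt[("repeated", "alternated")] + mean_rt[("alternated", "repeated")]) / 2
    complete = (mean_rt[("repeated", "repeated")] + mean_rt[("alternated", "alternated")]) / 2
    return partial - complete

# Fabricated example means (ms), for illustration only:
example = {
    ("repeated", "repeated"): 520.0,
    ("repeated", "alternated"): 560.0,
    ("alternated", "repeated"): 555.0,
    ("alternated", "alternated"): 525.0,
}
print(partial_repetition_cost(example))  # -> 35.0 ms in this made-up example
```

Computed this way, the cost is simply the interaction contrast between the two repetition factors; a positive value indicates that breaking up a previously integrated feature pair slows responding.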
