Human information processing is limited by attentional resources. Two questions discussed in multisensory research are (1) whether there are separate spatial attentional resources for each sensory modality and (2) whether multisensory integration is influenced by attentional load. We investigated these questions using a dual task paradigm: Participants performed two spatial tasks (a multiple object tracking [‘MOT’] task and a localization [‘LOC’] task) either separately (single task condition) or simultaneously (dual task condition). In the MOT task, participants visually tracked a small subset of several randomly moving objects. In the LOC task, participants received either visual, tactile, or redundant visual and tactile location cues. In the dual task condition, we found a substantial decrease in participants’ performance and an increase in their mental effort (indicated by an increase in pupil size) relative to the single task condition. Importantly, participants performed equally well in the dual task condition regardless of whether they received visual, tactile, or redundant multisensory (visual and tactile) location cues in the LOC task. This result suggests that receiving spatial information from different modalities does not facilitate performance, thereby indicating shared spatial attentional resources for the tactile and visual modalities. We also found that participants integrated redundant multisensory information optimally even when they experienced additional attentional load in the dual task condition. Overall, our findings suggest that (1) spatial attentional resources for the tactile and visual modalities overlap and that (2) the integration of spatial cues from these two modalities occurs at an early pre-attentive processing stage.
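The claim that participants "integrated redundant multisensory information optimally" refers to the standard maximum-likelihood (inverse-variance-weighted) model of cue combination (Ernst & Bülthoff, 2004). A minimal sketch of that prediction, assuming independent Gaussian noise on each cue — the numeric standard deviations below are illustrative, not values from the study:

```python
import math

def mle_combine(mu_v, sigma_v, mu_t, sigma_t):
    """Maximum-likelihood fusion of a visual and a tactile location estimate.

    Each cue contributes in proportion to its reliability (1/variance);
    the fused estimate is never less reliable than the better single cue.
    """
    w_v = sigma_t**2 / (sigma_v**2 + sigma_t**2)  # weight on the visual cue
    w_t = 1.0 - w_v                               # weight on the tactile cue
    mu = w_v * mu_v + w_t * mu_t
    # Predicted standard deviation of the combined estimate
    sigma = math.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))
    return mu, sigma

# Illustrative example: vision twice as reliable as touch
mu, sigma = mle_combine(mu_v=0.0, sigma_v=1.0, mu_t=0.5, sigma_t=2.0)
```

Under this model, "optimal integration under load" means the bimodal localization variance still matches the predicted `sigma`, which is smaller than the variance of either unimodal cue alone.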