When presenting information in vehicle cockpits, it is essential to convey an appropriate level of urgency to drivers. Perceived urgency has been investigated over the years within individual modalities, particularly audition and vision, but the interaction of perceived urgency across modalities has rarely been examined. To expand insight into the design of information presentation, we investigated the audio–visual interaction of perceived urgency using a priming task that involved speeded visual-target discrimination. Sixty auditory stimuli were created with a synthesizer, and 13 colored squares were used as visual stimuli. Based on a subjective evaluation test using a seven-point scale, three auditory stimuli with high, medium, and low perceived urgency and two visual stimuli with high and low perceived urgency were selected. A priming task was then conducted to examine the cross-modal interaction of perceived urgency: auditory stimuli were presented as primes, and participants were asked to discriminate the visual target as quickly as possible. The results revealed that the auditory stimuli with high and low perceived urgency facilitated responses to the visual stimulus of matching perceived urgency relative to the mismatching one. The auditory stimulus with medium perceived urgency also facilitated responses to the visual stimulus with high perceived urgency relative to the one with low perceived urgency. The present study shows that cross-modal correspondences can be observed when the stimuli are selected based on their subjective perceived urgency.
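To make the priming procedure concrete, the sketch below shows how one block of such a task could be structured. It is not the authors' implementation: the toolchain (PsychoPy), stimulus files, colors, response keys, and timing values are all assumptions for illustration only.

```python
# Hypothetical sketch of one block of an audio-visual priming task:
# an auditory prime plays, a colored square follows, and the speeded
# discrimination response and reaction time are recorded.
import random
from psychopy import core, event, sound, visual

win = visual.Window(size=(1024, 768), color="grey", units="pix")
clock = core.Clock()

# Placeholder primes (high/medium/low perceived urgency) and targets.
primes = {
    "high": sound.Sound("prime_high.wav"),
    "medium": sound.Sound("prime_medium.wav"),
    "low": sound.Sound("prime_low.wav"),
}
targets = {
    "high": visual.Rect(win, width=200, height=200, fillColor="red"),
    "low": visual.Rect(win, width=200, height=200, fillColor="green"),
}
keymap = {"high": "f", "low": "j"}  # assumed response mapping

# Fully crossed prime x target design, repeated and shuffled.
trials = [(p, t) for p in primes for t in targets] * 10
random.shuffle(trials)

results = []
for prime_level, target_level in trials:
    primes[prime_level].play()          # auditory prime (nonblocking)
    core.wait(0.3)                      # assumed prime-target SOA
    targets[target_level].draw()
    win.flip()                          # target onset
    clock.reset()
    key, rt = event.waitKeys(keyList=list(keymap.values()),
                             timeStamped=clock)[0]
    win.flip()                          # clear the screen
    results.append({
        "prime": prime_level,
        "target": target_level,
        "rt": rt,
        "correct": key == keymap[target_level],
    })
    core.wait(1.0)                      # assumed inter-trial interval

win.close()
```

Under this kind of design, congruency effects are read off by comparing mean correct reaction times for matching versus mismatching prime-target pairs.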