Introduction: In multistable perception, the brain alternates between several perceptual explanations of ambiguous sensory signals. Recent studies have demonstrated crossmodal interactions between ambiguous and unambiguous signals. However, it is currently unknown whether multiple bistable processes can interact across the senses (Conrad et al.; Pressnitzer and Hupé). Using the apparent motion quartet in vision and touch, this study investigated whether the bistable perceptual processes for vision and touch are independent or influence each other when strong congruency cues are provided to facilitate visuotactile integration (Conrad et al., in press).

Methods: When two visual flashes and/or tactile vibration pulses are presented alternately along the two diagonals of a rectangle, subjects' percept vacillates between vertical and horizontal apparent motion in the visual and/or tactile modality (Carter et al.). Observers were presented with unisensory (visual or tactile), visuotactile spatially congruent, and visuotactile spatially incongruent apparent motion quartets and reported their visual or tactile percepts.
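For illustration, a minimal sketch of the quartet's stimulus schedule is given below; the corner coordinates and timing values are assumptions chosen for readability, not the parameters used in the reported experiments.

# Illustrative sketch of an apparent motion quartet schedule (assumed geometry and
# timing; not the values used in the study).
from itertools import cycle

WIDTH, HEIGHT = 4.0, 3.0        # rectangle dimensions (arbitrary units, assumed)
FRAME_MS, BLANK_MS = 150, 100   # stimulus and inter-stimulus durations (assumed)

# Each frame stimulates one diagonal pair of corners of the rectangle.
DIAGONAL_A = [(0.0, 0.0), (WIDTH, HEIGHT)]   # lower-left and upper-right
DIAGONAL_B = [(WIDTH, 0.0), (0.0, HEIGHT)]   # lower-right and upper-left

def quartet_schedule(n_cycles):
    """Yield (onset_ms, positions) pairs alternating between the two diagonals.

    Flashes or vibration pulses presented at these alternating positions are
    ambiguous: the sequence is equally consistent with horizontal or vertical
    apparent motion, which is what makes the percept bistable.
    """
    t = 0
    diagonals = cycle([DIAGONAL_A, DIAGONAL_B])
    for _ in range(2 * n_cycles):
        yield t, next(diagonals)
        t += FRAME_MS + BLANK_MS

for onset, corners in quartet_schedule(n_cycles=2):
    print(f"{onset:4d} ms: stimulate corners {corners}")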

Results: Congruent stimulation induced pronounced visuotactile interactions, as indicated by increased dominance times and %-bias for the percept that was already dominant under unisensory stimulation. Yet the temporal dynamics did not converge under congruent stimulation; they also depended on subjects' attentional focus and were generally slower for tactile than for visual reports.

Conclusion: Our results support Bayesian approaches to perceptual inference, in which the probability of a perceptual interpretation is determined by combining a modality-specific prior with incoming visual and/or tactile evidence. Under congruent stimulation, joint evidence from both senses decelerates the rivalry dynamics by stabilizing the more likely perceptual interpretation. Importantly, this perceptual stabilization was specific to spatiotemporally congruent visuotactile stimulation, indicating multisensory rather than cognitive bias mechanisms.
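To make this Bayesian account concrete, a schematic formulation is sketched below; it assumes conditionally independent visual and tactile evidence given the interpretation and is not the specific model fitted in the study.

\[
  P(I \mid E_V, E_T) \;\propto\; P(E_V \mid I)\, P(E_T \mid I)\, P_m(I)
\]

Here I \in \{horizontal, vertical\} denotes the perceptual interpretation, E_V and E_T the visual and tactile evidence, and P_m(I) a modality-specific prior. Under spatiotemporally congruent stimulation both likelihood terms favor the same interpretation, sharpening the posterior for the currently dominant percept and thereby prolonging its dominance, consistent with the slowed rivalry dynamics described above.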

In: Seeing and Perceiving

The brain should integrate sensory inputs only when they emanate from a common source and segregate those from different sources. Sensory correspondences are important cues that inform the brain whether two sensory inputs are generated by a common event and should hence be integrated. Most prominently, sensory inputs should co-occur in time and space. More complex audiovisual stimuli may also be congruent in terms of semantics (e.g., objects and their source sounds) or phonology (e.g., spoken and written words, linked via common linguistic labels). Surprisingly, metaphoric relations (e.g., pitch and height) have also been shown to influence audiovisual integration. The neural mechanisms that mediate these metaphoric congruency effects are only poorly understood. They may be mediated via (i) natural multisensory binding, (ii) common linguistic labels, or (iii) semantics. In this talk, we will present a series of studies that investigate whether these different types of audiovisual correspondences are processed by distinct neural systems. Further, we investigate how those systems are engaged by metaphoric audiovisual correspondences. Our results demonstrate that different classes of audiovisual correspondences influence multisensory integration at distinct levels of the cortical hierarchy. Spatiotemporal incongruency is detected already at the primary cortical level. Natural (e.g., motion direction) and phonological incongruency influence multisensory integration in areas involved in motion or phonological processing. Critically, metaphoric interactions emerge in neural systems that are shared with natural and semantic incongruency. This activation pattern may reflect the ambivalent nature of metaphoric audiovisual interactions, which rely on both natural and semantic correspondences.

In: Seeing and Perceiving