Search Results

You are looking at 1 - 10 of 18 items for

  • Author or Editor: Michael Barnett-Cowan

Multisensory stimuli originating from the same event can be perceived asynchronously due to differential physical and neural delays. The transduction of and physiological responses to vestibular stimulation are extremely fast, suggesting that other stimuli need to be presented prior to vestibular stimulation in order to be perceived as simultaneous. There is, however, a recent and growing body of evidence indicating that the perceived onset of vestibular stimulation is slow compared to the other senses, such that vestibular stimuli need to be presented prior to other sensory stimuli in order to be perceived synchronously. Following a review of this literature, I will argue that this perceived latency of vestibular stimulation likely reflects the fact that vestibular stimulation is most often associated with sensory events that occur following head movement, that the vestibular system rarely works alone, and that the brain prioritizes physiological responses to vestibular stimulation over perceptual awareness of stimulation onset.

In: Seeing and Perceiving

Multisensory stimuli originating from the same event can be perceived asynchronously due to differential physical and neural delays. The transduction of and physiological responses to vestibular stimulation are extremely fast, suggesting that other stimuli need to be presented prior to vestibular stimulation in order to be perceived as simultaneous. There is, however, a recent and growing body of evidence indicating that the perceived onset of vestibular stimulation is slow compared to the other senses, such that vestibular stimuli need to be presented prior to other sensory stimuli in order to be perceived synchronously. From a review of this literature, it is speculated that this perceived latency of vestibular stimulation may reflect the fact that vestibular stimulation is most often associated with sensory events that occur following head movement, that the vestibular system rarely works alone, that additional computations are required for processing vestibular information, and that the brain prioritizes physiological responses to vestibular stimulation over perceptual awareness of stimulation onset. Empirical investigation of these theoretical predictions is encouraged in order to fully understand this surprising result and its implications, and to advance the field.

In: Multisensory Research

Abstract

Humans are constantly presented with rich sensory information that the central nervous system (CNS) must process to form a coherent perception of the self and its relation to its surroundings. While the CNS efficiently processes multisensory information in natural environments, virtual reality (VR) introduces temporal discrepancies that the CNS must resolve. These temporal discrepancies between information from different sensory modalities lead to inconsistencies in the perception of the virtual environment, which often cause cybersickness. Here, we investigate whether individual differences in the perceived relative timing of sensory events, specifically parameters of temporal-order judgement (TOJ), can predict cybersickness. Study 1 examined audiovisual (AV) TOJs while Study 2 examined audio-active head movement (AAHM) TOJs. We derived metrics of the temporal binding window (TBW) and point of subjective simultaneity (PSS) for a total of 50 participants. Cybersickness was quantified using the Simulator Sickness Questionnaire (SSQ). Study 1 results (correlations and multiple regression) show that the oculomotor SSQ subscale correlates significantly and positively with the AV PSS and TBW. The total SSQ scores also correlate positively with the TBW and PSS, but these correlations are not significant. Although these results are promising, we did not find the same effects for the AAHM TBW and PSS in Study 2. We conclude that AV TOJ may serve as a potential tool to predict cybersickness in VR. Such findings will support a better understanding of cybersickness, which can be used in the development of VR to help mitigate discomfort and maximize adoption.

In: Multisensory Research
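The PSS and TBW metrics described in the abstract above are typically estimated by fitting a psychometric function to the TOJ responses. The following sketch assumes a cumulative-Gaussian fit and a 25%-75% width definition of the TBW, both common conventions that the abstract does not specify; the SOAs and response proportions are illustrative, not the study's data.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Stimulus-onset asynchronies (ms); negative values = audio presented first.
soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300])
# Proportion of "visual first" responses at each SOA (illustrative values).
p_visual_first = np.array([0.02, 0.08, 0.20, 0.35, 0.55, 0.70, 0.85, 0.95, 0.99])

def cum_gauss(x, mu, sigma):
    # Cumulative Gaussian: mu approximates the PSS, sigma the temporal sensitivity.
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, soa, p_visual_first, p0=[0.0, 100.0])

pss = mu                                          # point of subjective simultaneity (ms)
tbw = (norm.ppf(0.75) - norm.ppf(0.25)) * sigma   # 25%-75% width of the fitted curve (ms)
print(f"PSS = {pss:.1f} ms, TBW = {tbw:.1f} ms")

Per-participant estimates obtained this way can then be entered into the correlations and regressions against SSQ scores described above.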

Abstract

Integration of incoming sensory signals from multiple modalities is central to the determination of self-motion perception. With the emergence of consumer virtual reality (VR), it is becoming increasingly common to experience a mismatch in sensory feedback regarding motion when using immersive displays. In this study, we explored whether introducing various discrepancies between vestibular and visual motion would influence the perceived timing of self-motion. Participants performed a series of temporal-order judgements between an auditory tone and a passive whole-body rotation on a motion platform, accompanied by visual feedback from a virtual environment presented through a head-mounted display. Sensory conflict was induced by altering the speed and direction with which the visual scene updated relative to the observer’s physical rotation. There were no differences in the perceived timing of the rotation between the conditions without vision, with congruent visual feedback, and with slower updating of the visual motion. However, the perceived timing was significantly further from zero when the direction of the visual motion was incongruent with the rotation. These findings demonstrate a potential interaction between visual and vestibular signals in the temporal perception of self-motion. Additionally, we recorded cybersickness ratings and found that sickness severity was significantly greater when visual motion was present and incongruent with the physical motion. This supports previous research on cybersickness and sensory conflict theory, in which a mismatch between visual and vestibular signals may increase the likelihood of sickness symptoms.

In: Multisensory Research
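One way to test the condition effect described above is to compare per-participant PSS estimates between visual-feedback conditions. The sketch below uses a paired t-test on hypothetical PSS values for two of the conditions; the abstract does not state which statistical test was used, and with more than two conditions a repeated-measures ANOVA would be the more typical choice.

import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-participant PSS estimates (ms); more negative values mean the
# rotation had to start earlier relative to the tone to be perceived as simultaneous.
pss_congruent = np.array([-60, -75, -50, -80, -65, -70, -55, -90])
pss_incongruent_direction = np.array([-95, -110, -85, -120, -100, -105, -90, -130])

t, p = ttest_rel(pss_congruent, pss_incongruent_direction)
print(f"Congruent vs. direction-incongruent PSS: t = {t:.2f}, p = {p:.3f}")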

Abstract

A single bout of aerobic exercise is related to positive changes in higher-order cognitive function among older adults; however, the impact of aerobic exercise on multisensory processing remains unclear. In a pilot study, we assessed the effects of a single bout of aerobic exercise on commonly used tasks that measure audiovisual multisensory processing: response time (RT), simultaneity judgements (SJ), and temporal-order judgements (TOJ). To our knowledge, this is the first effort to investigate the effects of three well-controlled intervention conditions on multisensory processing: resting, completing a cognitively demanding task, and performing aerobic exercise for 20 minutes. Our results indicate that the window of time within which stimuli from different modalities are integrated and perceived as simultaneous (temporal binding window; TBW) is malleable and changes after each intervention condition for both the SJ and TOJ tasks. Specifically, the TBW consistently became narrower after exercise and consistently wider after rest, suggesting that aerobic exercise may improve the precision of temporal perception via broad neural change rather than by targeting the specific networks that subserve the SJ or TOJ tasks individually. The results from the RT task further support the malleability of the multisensory processing system, as changes in performance, assessed through cumulative probability models, were observed after each intervention condition. An increase in integration (i.e., a greater magnitude of the multisensory effect), however, was found only after a single bout of aerobic exercise. Overall, our results indicate that exercise uniquely affects the central nervous system and may broadly affect multisensory processing.

In: Multisensory Research
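The "cumulative probability models" mentioned above are commonly implemented by comparing empirical cumulative distribution functions (CDFs) of response times. The sketch below illustrates one such approach, a race-model inequality check, using synthetic RTs; it is offered only to make the idea concrete and is not claimed to reproduce the study's analysis.

import numpy as np

def empirical_cdf(rts, grid):
    # Proportion of RTs at or below each time point on the grid.
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, grid, side="right") / len(rts)

# Hypothetical RTs (ms) for auditory, visual and audiovisual trials.
rt_audio = np.random.default_rng(1).normal(320, 40, 200)
rt_visual = np.random.default_rng(2).normal(300, 40, 200)
rt_audiovisual = np.random.default_rng(3).normal(265, 35, 200)

grid = np.linspace(150, 500, 50)
cdf_av = empirical_cdf(rt_audiovisual, grid)
race_bound = np.minimum(empirical_cdf(rt_audio, grid) + empirical_cdf(rt_visual, grid), 1.0)

# Positive values indicate that the audiovisual CDF exceeds the race-model bound,
# i.e., facilitation beyond what independent unisensory channels would predict.
violation = np.maximum(cdf_av - race_bound, 0.0)
print(f"Maximum race-model violation: {violation.max():.3f}")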

The perception of simultaneity between auditory and vestibular information is crucially important for maintaining a coherent representation of the acoustic environment whenever the head moves. Yet, despite similar transduction latencies, vestibular stimuli are perceived significantly later than auditory stimuli when the two are generated simultaneously (Barnett-Cowan and Harris, 2009, 2011). However, these studies paired brief (10–50 ms) sound pulses with a vestibular stimulus that was long in duration (∼1 s) and had a continuously changing temporal envelope. In the present study the stimuli were matched for temporal envelope. Participants judged the temporal order of the onset of an active head movement and of brief (50 ms) or long (1400 ms) sounds with a square or raised-cosine shaped envelope. Consistent with previous reports, head movement onset had to precede the onset of a brief sound by about 73 ms in order to be perceived as simultaneous. The lead time required for head movements paired with long square-wave sounds (∼100 ms) did not differ significantly from that for brief sounds. Surprisingly, head movements paired with long raised-cosine sounds (∼115 ms) had to be presented even earlier than for brief stimuli. This additional lead time could not be accounted for by differences in the comparison stimulus characteristics (duration and temporal envelope). Rather, differences among sound conditions were attributable to variability in the time for the head movement to reach peak velocity: the head moved faster when paired with a brief sound. The persistent lead time required for vestibular stimulation provides further evidence that the perceptual latency of vestibular stimulation is longer than that of auditory stimuli.

In: Seeing and Perceiving
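The abstract above attributes differences among sound conditions to variability in the time taken for the head to reach peak velocity. The sketch below shows one way such a kinematic measure could be computed from a head angular-velocity trace; the sampling rate, the 10 deg/s onset threshold, and the synthetic velocity profile are illustrative assumptions, not the study's method.

import numpy as np

fs = 250.0                       # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic bell-shaped velocity profile (deg/s) standing in for a recorded trace.
velocity = 120.0 * np.exp(-((t - 0.35) ** 2) / (2 * 0.08 ** 2))

onset_idx = np.argmax(velocity > 10.0)   # first sample above the onset threshold
peak_idx = int(np.argmax(velocity))      # sample index of peak angular velocity

time_to_peak = (peak_idx - onset_idx) / fs
print(f"Time to peak head velocity: {time_to_peak * 1000:.0f} ms")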

Abstract

The integration of vestibular, visual and body cues is a fundamental process in the perception of self-motion and is commonly experienced in an upright posture. However, when the body is tilted into an off-vertical orientation, these signals are no longer aligned relative to the influence of gravity. In this study, the perceived timing of visual motion was examined in the presence of sensory conflict introduced by manipulating the orientation of the body, generating a mismatch between body and vestibular cues due to gravity and creating an ambiguous vestibular signal of either head tilt or translation. In a series of temporal-order judgement tasks, participants reported the perceived onset of a visual scene simulating rotation around the yaw axis, presented in virtual reality and paired with an auditory tone, while in an upright, supine or side-recumbent body position. The results revealed that the perceived onset of visual motion was delayed by approximately an additional 30 ms from zero (i.e., true simultaneity between visual onset and the reference auditory tone) when viewed in a supine or side-recumbent orientation compared to an upright posture. There were also no significant differences in the timing estimates of the visual motion among the non-upright orientations. This indicates that the perceived timing of visual motion is negatively affected by conflict between the vestibular and body signals due to the direction of gravity and body orientation, even when the mismatch is not in the plane of the axis of rotation.

In: Multisensory Research

Abstract

Previous studies have found that semantics, the higher-level meaning of stimuli, can impact multisensory integration; however, less is known about the effect of valence, an affective response to stimuli. This study investigated the effects of both the semantic congruency and the valence of non-speech audiovisual stimuli on multisensory integration via response time (RT) and temporal-order judgement (TOJ) tasks [assessing processing speed (RT), the point of subjective simultaneity (PSS), and the time window within which multisensory stimuli are likely to be perceived as simultaneous (temporal binding window; TBW)]. In an online study with 40 participants (mean age: 26.25 years; 17 female), we found that both congruency and valence had significant main effects on RT (congruency and positive valence decrease RT) as well as an interaction effect (the congruent/positive condition was significantly faster than all others). For TOJ, there was a significant main effect of valence and a significant interaction effect, whereby positive valence (compared to negative valence) and the congruent/positive condition (compared to all other conditions) required visual stimuli to be presented significantly earlier than auditory stimuli in order to be perceived as simultaneous. A subsequent analysis showed a positive correlation between TBW width and RT (as the TBW widens, RT increases) for the categories whose PSS was furthest from true simultaneity (congruent/positive and incongruent/negative). This study provides new evidence that supports previous research on semantic congruency and presents a novel incorporation of valence into behavioural responses.

In: Multisensory Research
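The TBW-RT relationship reported above is an across-participant correlation. The sketch below shows how such a correlation could be computed, assuming Pearson's r (the abstract does not name the coefficient); the TBW widths and mean RTs are hypothetical.

import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant TBW widths and mean RTs (ms).
tbw_width_ms = np.array([180, 220, 150, 260, 300, 200, 240, 170, 280, 210])
mean_rt_ms = np.array([410, 450, 395, 470, 500, 430, 460, 405, 485, 440])

r, p = pearsonr(tbw_width_ms, mean_rt_ms)
print(f"TBW width vs. mean RT: r = {r:.2f}, p = {p:.3f}")  # wider TBW ~ slower RT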