Search Results

Showing 1–10 of 18 items for

  • Author or Editor: Michael Barnett-Cowan
  • Search level: All

Multisensory stimuli originating from the same event can be perceived asynchronously due to differential physical and neural delays. The transduction of and physiological responses to vestibular stimulation are extremely fast, suggesting that other stimuli need to be presented prior to vestibular stimulation in order to be perceived as simultaneous. There is, however, a recent and growing body of evidence indicating that the perceived onset of vestibular stimulation is slow compared to the other senses, such that vestibular stimuli need to be presented prior to other sensory stimuli in order to be perceived synchronously. Following a review of this literature, I will argue that this perceived latency of vestibular stimulation likely reflects the fact that vestibular stimulation is most often associated with sensory events that occur following head movement, that the vestibular system rarely works alone, and that the brain prioritizes physiological response to vestibular stimulation over perceptual awareness of stimulation onset.

In: Seeing and Perceiving

Multisensory stimuli originating from the same event can be perceived asynchronously due to differential physical and neural delays. The transduction of and physiological responses to vestibular stimulation are extremely fast, suggesting that other stimuli need to be presented prior to vestibular stimulation in order to be perceived as simultaneous. There is, however, a recent and growing body of evidence indicating that the perceived onset of vestibular stimulation is slow compared to the other senses, such that vestibular stimuli need to be presented prior to other sensory stimuli in order to be perceived synchronously. From a review of this literature, it is speculated that this perceived latency of vestibular stimulation may reflect the fact that vestibular stimulation is most often associated with sensory events that occur following head movement, that the vestibular system rarely works alone, that additional computations are required for processing vestibular information, and that the brain prioritizes physiological response to vestibular stimulation over perceptual awareness of stimulation onset. Empirical investigation of these theoretical predictions is encouraged in order to fully understand this surprising result and its implications, and to advance the field.

In: Multisensory Research

Abstract

Humans are constantly presented with rich sensory information that the central nervous system (CNS) must process to form a coherent perception of the self and its relation to its surroundings. While the CNS is efficient in processing multisensory information in natural environments, virtual reality (VR) poses challenges of temporal discrepancies that the CNS must solve. These temporal discrepancies between information from different sensory modalities lead to inconsistencies in perception of the virtual environment, which often causes cybersickness. Here, we investigate whether individual differences in the perceived relative timing of sensory events, specifically parameters of temporal-order judgement (TOJ), can predict cybersickness. Study 1 examined audiovisual (AV) TOJs while Study 2 examined audio-active head movement (AAHM) TOJs. We deduced metrics of the temporal binding window (TBW) and point of subjective simultaneity (PSS) for a total of 50 participants. Cybersickness was quantified using the Simulator Sickness Questionnaire (SSQ). Study 1 results (correlations and multiple regression) show that the oculomotor SSQ shares a significant positive correlation with AV PSS and TBW. While there is a positive correlation between the total SSQ scores and the TBW and PSS, these correlations are not significant. Although these results are promising, we did not find the same effect for AAHM TBW and PSS. We conclude that AV TOJ may serve as a potential tool to predict cybersickness in VR. Such findings will generate a better understanding of cybersickness, which can be used in the development of VR to help mitigate discomfort and maximize adoption.

In: Multisensory Research
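The TBW and PSS metrics referenced in the abstract above are conventionally derived by fitting a psychometric function to temporal-order judgements across stimulus-onset asynchronies (SOAs). The following is a minimal illustrative sketch using synthetic data and a simple logit-line fit; it is not the analysis pipeline from the study, and all parameter values are hypothetical.

```python
import numpy as np

def fit_toj(soas, p_first):
    """Fit logit(p) = a + b * soa by least squares (illustrative only).

    soas    -- stimulus-onset asynchronies in ms
    p_first -- proportion of 'modality A first' responses at each SOA
    """
    logits = np.log(p_first / (1 - p_first))
    b, a = np.polyfit(soas, logits, 1)
    pss = -a / b                 # SOA where p = 0.5 (point of subjective simultaneity)
    tbw = 2 * np.log(3) / b      # SOA range between the 25% and 75% response points
    return pss, tbw

# Synthetic observer: true PSS of -40 ms, logistic scale of 30 ms (hypothetical values).
soas = np.linspace(-200, 200, 9)
p = 1 / (1 + np.exp(-(soas - (-40.0)) / 30.0))
pss, tbw = fit_toj(soas, p)
print(round(pss, 1), round(tbw, 1))  # -40.0 65.9
```

With noise-free logistic data the fit recovers the generating parameters exactly; real response proportions would be noisy, and a maximum-likelihood fit (e.g., a cumulative Gaussian or logistic via `scipy.optimize`) is the more common choice in practice.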

Abstract

Integration of incoming sensory signals from multiple modalities is central in the determination of self-motion perception. With the emergence of consumer virtual reality (VR), it is becoming increasingly common to experience a mismatch in sensory feedback regarding motion when using immersive displays. In this study, we explored whether introducing various discrepancies between vestibular and visual motion would influence the perceived timing of self-motion. Participants performed a series of temporal-order judgements between an auditory tone and a passive whole-body rotation on a motion platform accompanied by visual feedback using a virtual environment generated through a head-mounted display. Sensory conflict was induced by altering the speed and direction by which the movement of the visual scene updated relative to the observer’s physical rotation. There were no differences in perceived timing of the rotation without vision, with congruent visual feedback, or when the visual motion updated more slowly. However, the perceived timing was significantly further from zero when the direction of the visual motion was incongruent with the rotation. These findings demonstrate the potential interaction between visual and vestibular signals in the temporal perception of self-motion. Additionally, we recorded cybersickness ratings and found that sickness severity was significantly greater when visual motion was present and incongruent with the physical motion. This supports previous research regarding cybersickness and the sensory conflict theory, where a mismatch between the visual and vestibular signals may lead to a greater likelihood for the occurrence of sickness symptoms.

In: Multisensory Research

Abstract

Previous studies have found that semantics, the higher-level meaning of stimuli, can impact multisensory integration; however, less is known about the effect of valence, an affective response to stimuli. This study investigated the effects of both semantic congruency and valence of non-speech audiovisual stimuli on multisensory integration via response time (RT) and temporal-order judgement (TOJ) tasks [assessing processing speed (RT), Point of Subjective Simultaneity (PSS), and the time window within which multisensory stimuli are likely to be perceived as simultaneous (temporal binding window; TBW)]. Through an online study with 40 participants (mean age: 26.25 years; females = 17), we found that both congruence and valence had a significant main effect on RT (congruency and positive valence decrease RT) and an interaction effect (the congruent/positive valence condition being significantly faster than all others). For TOJ, there was a significant main effect of valence and a significant interaction effect where positive valence (compared to negative valence) and the congruent/positive condition (compared to all other conditions) required visual stimuli to be presented significantly earlier than auditory stimuli to be perceived as simultaneous. A subsequent analysis showed a positive correlation between TBW width and RT (as the TBW widens, RT increases) for the categories that were furthest from true simultaneity in their PSS (Congruent/Positive and Incongruent/Negative). This study provides new evidence that supports previous research on semantic congruency and presents a novel incorporation of valence into behavioural responses.

Full Access
In: Multisensory Research

The perception of simultaneity between auditory and vestibular information is crucially important for maintaining a coherent representation of the acoustic environment whenever the head moves. Yet, despite similar transduction latencies, vestibular stimuli are perceived significantly later than auditory stimuli when simultaneously generated (Barnett-Cowan and Harris). However, these studies paired a vestibular stimulation of long duration (∼1 s) and of a continuously changing temporal envelope with brief (10–50 ms) sound pulses. In the present study the stimuli were matched for temporal envelope. Participants judged the temporal order of the onset of an active head movement and of brief (50 ms) or long (1400 ms) sounds with a square or raised-cosine shaped envelope. Consistent with previous reports, head movement onset had to precede the onset of a brief sound by about 73 ms in order to be perceived as simultaneous. Head movements paired with long square sounds (∼100 ms) were not significantly different from those paired with brief sounds. Surprisingly, head movements paired with long raised-cosine sounds (∼115 ms) had to be presented even earlier than brief stimuli. This additional lead time could not be accounted for by differences in the comparison stimulus characteristics (duration and temporal envelope). Rather, differences among sound conditions were found to be attributable to variability in the time for the head movement to reach peak velocity: the head moved faster when paired with a brief sound. The persistent lead time required for vestibular stimulation provides further evidence that the perceptual latency of vestibular stimulation is larger compared to auditory stimuli.

In: Seeing and Perceiving

The orientation at which objects are most easily recognized — the perceptual upright (PU) — is influenced by body orientation with respect to gravity. To date, the influence of these cues on object recognition has only been measured within the visual system. Here we investigate whether objects explored through touch alone are similarly influenced by body and gravitational information. Using the Oriented CHAracter Recognition Test (OCHART) adapted for haptics, blindfolded right-handed observers indicated whether the symbol ‘p’ presented in various orientations was the letter ‘p’ or ‘d’ following active touch. The average of ‘p-to-d’ and ‘d-to-p’ transitions was taken as the haptic PU. Sensory information was manipulated by positioning observers in different orientations relative to gravity with the head, body, and hand aligned. Results show that haptic object recognition is equally influenced by body and gravitational reference frames, but with a constant leftward bias. This leftward bias in the haptic PU resembles leftward biases reported for visual object recognition. The influence of body orientation and gravity on the haptic PU was well predicted by an equally weighted vectorial sum of the directions indicated by these cues. Our results demonstrate that information from different reference frames influences the perceptual upright in haptic object recognition. Taken together with similar investigations in vision, our findings suggest that reliance on body and gravitational frames of reference helps maintain optimal object recognition. Equally relying on body and gravitational information may facilitate haptic exploration with an upright posture, while compensating for poor vestibular sensitivity when tilted.

In: Seeing and Perceiving

The restricted operational space of dynamic driving simulators requires the implementation of motion cueing algorithms that tilt the simulator cabin to reproduce sustained accelerations. In order to avoid conflicting inertial cues, the tilt rate is limited below drivers’ perceptual thresholds, which are typically derived from the results of classical vestibular research, where additional sensory cues to self-motion are removed. These limits might be too conservative for an ecological driving simulation, which provides a variety of complex visual and vestibular cues as well as demands of attention which vary with task difficulty. We measured roll rate detection threshold in active driving simulation, where visual and vestibular stimuli are provided as well as increased cognitive load from the driving task. Here thresholds during active driving are compared with tilt rate detection thresholds found in the literature (passive thresholds) to assess the effect of the driving task. In a second experiment, these thresholds (active versus passive) are related to driving preferences in a slalom driving course in order to determine which roll rate values are most appropriate for driving simulators so as to present the most realistic driving experience. The results show that detection threshold for roll in an active driving task is significantly higher than the limits currently used in motion cueing algorithms, suggesting that higher tilt limits can be successfully implemented to better optimize simulator operational space. Supra-threshold roll rates in the slalom task are also rated as more realistic. Overall, our findings indicate that increasing task complexity in driving simulation can decrease motion sensitivity allowing for further expansion of the virtual workspace environment.

In: Seeing and Perceiving

Abstract

A single bout of aerobic exercise is related to positive changes in higher-order cognitive function among older adults; however, the impact of aerobic exercise on multisensory processing remains unclear. Here we assessed the effects of a single bout of aerobic exercise on commonly utilized tasks that measure audiovisual multisensory processing: response time (RT), simultaneity judgements (SJ), and temporal-order judgements (TOJ), in a pilot study. To our knowledge this is the first effort to investigate the effects of three well-controlled intervention conditions on multisensory processing: resting, completing a cognitively demanding task, and performing aerobic exercise for 20 minutes. Our results indicate that the window of time within which stimuli from different modalities are integrated and perceived as simultaneous (temporal binding window; TBW) is malleable and changes after each intervention condition for both the SJ and TOJ tasks. Specifically, the TBW consistently became narrower post exercise while consistently increasing in width post rest, suggesting that aerobic exercise may improve temporal perception precision via broad neural change rather than by targeting the specific networks that subserve either the SJ or TOJ tasks individually. The results from the RT task further support our findings of malleability of the multisensory processing system, as changes in performance, as assessed through cumulative probability models, were observed after each intervention condition. An increase in integration (i.e., a greater magnitude of the multisensory effect), however, was found only after a single bout of aerobic exercise. Overall, our results indicate that exercise uniquely affects the central nervous system and may broadly affect multisensory processing.

Full Access
In: Multisensory Research
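The cumulative-probability assessment of RT performance mentioned in the abstract above is commonly operationalized, in multisensory RT research generally, as a race-model analysis: the audiovisual RT distribution is compared against the bound implied by the two unisensory distributions, and exceeding that bound indicates integration. The sketch below uses synthetic data and Miller's race-model inequality as an assumed stand-in, since the abstract does not specify the exact model; all RT parameters are hypothetical.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of response times on a time grid."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / len(rts)

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Miller's bound: P(AV <= t) should not exceed P(A <= t) + P(V <= t)
    under a race (no-integration) account. Positive values indicate violation."""
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

# Toy data: hypothetical RTs in ms, with a redundancy gain built in.
rng = np.random.default_rng(0)
rt_a = rng.normal(300, 40, 200)    # auditory-only trials
rt_v = rng.normal(320, 40, 200)    # visual-only trials
rt_av = rng.normal(240, 30, 200)   # audiovisual trials, faster than either
t = np.linspace(150, 450, 61)
print(race_model_violation(rt_a, rt_v, rt_av, t).max() > 0)  # True
```

A positive maximum violation here reflects the built-in redundancy gain in the synthetic audiovisual RTs; with real data the violation would be evaluated statistically across participants rather than read off a single sample.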