Search Results

You are looking at 1 - 2 of 2 items for

  • Author or Editor: Nicolas Davidenko
  • Search level: All

Abstract

Virtual-reality (VR) users and developers have informally reported that time seems to pass more quickly while playing games in VR. We refer to this phenomenon as time compression: a longer real duration is compressed into a shorter perceived experience. To investigate this effect, we created two versions of a labyrinth-like game that were identical in content and mode of control but differed in display type: one was designed to be played in VR, the other on a conventional monitor (CM). Participants estimated time prospectively using an interval production method: they played each version of the game for a perceived five-minute interval, and the actual durations of the intervals they produced were compared between display conditions. We found that in the first block, participants in the VR condition played for an average of 72.6 more seconds than participants in the CM condition before feeling that five minutes had passed; in other words, perceived five-minute intervals in VR contained 28.5% more actual time than perceived five-minute intervals in CM. However, the effect appeared to reverse in the second block, when participants switched display conditions, suggesting large novelty and anchoring effects and demonstrating the importance of using between-subjects designs in interval production experiments. Overall, our results suggest that VR displays do produce a significant time compression effect. We discuss a VR-induced reduction in bodily awareness as a potential explanation for how this effect is mediated, and we outline implications and suggestions for follow-up experiments.

Open Access
In: Timing & Time Perception
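
A note on the arithmetic in the abstract above: the mean produced durations are not reported there, but they can be back-calculated from the stated figures, assuming the 28.5% is the ratio of the two group means and letting \bar{t}_{\mathrm{CM}} and \bar{t}_{\mathrm{VR}} denote the mean produced interval in each condition (notation introduced here for illustration, not taken from the abstract):

\[
\bar{t}_{\mathrm{CM}} \approx \frac{72.6\ \text{s}}{0.285} \approx 255\ \text{s}, \qquad
\bar{t}_{\mathrm{VR}} \approx 255\ \text{s} + 72.6\ \text{s} \approx 327\ \text{s}.
\]

On these rough figures, a perceived five minutes corresponded to about 4 min 15 s of real time on the conventional monitor and about 5 min 27 s in VR.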

Abstract

We propose that cross-sensory stimuli presenting a positive attributable source of an aversive sound can modulate negative reactions to that sound. In Experiment 1, participants rated original video sources (OVS) of eight aversive sounds (e.g., nails scratching a chalkboard) as more aversive than eight positive attributable video sources (PAVS) of those same sounds (e.g., someone playing a flute) when the videos were presented silently. In Experiment 2, new participants were presented with those eight aversive sounds in three blocks. In Blocks 1 and 3, the sounds were presented alone; in Block 2, four of the sounds, chosen at random, were presented concurrently with their corresponding OVS videos and the other four with their corresponding PAVS videos. Participants rated each sound, presented with or without video, on three scales: discomfort, unpleasantness, and bodily sensations. We found that concurrent video presentation robustly modulated participants’ reactions to the sounds: compared to the sounds alone (Block 1), concurrent PAVS videos significantly reduced negative reactions and concurrent OVS videos significantly increased them, across all three scales. These effects, however, did not persist into Block 3, when the sounds were presented alone again. Our results provide novel evidence that negative reactions to aversive sounds can be modulated through cross-sensory temporal syncing with a positive attributable video source. Although this research was conducted with a neurotypical population, we argue that our findings have implications for the treatment of misophonia.

In: Multisensory Research