Abstract
Despite technological advances in Virtual Reality (VR), users frequently experience nausea and disorientation, so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, so self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness. We evaluated whether a new Artificial Intelligence (AI) software designed to supplement the 360-degree VR experience with artificial six-degrees-of-freedom motion can reduce cybersickness. Explicit (Simulator Sickness Questionnaire and Fast Motion Sickness (FMS) rating) and implicit (heart rate) measures were used to evaluate cybersickness symptoms during and after 360-degree VR exposure. Simulator sickness scores showed a significant reduction in feelings of nausea during the AI-supplemented six-degrees-of-freedom motion VR compared to traditional 360-degree VR. However, six-degrees-of-freedom motion VR did not reduce oculomotor or disorientation measures of sickness, and no changes were observed in FMS or heart rate measures. Improving the congruency between visual and vestibular cues in 360-degree VR, as provided by the AI-supplemented six-degrees-of-freedom motion system evaluated here, is essential for a more engaging, immersive and safe VR experience, which is critical for educational, cultural and entertainment applications.
1. Introduction
As we move through the external environment, the vestibular system in the inner ear provides a continuous flow of information about the position of our head in three-dimensional space. The vestibular system comprises three roughly orthogonal semicircular canals (anterior, posterior and horizontal), which sense rotational acceleration of the head around the yaw, pitch and roll axes, and two otolith organs (utricle and saccule), which sense translational acceleration, including the orientation of the head relative to gravity. These signals are crucial for self-motion perception (Chen et al., 2008), confirming the essential vestibular contribution to the brain’s Global Positioning System.
Typically, vestibular inputs match optic flow information from the images moving across the retina. The brain optimally integrates vestibular and visual signals to create a coherent perception of the direction and speed of self-motion (Butler et al., 2010; Fetsch et al., 2009; Greenlee et al., 2016). However, there are circumstances in which visual and vestibular signals for self-motion may not match and may even conflict. This is the case in Virtual Reality (VR). The promise of VR has always been enormous, and the ability to immerse a user in a virtual environment through 3D real-time computer graphics and advanced head-mounted display devices has proved beneficial in a number of applications such as engineering, education and entertainment. VR has grown exponentially in popularity in recent years, and improvements in resolution, latency, flicker rate and motion tracking have enhanced the VR experience. However, a troubling problem remains: up to 80% of VR users experience debilitating symptoms of discomfort, disorientation, nausea, eyestrain, headaches and sweating, a malady called cybersickness (LaViola, 2000; Rebenitsch and Owen, 2016). Cybersickness is characterised by severe and frequent disorientation symptoms (dizziness, vertigo, and difficulty focusing), followed by nausea symptoms (stomach awareness, increased salivation, and nausea itself) and, least severely, oculomotor symptoms (eyestrain, headache, and blurred vision), the so-called D > N > O profile (Rebenitsch and Owen, 2016; Stanney and Kennedy, 1997). This symptom profile distinguishes cybersickness from other types of motion sickness, such as simulator sickness and sea sickness. Critically, cybersickness may hinder the immersive and interactive qualities of VR and its real-life applications (Yildirim, 2019).
Although its causes are not entirely clear, cybersickness may be due to a discrepancy between vestibular and visual information about the body’s orientation and self-motion (Gallagher and Ferrè, 2018; Reason, 1978; Reason and Brand, 1975). In typical VR scenarios, such as a VR driving simulator, the simulation provides accurate optic flow patterns of the road, surrounding buildings, people and other parts of the environment, which elicit visual information of self-motion (vection; So et al., 2001). The visual signals tell the brain that the user is moving with a certain speed and in a certain direction. However, since the user is not actually moving, the vestibular organs signal zero angular or translational acceleration. Thus, if the perception of vection is not corroborated by self-motion signals transmitted by the vestibular organs, a sensory conflict is likely to occur, and cybersickness may ensue. Accordingly, higher levels of cybersickness have been described in flight simulators and in VR games in which a high level of sensory conflict is present. However, other accounts of cybersickness have been suggested. For instance, it has been proposed that the provocative stimulus is the postural instability triggered in VR (Riccio and Stoffregen, 1991; Stoffregen et al., 2014). Additionally, the sensory conflict account has been refined by some to focus mainly on mismatches between the estimated and true environmental vestibular vertical (Bos et al., 2008). A common feature of these accounts seems to be that perceptual systems are highly reliable and accurate and VR technology challenges the senses and their integration.
Recent studies have shown promising results in reducing cybersickness by visuo-vestibular sensory re-coupling methods. Cybersickness improves when the absent vestibular information during VR simulated self-motion is mimicked by applying natural or artificial vestibular stimulation. For instance, using a rotational chair and timely coupling of physical movement with VR effectively reduced cybersickness symptoms (Ng et al., 2020). Similarly, cybersickness has been shown to improve with Galvanic Vestibular Stimulation (GVS) to artificially mimic vestibular cues in VR (Gallagher et al., 2020; Weech et al., 2018).
However, research has mainly focused on visuo-vestibular conflicts triggered by VR environments in which vestibular signals are absent, that is, when VR users feel the sensation of travelling through a virtual environment while actually remaining stationary in the real world. Little is known about visuo-vestibular conflicts in VR scenarios in which the user is allowed to move, such as in 360-degree VR. 360-degree VR refers to videos or photos captured using specialist omnidirectional cameras comprising multiple lenses that enable filming of a full 360-degree panoramic view. The user is therefore able to look around the entire scene and perceive an immersive experience. However, in almost all 360-degree VR experiences, movement in the real physical world is not reflected in the virtual world. In other words, current-generation 360-degree VR supports only three-degrees-of-freedom rotational motion: the virtual scene reflects changes in the rotation of the user’s head, but does not support translational motion. Thus, such VR lacks the six-degrees-of-freedom self-motion that we normally experience in the real world, provided by multisensory integration between dynamic visual and six-degrees-of-freedom vestibular cues, as well as by the within-channel integration between angular acceleration cues sensed by the vestibular semicircular canals and translational acceleration sensed by the otolith organs. Perhaps unsurprisingly, the sensory mismatch in traditional 360-degree VR may easily induce cybersickness symptoms. Many studies have reported physical symptoms such as headaches, focusing difficulties and dizziness during 360-degree VR viewing (Carnegie and Rhee, 2015; Elwardy et al., 2020; Kim et al., 2019a, 2019b; Padmanaban et al., 2018).
Here, we consider cybersickness in 360-degree head-mounted display VR. We systematically explored the cybersickness profile in settings where minimal dynamic visual inputs are combined with translational vestibular inputs. We also investigated the effectiveness of a new Artificial Intelligence (AI)-based software in reducing cybersickness. Where previous VR applications have been limited to three-degrees-of-freedom rotational motion, this software aims to reduce cybersickness by providing both rotational and translational motion. We predicted that this AI-based software would improve the congruency between visual and vestibular signals during VR exposure and enhance realism. This may have a potential impact on the immersive VR experience, with fewer unwanted side effects of cybersickness.
2. Methods
2.1. Ethics
The experimental protocol was approved by the ethics committee at Royal Holloway, University of London. The experiment was conducted in line with the Declaration of Helsinki. Written informed consent was obtained prior to commencing the experiment.
2.2. Participants
Twenty-five healthy right-handed participants volunteered for the study (19 women; mean age = 22.16 years, SD = 7.12 years). The sample size was estimated a priori based on similar experimental procedures (Gallagher et al., 2019, 2020), set in advance of testing and also used as the data-collection stopping rule. Participants with a history of neurological, psychiatric, vestibular or auditory disorders were excluded.
2.3. An AI-Based Software to Provide a Six-Degrees-of-Freedom VR Experience
An AI-based software solution, copernic360 (see Note 1), has been developed by Kagenova (Note 2) to provide six-degrees-of-freedom motion in 360-degree VR by simulating translational motion (McEwen et al., 2020). When using copernic360, the user is able to move within the 360-degree VR experience, e.g., by walking around or leaning forward to inspect the content in the scene, and a synthetic view is generated based on the user’s position in the scene. In this manner, full six-degrees-of-freedom motion, including both translational and rotational motion, is supported. The system is designed not only to provide a more realistic and immersive experience but also to reduce the mismatch between visual and vestibular cues in a 360-degree VR experience.
To achieve this effect, copernic360 includes two main subsystems: a back-end AI-based cloud processing system and a front-end viewer system in the form of a game engine plugin. The AI-based cloud system processes 360-degree VR video or photo content to recover an estimate of the 3D geometry representing the scene. First, AI-based depth estimation techniques are used to estimate a depth map corresponding to the 360-degree content. A residual convolutional neural network (CNN) architecture is adopted, with dilated convolutions (Yu and Koltun, 2016) to expand the receptive field for dense predictions such as depth estimation. The architecture is similar to that previously presented by Li and colleagues (Li et al., 2017), but adapted for spherical 360-degree content. Second, the estimated 360-degree depth information is used to fit a parametric 3D geometry representing the scene, which is stored as additional metadata associated with the original content. The geometry is fitted by minimising an error cost function. Third, the viewer system uses the metadata computed by the AI-based processing system, representing the parametric 3D geometry, along with the original 360-degree video or photo content, to render a 3D textured representation of the scene: the original 360-degree content is projected onto the 3D geometry to provide its texture. Finally, note that for 360-degree video content, the entire process is repeated for each key frame of the video. A key frame represents a frame where the content of the video has changed considerably compared to the prior frame (e.g., due to a cut between scenes or the content of a scene changing over time). Between key frames the 3D geometry is interpolated. In this manner, the user is able to move about the reconstructed scene with full six-degrees-of-freedom motion, and novel synthetic viewpoints are rendered and served to the user depending on their position in the scene.
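The between-key-frame interpolation step can be illustrated with a minimal sketch. This is not Kagenova's implementation: the parametric geometry is simplified here to a radial depth per sampled viewing direction, and all names are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class KeyFrameGeometry:
    """Parametric scene geometry fitted for one key frame.

    Simplified here to a radial depth per sampled viewing direction;
    the actual system fits a richer parametric 3D surface.
    """
    time: float          # timestamp of the key frame (seconds)
    radii: List[float]   # estimated scene depth along each direction (metres)

def interpolate_geometry(a: KeyFrameGeometry, b: KeyFrameGeometry,
                         t: float) -> List[float]:
    """Linearly blend the fitted geometry between two key frames.

    `t` is the playback time, with a.time <= t <= b.time.
    """
    if not (a.time < b.time and a.time <= t <= b.time):
        raise ValueError("t must lie within the key-frame interval")
    w = (t - a.time) / (b.time - a.time)
    return [(1 - w) * ra + w * rb for ra, rb in zip(a.radii, b.radii)]

# Geometry halfway between two key frames two seconds apart.
kf0 = KeyFrameGeometry(time=0.0, radii=[2.0, 4.0, 8.0])
kf1 = KeyFrameGeometry(time=2.0, radii=[3.0, 4.0, 6.0])
mid = interpolate_geometry(kf0, kf1, 1.0)
```

Linear blending of the fitted parameters avoids refitting geometry at every video frame while keeping the rendered viewpoint temporally smooth.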
A custom scenario was adapted for this VR experiment. The scenario consisted of a beach (https://www.atmosphaeres.com/video/445/Rocky+Headland) in which participants could move around for about 10 minutes. Natural sounds, such as the sound of waves, were integrated, along with an auditory cue played every 20 s. The auditory cue signalled participants to provide motion sickness scores and the researcher to record heart rate (HR) (see below).
2.4. Experimental Design and Procedure
Verbal and written instructions were given to participants at the beginning of the experiment. Data from each participant were gathered in two experimental sessions. Participants were exposed to the same scene in both the standard360-VR (three-degrees-of-freedom VR) and the copernic360-VR (six-degrees-of-freedom VR) conditions. The order of experimental conditions was counterbalanced between participants. In each session, participants wore an HTC Vive Pro Eye (HTC Corporation, Taoyuan, Taiwan) head-mounted display (HMD) and were asked to walk in a square pattern with a 90-degree turn in direction at each corner (Fig. 1A). Participants were trained on the walking pattern before the VR began. Occasionally, an auditory cue instructed participants to look down, explore the VR scene and touch the ground. It was anticipated that this would invoke greater nausea in the conventional three-degrees-of-freedom VR condition due to the mismatch of visuo-vestibular cues. A webcam was used to record the walking patterns as well as verbal responses.
During VR exposure, participants were asked to complete the Fast Motion Sickness Scale (FMS; Keshavarz and Hecht, 2011), giving a verbal rating of their level of nausea from 0 to 20 (0 = no nausea; 20 = frank sickness). Responses were collected every 20 s, when an auditory cue was played, before, during and immediately after VR exposure. Participants’ heart rate (HR) was also monitored throughout the VR scenario; HR has been shown to increase with greater levels of sickness in VR (Kim et al., 2005; Nalivaiko et al., 2015). Participants wore a smart watch (Mio ALPHA 2, Mio Technology, Taipei, Taiwan) which provided continuous readings of HR, with measurements recorded at the same time as the FMS ratings. Thus, HR was recorded once prior to commencing the VR scenario, every 20 s during the scenario, and once immediately following the scenario.
Participants were also asked to complete the Motion Sickness Susceptibility Questionnaire Short form (MSSQ) and a Gaming Experience Questionnaire to ensure that motion sickness susceptibility and previous VR experience were similar across our sample.
2.5. Data Analysis
Supporting data are available as Supplementary Material.
The data were analysed using parametric tests. Parametric tests are a preferred method in repeated-measures randomised designs in which the interest is in the effects of an intervention/treatment vs baseline/control (Vickers, 2005). Importantly, parametric tests generally have more statistical power than nonparametric tests, and are therefore more likely to detect a significant effect when one truly exists.
Paired t-tests were used to analyse differences between the standard360-VR (three-degrees-of-freedom VR) and copernic360-VR (six-degrees-of-freedom VR) conditions. t-Tests were applied to the Simulator Sickness Questionnaire (SSQ; Kennedy et al., 1993) subscales (SSQ-N, SSQ-D, SSQ-O) and total scores (SSQ-T).
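As a concrete illustration, the paired comparison reduces to a one-sample t-test on the within-participant difference scores. The sketch below uses plain Python and illustrative scores, not the study's actual data.

```python
import math
from typing import List, Tuple

def paired_t(x: List[float], y: List[float]) -> Tuple[float, int]:
    """Paired t-test: returns the t statistic and degrees of freedom (n - 1).

    Equivalent to a one-sample t-test on the within-participant
    differences d_i = x_i - y_i against a mean of zero.
    """
    if len(x) != len(y) or len(x) < 2:
        raise ValueError("need two equal-length samples with n >= 2")
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((di - mean_d) ** 2 for di in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Illustrative nausea scores for five participants (not the study's data):
standard360 = [30.0, 20.0, 40.0, 10.0, 25.0]
copernic360 = [29.0, 18.0, 37.0, 10.0, 21.0]
t, df = paired_t(standard360, copernic360)
```

The resulting t statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain a p value; in practice a statistics package performs both steps.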
2 × 3 repeated-measures ANOVAs were used to explore changes in the FMS and HR measures across time. Average and peak FMS values during VR exposure were entered into two distinct ANOVAs comparing differences in FMS scores between VR conditions (standard360-VR vs copernic360-VR) across time (pre VR, during VR and post VR). Similarly, average and peak HR values during VR exposure were entered into two distinct ANOVAs comparing differences in HR between VR conditions (standard360-VR vs copernic360-VR) across time (pre VR, during VR and post VR). Mauchly’s test was used to check whether the assumption of sphericity was violated, and where necessary the degrees of freedom were corrected using the Greenhouse–Geisser procedure.
3. Results
SSQ scores across VR experimental conditions can be seen in Fig. 1B.
A significant reduction in nausea on the SSQ scale was observed in the copernic360-VR (six-degrees-of-freedom VR) condition compared to standard360-VR (three-degrees-of-freedom VR) (
FMS scores across VR experimental conditions can be seen in Table 1. FMS scores showed no significant effect of VR condition on average FMS rating (
HR values across VR experimental conditions can be seen in Table 2. There was no significant effect of VR condition on average HR measures (
4. Discussion
Despite the improvements in VR hardware, cybersickness remains a predominant factor affecting the VR experience and the feeling of immersion. The mismatch between visuo-vestibular cues can induce compelling feelings of nausea and discomfort. In this study, we focused on a particular VR experience in which videos are captured using omnidirectional cameras that enable the filming of an entire 360-degree scene. In 360-degree VR the user is able to look and move around the entire scene. However, 360-degree VR supports only three-degrees-of-freedom rotational motion, which can easily conflict with the precise information transmitted by the vestibular organs as soon as the user translates their head or body. That is, traditional 360-degree VR is a unique setting offering minimal dynamic visual inputs combined with a flow of self-motion vestibular inputs. Here we investigated whether cybersickness in 360-degree VR can be reduced by a novel AI-based software (copernic360). Both explicit (SSQ and FMS) and implicit (HR) measures were used to quantify changes in cybersickness. Our results showed a significant reduction in nausea symptoms on the SSQ (Kennedy et al., 1993) when using copernic360-VR compared to traditional 360-degree VR. Importantly, the oculomotor and disorientation dimensions did not differ significantly between VR conditions. Nevertheless, the significant reduction in nausea symptoms demonstrates the potential of the copernic360 software to improve the VR experience. As anticipated, we observed an increase in motion sickness ratings and HR towards the end of VR exposure. While there were no significant differences between the copernic360-VR and traditional 360-degree VR conditions for these measures, the increase appeared smaller in copernic360-VR.
Cybersickness has often been assessed via subjective self-reports, of which the SSQ (Kennedy et al., 1993) is probably the most frequently used. This questionnaire breaks down motion sickness symptoms into three main categories: disorientation (including dizziness, vertigo and difficulty focusing), oculomotor (eyestrain, headache, and blurred vision) and nausea (stomach awareness, increased salivation, and nausea itself). In typical VR scenarios in which the user experiences vection while vestibular cues signal zero angular or translational acceleration, cybersickness is characterised by a D > N > O profile in which the most severe and frequent symptoms are observed in the disorientation domain, followed by nausea symptoms, and thirdly oculomotor symptoms (Rebenitsch and Owen, 2016; Stanney et al., 2003). This pattern of symptoms has been explained by a sensory mismatch between visual and vestibular cues in VR, which may lead to difficulties in forming and updating spatial maps of the external virtual environment and the organism (Lackner and DiZio, 2005), leaving the VR user disoriented. Similar to previous studies, here we observed an overall D > N > O profile across experimental conditions (standard360-VR and copernic360-VR). Although copernic360-VR numerically reduced sickness scores for disorientation, nausea and oculomotor symptoms, it only significantly improved the nauseogenic aspect of the 360-degree VR experience. Importantly, nausea is consistently experienced in 360-degree settings (Gavgani et al., 2017) and has been associated with significant user drop-out (Balk et al., 2013; Ehrlich and Kolasinski, 1998). The congruency between visual and vestibular signals induced by the copernic360-VR technology allowed for a reduction in self-reported nausea in the current study. Reducing this key symptom allows for a more immersive and interactive virtual experience.
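For reference, the SSQ subscale scores discussed throughout are obtained by multiplying the raw symptom sums by fixed weights (Kennedy et al., 1993). A minimal sketch of that standard weighting, with illustrative raw sums rather than participant data:

```python
def ssq_scores(raw_n: float, raw_o: float, raw_d: float) -> dict:
    """Weighted SSQ scores from raw symptom sums (Kennedy et al., 1993).

    raw_n, raw_o and raw_d are the summed 0-3 symptom ratings for the
    nausea, oculomotor and disorientation item clusters respectively
    (some symptoms load on more than one cluster).
    """
    return {
        "SSQ-N": raw_n * 9.54,
        "SSQ-O": raw_o * 7.58,
        "SSQ-D": raw_d * 13.92,
        "SSQ-T": (raw_n + raw_o + raw_d) * 3.74,
    }

# Illustrative participant: raw sums of 3 (nausea), 2 (oculomotor), 2 (disorientation).
scores = ssq_scores(3, 2, 2)
```

The differing weights mean that raw subscale sums are not directly comparable; the weighted scores place the three dimensions on a common scale before statistical comparison.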
This new technology may offer an alternative to physical setups, artificial stimulation and sensory habituation protocols (Ng et al., 2020).
Importantly, the scores obtained in the SSQ (Kennedy et al., 1993) for the standard 360-degree VR condition are comparable to those in previous studies. Nausea levels have been reported ranging from 14.91 to 30.21, oculomotor symptoms varied from 15.30 to 25.74, and disorientation was reported markedly higher, from 21.75 to 41.47 (Ehrlich and Kolasinski, 1998; Kolasinski and Gilson, 1998; Stanney and Kennedy, 1998). The current VR experience was characterised by a similar level of nauseogenicity. However, several factors influence cybersickness symptoms, and direct comparison between studies might not be straightforward. Not surprisingly, people susceptible to motion sickness are more likely to experience cybersickness (Rebenitsch and Owen, 2014). Symptom severity and incidence increase with age (Arns and Cerney, 2005). Some studies have indicated that females are more susceptible to cybersickness than males (Curry et al., 2020; Kim et al., 2020). Further, technical characteristics of the VR equipment, such as the weight of the headset, as well as the VR environment itself, might play a key role.
We measured cybersickness with FMS self-reports and HR during VR exposure. The FMS provides continuous measures of sickness during the presentation of a sickness-inducing situation, not necessarily cybersickness (Keshavarz and Hecht, 2011). As expected, we observed an increase in sickness scores over the ten minutes of VR exposure. However, the scores were not significantly influenced by the copernic360-VR technology. It is important to note that FMS scores were overall very low and several participants never rated above zero. Our VR scenario consisted of a realistic beach scene accompanied by the soft sound of waves. While multimodal exposure to visual and audio cues in VR may have increased the sense of presence, typically leading to more sickness discomfort (Cooper et al., 2018; Kim et al., 2014), the calming content of our scenario may have masked feelings of cybersickness (Amores et al., 2018). Accordingly, emotional and physical discomfort have been related to SSQ measures (Somrak et al., 2019). This might suggest that the VR scenario developed for this experiment was not particularly sickness-inducing or, more generally, that 360-degree VR is not as challenging as traditional VR. That is, the content of the VR may have reduced or counteracted feelings of cybersickness. Further, the VR exposure was markedly less disorientating than rollercoaster or first-person shooter scenarios, which may also explain the lack of significance found for the SSQ-D. Lastly, the duration of VR exposure could be extended: increased duration has been associated with cybersickness (Moss and Muth, 2011), with 10–20 min of exposure leading to the most compelling symptoms (Häkkinen et al., 2019). Future studies should consider using a more conflicting VR scenario involving longer duration and dynamic movements.
For instance, one could use a motion platform to dynamically change the tilt of the floor paired with a 360-degree video from aboard a ship. A longer duration with greater mismatch might allow the benefits of six-degrees-of-freedom VR to be tested more thoroughly.
While some studies have reported an increased HR in VR (Nalivaiko et al., 2015), others report conflicting findings. For instance, when participants were exposed to VR across three days, no significant differences were found in HR measures (Gavgani et al., 2017). This might suggest a form of habituation when participants are repeatedly exposed to VR over a period of time, which may have happened in our experiment. An effect of copernic360-VR on the HR measure of sickness could also have been masked by the content of the VR scenario (Guna et al., 2020). We cannot exclude that participants may have adapted to the walking pattern, perhaps influencing the saliency of vestibular cues during the VR, which might in turn have reduced sensory conflict and cybersickness. Along with diversifying the walking pattern to reduce the effects of habituation, more interaction with the virtual environment could be encouraged. Increased interaction through grasping movements or changes in head movement may induce a greater mismatch between visual and vestibular cues, leading to cybersickness. Thus, future studies should consider integrating head movements to increase the disorientation effects of VR, and the benefits of six-degrees-of-freedom VR should be compared to three-degrees-of-freedom VR.
When considering the benefits of copernic360-VR, only self-reported nausea was found to differ statistically. Nausea symptoms are one of the most challenging aspects of cybersickness. Importantly, our results showed that copernic360 significantly reduced the feeling of nausea. No differences were found in the other sickness domains or in physiological measures. Using a more conflicting VR scenario for an extended period of time is recommended. As far as we are aware, no other approaches have implemented an AI-based six-degrees-of-freedom technology to reduce cybersickness. Here, we offer new insights into its benefits and encourage future studies to explore the components of three-degrees-of-freedom VR vs six-degrees-of-freedom VR experiences.
As VR continues to improve, more needs to be done to combat the unwanted effects of cybersickness. This study demonstrated the potential of a novel AI-based software to increase the congruency between visual and vestibular cues. Crucially, feelings of self-reported nausea were reduced, implying weakened feelings of cybersickness. Reducing the nausea and discomfort perceived by individuals is crucial in allowing the application and further utility of VR in educational, medicinal, cultural and entertainment settings.
To whom correspondence should be addressed. E-mail: elisa.ferre@bbk.ac.uk
Acknowledgements
This work was supported by a UKRI StoryFutures R&D on Demand awarded to ERF.
Competing Interests
The authors declare that they had no conflicts of interest with respect to their authorship or the publication of this article.
Author Contributions
IA and ERF designed the experiment and wrote the manuscript. PDM, ME and JDM designed the VR technology and implemented the VR scenario. IA collected the data. JDM reviewed the manuscript, giving a critical revision of it. All authors gave their final approval to the version to be published.
Supplementary Material
Supplementary material is available online at: https://doi.org/10.6084/m9.figshare.17111417
References
Amores, J., Richer, R., Zhao, N., Maes, P. and Eskofier, B. M. (2018). Promoting relaxation using virtual reality, olfactory interfaces and wearable EEG, in: 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pp. 98–101. https://doi.org/10.1109/BSN.2018.8329668.
Arns, L. L. and Cerney, M. M. (2005). The relationship between age and incidence of cybersickness among immersive environment users, in: IEEE Proceedings Virtual Reality, 2005, pp. 267–268. https://doi.org/10.1109/VR.2005.1492788.
Balk, S. A., Bertola, M. A. and Inman, V. W. (2013). Simulator sickness questionnaire: twenty years later, in: Proceedings of the Seventh International Driving Symposium on Human Factors in Driver Assessment, pp. 257–263. https://doi.org/10.17077/drivingassessment.1498.
Bos, J. E., Bles, W. and Groen, E. L. (2008). A theory on visually induced motion sickness, Displays 29, 47–57. https://doi.org/10.1016/j.displa.2007.09.002.
Butler, J. S., Smith, S. T., Campos, J. L. and Bülthoff, H. H. (2010). Bayesian integration of visual and vestibular signals for heading, J. Vis. 10, 23. https://doi.org/10.1167/10.11.23.
Carnegie, K. and Rhee, T. (2015). Reducing visual discomfort with HMDs using dynamic depth of field, IEEE Comput. Graph. Appl. 35, 34–41. https://doi.org/10.1109/MCG.2015.98.
Chen, A., Gu, Y., Takahashi, K., Angelaki, D. E. and DeAngelis, G. C. (2008). Clustering of self-motion selectivity and visual response properties in macaque area MSTd, J. Neurophysiol. 100, 2669–2683. https://doi.org/10.1152/jn.90705.2008.
Cooper, N., Milella, F., Pinto, C., Cant, I., White, M. and Meyer, G. (2018). The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment, PLoS ONE 13, e0191846. https://doi.org/10.1371/journal.pone.0191846.
Curry, C., Li, R., Peterson, N. and Stoffregen, T. A. (2020). Cybersickness in virtual reality head-mounted displays: examining the influence of sex differences and vehicle control, Int. J. Hum. Comput. Interact. 36, 1161–1167. https://doi.org/10.1080/10447318.2020.1726108.
Ehrlich, J. A. and Kolasinski, E. M. (1998). A comparison of sickness symptoms between dropout and finishing participants in virtual environment studies, Proc. Hum. Factors Ergonom. Soc. 2, 1466–1470. https://doi.org/10.1177/154193129804202101.
Elwardy, M., Zepernick, H.-J., Hu, Y., Chu, T. M. C. and Sundstedt, V. (2020). Evaluation of simulator sickness for 360° videos on an HMD subject to participants’ experience with virtual reality, in: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VRW), pp. 477–484. https://doi.org/10.1109/VRW50115.2020.00100.
Fetsch, C. R., Turner, A. H., DeAngelis, G. C. and Angelaki, D. E. (2009). Dynamic reweighting of visual and vestibular cues during self-motion perception, J. Neurosci. 29, 15601–15612. https://doi.org/10.1523/JNEUROSCI.2574-09.2009.
Gallagher, M. and Ferrè, E. R. (2018). Cybersickness: a multisensory integration perspective, Multisens. Res. 31, 645–674. https://doi.org/10.1163/22134808-20181293.
Gallagher, M., Dowsett, R. and Ferrè, E. R. (2019). Vection in virtual reality modulates vestibular-evoked myogenic potentials, Eur. J. Neurosci. 50, 3557–3565. https://doi.org/10.1111/ejn.14499.
Gallagher, M., Choi, R. and Ferrè, E. R. (2020). Multisensory interactions in virtual reality: optic flow reduces vestibular sensitivity, but only for congruent planes of motion, Multisens. Res. 33, 625–644. https://doi.org/10.1163/22134808-20201487.
Gavgani, A. M., Nesbitt, K. V., Blackmore, K. L. and Nalivaiko, E. (2017). Profiling subjective symptoms and autonomic changes associated with cybersickness, Auton. Neurosci. 203, 41–50. https://doi.org/10.1016/j.autneu.2016.12.004.
Greenlee, M. W., Frank, S. M., Kaliuzhna, M., Blanke, O., Bremmer, F., Churan, J., Cuturi, L. F., MacNeilage, P. R. and Smith, A. T. (2016). Multisensory integration in self motion perception, Multisens. Res. 29, 525–556. https://doi.org/10.1163/22134808-00002527.
Guna, J., Geršak, G., Humar, I., Krebl, M., Orel, M., Lu, H. and Pogačnik, M. (2020). Virtual reality sickness and challenges behind different technology and content settings, Mob. Netw. Appl. 25, 1436–1445. https://doi.org/10.1007/s11036-019-01373-w.
Häkkinen, J., Ohta, F. and Kawai, T. (2019). Time course of sickness symptoms with HMD viewing of 360-degree videos, J. Imaging Sci. Technol. 62, 60403-1–60403-11. https://doi.org/10.2352/J.ImagingSci.Technol.2018.62.6.060403.
Kennedy, R. S., Lane, N. E., Berbaum, K. S. and Lilienthal, M. G. (1993). Simulator sickness questionnaire: an enhanced method for quantifying simulator sickness, Int. J. Aviat. Psychol. 3, 203–220. https://doi.org/10.1207/s15327108ijap0303_3.
Keshavarz, B. and Hecht, H. (2011). Validating an efficient method to quantify motion sickness, Hum. Factors 53, 415–426. https://doi.org/10.1177/0018720811403736.
Kim, H. G., Lim, H.-T., Lee, S. and Ro, Y. M. (2019b). VRSA net: VR sickness assessment considering exceptional motion for 360° VR video, IEEE Trans. Image Process. 28, 1646–1660. https://doi.org/10.1109/TIP.2018.2880509.
Kim, J., Luu, W. and Palmisano, S. (2020). Multisensory integration and the experience of scene instability, presence and cybersickness in virtual environments, Comput. Hum. Behav. 113, 106484. https://doi.org/10.1016/j.chb.2020.106484.
Kim, K., Rosenthal, M. Z., Zielinski, D. J. and Brady, R. (2014). Effects of virtual environment platforms on emotional responses, Comput. Methods Programs Biomed. 113, 882–893. https://doi.org/10.1016/j.cmpb.2013.12.024.
Kim, K., Lee, S., Kim, H. G., Park, M. and Ro, Y. M. (2019a). Deep objective assessment model based on spatio-temporal perception of 360-degree video for VR sickness prediction, in: Proc. Int. Conf. Image Process. (ICIP), pp. 3192–3196. https://doi.org/10.1109/ICIP.2019.8803257.
Kim, Y. Y., Kim, H. J., Kim, E. N., Ko, H. D. and Kim, H. T. (2005). Characteristic changes in the physiological components of cybersickness, Psychophysiology 42, 616–625. https://doi.org/10.1111/j.1469-8986.2005.00349.x.
Kolasinski, E. M. and Gilson, R. D. (1998). Simulator sickness and related findings in a virtual environment, Proc. Hum. Factors Ergonom. Soc. Annu. Meet. 42, 1511–1515. https://doi.org/10.1177/154193129804202110.
Lackner, J. R. and DiZio, P. (2005). Vestibular, proprioceptive, and haptic contributions to spatial orientation, Annu. Rev. Psychol. 56, 115–147. https://doi.org/10.1146/annurev.psych.55.090902.142023.
LaViola, J. J. (2000). A discussion of cybersickness in virtual environments, ACM SIGCHI Bull. 32, 47–56. https://doi.org/10.1145/333329.333344.
Li, B., Dai, Y., Chen, H. and He, M. (2017). Single image depth estimation by dilated deep residual convolutional neural network and soft-weight-sum inference.
McEwen, J. D., Wallis, C. G. R., Ender, M. and d’Avezac, M. (2020). Method and system for providing at least a portion of content having six degrees of freedom motion, UK Patent GB2575932.
Moss, J. D. and Muth, E. R. (2011). Characteristics of head-mounted displays and their effects on simulator sickness, Hum. Factors 53, 308–319. https://doi.org/10.1177/0018720811405196.
Nalivaiko, E., Davis, S. L., Blackmore, K. L., Vakulin, A. and Nesbitt, K. V. (2015). Cybersickness provoked by head-mounted display affects cutaneous vascular tone, heart rate and reaction time, Physiol. Behav. 151, 583–590. https://doi.org/10.1016/j.physbeh.2015.08.043.
Ng, A. K. T., Chan, L. K. Y. and Lau, H. Y. K. (2020). A study of cybersickness and sensory conflict theory using a motion-coupled virtual reality system, Displays 61, 101922. https://doi.org/10.1016/j.displa.2019.08.004.
Padmanaban, N., Ruban, T., Sitzmann, V., Norcia, A. M. and Wetzstein, G. (2018). Towards a machine-learning approach for sickness prediction in 360° stereoscopic videos, IEEE Trans. Vis. Comput. Graph. 24, 1594–1603. https://doi.org/10.1109/TVCG.2018.2793560.
Reason, J. T. (1978). Motion sickness adaptation: a neural mismatch model, J. R. Soc. Med. 71, 819–829.
Reason, J. T. and Brand, J. J. (1975). Motion Sickness. Academic Press, London, UK.
Rebenitsch, L. and Owen, C. (2014). Individual variation in susceptibility to cybersickness, in: UIST’14: Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, pp. 309–317. https://doi.org/10.1145/2642918.2647394.
Rebenitsch, L. and Owen, C. (2016). Review on cybersickness in applications and visual displays, Virt. Real. 20, 101–125. https://doi.org/10.1007/s10055-016-0285-9.
Riccio, G. E. and Stoffregen, T. (1991). An ecological theory of motion sickness and postural instability, Ecol. Psychol. 3, 195–240. https://doi.org/10.1207/s15326969eco0303_2.
So, R. H. Y., Ho, A. and Lo, W. T. (2001). A metric to quantify virtual scene movement for the study of cybersickness: definition, implementation, and verification, Presence (Cambridge, Mass.) 10, 193–215. https://doi.org/10.1162/105474601750216803.
Somrak, A., Humar, I., Hossain, M. S., Alhamid, M. F., Hossain, M. A. and Guna, J. (2019). Estimating VR sickness and user experience using different HMD technologies: an evaluation study, Future Gener. Comput. Syst. 94, 302–316. https://doi.org/10.1016/j.future.2018.11.041.
Stanney, K. M. and Kennedy, R. S. (1997). The psychometrics of cybersickness, Commun. ACM 40, 66–68. https://doi.org/10.1145/257874.257889.
Stanney, K. M. and Kennedy, R. S. (1998). Aftereffects from virtual environment exposure: how long do they last?, Proc. Hum. Factors Ergonom. Soc. Annu. Meet. 42, 1476–1480. https://doi.org/10.1177/154193129804202103.
Stanney, K. M., Hale, K. S., Nahmens, I. and Kennedy, R. S. (2003). What to expect from immersive virtual environment exposure: influences of gender, body mass index, and past experience, Hum. Factors 45, 504–520. https://doi.org/10.1518/hfes.45.3.504.27254.
Stoffregen, T., Chen, Y.-C. and Koslucher, F. C. (2014). Motion control, motion sickness, and the postural dynamics of mobile devices, Exp. Brain Res. 232, 1389–1397. https://doi.org/10.1007/s00221-014-3859-3.
Vickers, A. J. (2005). Parametric versus non-parametric statistics in the analysis of randomized trials with non-normally distributed data, BMC Med. Res. Methodol. 5, 35. https://doi.org/10.1186/1471-2288-5-35.
Weech, S., Moon, J. and Troje, N. F. (2018). Influence of bone-conducted vibration on simulator sickness in virtual reality, PLoS ONE 13, e0194137. https://doi.org/10.1371/journal.pone.0194137.
Yildirim, C. (2019). Cybersickness during VR gaming undermines game enjoyment: a mediation model, Displays 59, 35–43. https://doi.org/10.1016/j.displa.2019.07.002.
Yu, F. and Koltun, V. (2016). Multi-scale context aggregation by dilated convolutions, in: 4th Int. Conf. Learn. Represent. (ICLR).