Influence of Tactile Flow on Visual Heading Perception

In: Multisensory Research
  • 1 Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany
  • 2 Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
Open Access

Abstract

The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals and tactile flow, which contribute to determining traveled distance (path integration) or movement direction (heading). While the processing of combined visual–vestibular information is the subject of a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention. Here, we investigated whether visual heading perception is influenced by behaviorally irrelevant tactile flow. In the visual modality, we simulated an observer’s self-motion across a horizontal ground plane (optic flow). Tactile self-motion stimuli were delivered by air flow from head-mounted nozzles (tactile flow). In blocks of trials, we presented only visual or tactile stimuli and subjects had to report their perceived heading. In another block of trials, tactile and visual stimuli were presented simultaneously, with the tactile flow within ±40° of the visual heading (bimodal condition). Here, importantly, participants had to report their perceived visual heading. Perceived self-motion direction in all conditions revealed a centripetal bias, i.e., heading directions were perceived as compressed toward straight ahead. In the bimodal condition, we found a small but systematic influence of task-irrelevant tactile flow on visually perceived headings as a function of their directional offset. We conclude that tactile flow is more tightly linked to self-motion perception than previously thought.

1. Introduction

The integration of information from different sensory modalities is crucial for successful navigation through an environment, including the estimation of self-motion parameters like traveled distance (path integration) or direction of self-motion (heading). Self-motion generates distinct optic flow patterns on the retina (Lappe and Rauschecker, 1993, 1994; Warren and Hannon, 1988) which are linked to the direction of motion in a complex way (Bremmer et al., 2017; Lappe et al., 1999; Matthis et al., 2021). Vestibular (e.g., Rodriguez and Crane, 2020), tactile (Churan et al., 2017; Harris et al., 2017) and auditory (von Hopffgarten and Bremmer, 2011) signals have also been shown to contribute to the perception of self-motion.

The interplay of signals from different sensory modalities in heading perception is typically investigated by combining visual optic flow stimuli with cues from other sources. While combined visual–vestibular self-motion processing has been the subject of a large body of literature (e.g., Angelaki, 2014; Harris et al., 2000; Hummel et al., 2016; for a review see, e.g., Fetsch et al., 2012), the interaction of visual and tactile information in heading perception has received comparatively little attention. Seno and colleagues (2011) demonstrated that an air flow toward the face of observers facilitates the impression of self-motion (vection) when presented concurrently with optic flow simulating forward self-motion. Air flow combined with visually simulated backward self-motion did not facilitate vection, nor did air flow alone. In our own previous work, we showed that adding congruent tactile air flow (across participants’ foreheads) to visual optic flow significantly improved the precision of reproduced traveled distances (Churan et al., 2017), even though the stimuli were not integrated in a statistically optimal fashion in terms of a Bayesian framework. Tactile flow across the fingers can also induce a strong sensation of self-motion. Harris and colleagues (2017) presented their subjects with real oscillatory sideways self-motion (vestibular stimulation) and tactile flow across the participants’ fingers, which could be presented either congruently or incongruently by varying the phase and speed of the tactile stimulus relative to the visual stimulus. Remarkably, the results provided clear evidence that tactile flow dominates perceived self-motion. In these previous studies, the tactile stimulus, when presented congruently with visual or vestibular self-motion stimuli, served to increase the immersiveness of self-motion, i.e., vection. Here, by contrast, we aimed to determine whether visually perceived heading is affected by a behaviorally irrelevant tactile flow stimulus. Visually, we simulated self-motion across a ground plane in various directions. Tactile flow was delivered by nozzles positioned around the participant’s head. Subjects had to report the visual heading direction. A potential influence of the tactile flow on visual heading perception was examined by varying the angle between the visually simulated self-motion direction and the tactile flow.

2. Methods

2.1. Subjects

Ten subjects participated in this study (seven male; mean age = 26.3 years, range 23 to 31 years), all with normal or corrected-to-normal vision. All subjects provided written informed consent prior to the start of the experiment and remained naive to the purpose of the study during the experiment, but were offered disclosure thereafter. Testing sessions took place on two separate days, each lasting approx. 2.5 h. Subjects were compensated for their participation (8 euros per hour). The experiment was approved by the local ethics committee and conformed to the Declaration of Helsinki.

2.2. Apparatus

Visual stimuli were designed with MATLAB R2019a (MathWorks, Natick, MA, USA) and the Psychophysics Toolbox (Brainard, 1997; Kleiner et al., 2007) running on a Windows PC (XP 32 bit, Dell Technologies, Round Rock, TX, USA). Visual stimuli were back-projected onto a transparent screen by a video projector (Christie M 4.1 DS+6K-M SXGA, Christie Digital Systems, Inc., Cypress, CA, USA) at a frame rate of 120 Hz and a resolution of 1152 × 864 pixels. At 70 cm viewing distance, the screen covered the central 81° × 66° of the visual field. Subjects, with their head stabilized by a chin rest, gave their responses via mouse click. During stimulus presentation, they had to fixate a central target. We recorded eye position monocularly with an EyeLink 1000 (SR Research, Ottawa, ON, Canada). Trials in which a blink occurred were discarded, as were trials in which the eye position deviated beyond a ±3° × ±3° control window centered on the fixation target. These trials were aborted and repeated later during the course of the experiment.

2.3. Visual Stimulus

Self-motion was simulated as forward movement at a constant speed of 7 m/s across a virtual 2D ground plane consisting of max. 2000 white (luminance 100 cd/m²) random dots on a black (luminance < 0.1 cd/m²) background. The size and position of each dot were updated frame by frame according to its simulated position relative to the observer. Coherence of dot motion was 100%. The lifetime of each dot was limited to prevent subjects from basing their judgments on a single dot’s trajectory: after a maximum of 250 ms, each dot disappeared and re-appeared at a new random location.
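
For illustration, a minimal MATLAB sketch of such a ground-plane optic flow stimulus is given below. This is not the actual stimulus code; eye height, plane extent and the projection parameter are assumed placeholder values, and drawing is omitted.

    % Minimal ground-plane optic flow sketch (placeholder parameters).
    nDots   = 2000;            % maximum number of dots (as in the stimulus)
    speed   = 7;               % simulated self-motion speed (m/s)
    heading = deg2rad(8);      % example heading, 8 deg to the right
    dt      = 1/120;           % frame duration at 120 Hz
    eyeH    = 1.6;             % assumed eye height above the plane (m)
    f       = 0.7;             % assumed projection distance (m), cf. 70 cm viewing distance

    x = 40*(rand(nDots,1) - 0.5);              % lateral dot positions (m)
    z = 40*rand(nDots,1);                      % dot depths (m)
    for frame = 1:60                           % 500 ms at 120 Hz
        x = x - speed*dt*sin(heading);         % move the world opposite to the observer
        z = z - speed*dt*cos(heading);
        gone = z <= 0.1;                       % re-seed dots that passed the observer
        x(gone) = 40*(rand(nnz(gone),1) - 0.5);
        z(gone) = 40*rand(nnz(gone),1);
        xs = f*x./z;  ys = -f*eyeH./z;         % perspective projection onto the screen
        % drawing the dots, e.g., via the Psychophysics Toolbox, omitted here
    end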

Figure 1.

Experimental setup. Seven head-mounted nozzles provided air flow with a static pressure of 1 bar. The apparatus was controlled by a data acquisition box and mounted on top of soundproof earmuffs.

2.4. Tactile Stimulus

The tactile setup is shown in Fig. 1. Air flow from one of seven nozzles simulated the tactile component of self-motion. All nozzles were positioned 3 cm from the subject’s forehead, with 1.7 cm between neighboring nozzles, corresponding to an angular distance of 8°. Precise positioning of the nozzles was ensured by a solid plastic spacer enclosing the nozzle tips. Small grids were mounted in front of the nozzle tips to slightly expand the air flow, thereby generating a more natural feeling of a wind breeze as resulting, e.g., from self-motion. The nozzles were installed on a plastic panel fixed centrally on top of the headband of soundproof earmuffs (3M™ 1436 Ear Defender, 3M, Maplewood, MN, USA). The plastic panel also carried one supply hose and seven solenoid valves (AMV-MNS-24-01, BMT, Frankfurt, Germany). The weight of the apparatus (1350 g) on the subject’s head was cushioned by a foam pad placed under the headband of the earmuffs. Air flow with a speed of approx. 6.2 m/s was provided by a gas cylinder at a dynamic pressure of approx. 1 bar. The solenoid valves opening and closing the air supply were controlled by a data acquisition box (DAQ type: USB-1208FS, Measurement Computing Corporation, Norton, MA, USA) that was connected to the stimulus computer via a USB port and accessed via MATLAB. Calibration of the visuo-tactile stimulus system ensured that onsets and offsets of visual and tactile stimuli were synchronized to within approx. 7 ms. Subjects wore protective goggles to shield their eyes from the air flow. To prevent distraction by the noise of the air flow and to ensure that participants could not identify the tactile stimulus direction from the nozzles’ sounds, subjects wore in-ear plugs. Additionally, pink noise was delivered via over-ear headphones (RPHS46EK, Panasonic, Kadoma, Osaka, Japan), which subjects wore under the soundproof earmuffs. Pink noise (95 dB SPL) was delivered only during stimulus presentation, thereby masking the noise produced by the opening of the valves and the air stream (72 dB SPL).
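
A schematic sketch of such valve control, using the session interface of MATLAB’s Data Acquisition Toolbox, is given below. Device and channel identifiers are placeholders, and the actual control code may have differed.

    % Schematic solenoid-valve control via one digital output line per nozzle.
    s = daq.createSession('mcc');       % Measurement Computing vendor (USB-1208FS)
    addDigitalChannel(s, 'Board0', 'Port0/Line0:6', 'OutputOnly');  % 7 valves

    nozzle = 4;                         % e.g., the central nozzle (straight ahead)
    state  = zeros(1, 7);
    state(nozzle) = 1;                  % open exactly one valve

    outputSingleScan(s, state);         % tactile stimulus onset
    pause(0.5);                         % 500-ms air flow
    outputSingleScan(s, zeros(1, 7));   % stimulus offset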

2.5. Procedure

Testing on each of the two days lasted approx. 2.5 h. On day one, subjects completed either the visual-only or the tactile-only discrimination task to determine discrimination thresholds for unimodal self-motion stimuli. This task was followed by five (of ten) blocks of the bimodal heading perception task, with two blocks of trials each probing visual-only or tactile-only heading perception randomly interspersed. On day two, subjects first completed the discrimination task in the remaining sensory modality and then the remaining unimodal and bimodal heading perception tasks. In a final experiment on day two, subjects rated perceived vection (i.e., vividness of perceived self-motion, see below for details) in the unimodal and bimodal sensory conditions.

2.5.1. Heading Discrimination

The subject’s ability to discriminate forward headings was tested in both modalities, vision and somatosensation, separately. Subjects always compared a test heading direction (visual: 0°, ±2°, ±4°, ±6°, ±8°; tactile: 0°, ±8°, ±16°) to the standard heading of straight ahead (0°) with the standard heading always being presented first, followed by the test heading. Both stimuli were presented for 250 ms, with ten trials for each test direction and directions presented in pseudorandomized order. After presentation of the test stimulus, subjects had to indicate if it was perceived to the left or to the right with respect to the standard stimulus by pressing one of two arrow keys on the keyboard. To avoid possible interactions between the measurements in the visual and the tactile modality, the two measurements were performed on different days and the order was balanced across subjects.

2.5.2. Heading Perception

In the visual modality, self-motion was simulated in nine possible directions, ranging from −16° to +16° in four-degree steps. Here, 0° denotes straight-forward self-motion, and negative/positive values indicate directions forward and to the left/right. In the tactile modality, forward motion was simulated in seven possible directions ranging from −24° to +24° in eight-degree steps. In the bimodal task, participants were presented with nine possible visual headings to prevent them from anticipating a specific range of headings and adjusting their responses accordingly. Tactile headings were chosen to cover a wide range of offsets relative to the visual headings, with offsets to both sides of a given heading direction. The combination of nine visual and seven tactile headings resulted in sixty-three visuo-tactile stimulus conditions. Each trial simulated self-motion for 500 ms, and 16 trials were performed per condition, resulting in a total of (9 + 7 + 63) × 16 = 1264 trials per subject (for the sequence of trials, see below). After stimulus presentation, a movable green dot (7 × 7 pixels) was presented on the screen, and the subject’s task was to estimate perceived heading by placing this dot with the mouse pointer at the appropriate position on a continuous line displayed horizontally across the screen (‘horizon’). For each trial, the green dot was initially located at a random position on the horizontal line. Response time was not restricted. In the visual and the bimodal blocks of trials, the participant’s task was to indicate the ‘visually perceived’ heading; in the tactile-only blocks of trials, it was the ‘perceived’ heading direction. Participants fixated a red fixation point located centrally on the screen during all parts of the experiment. If fixation was successfully detected, the trial started automatically. Figure 2 shows the sequence of a unimodal visual trial.
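
The resulting trial structure can be summarized in a short MATLAB sketch (illustrative only; variable names are ours):

    % Condition list: 9 visual-only, 7 tactile-only and 63 bimodal headings,
    % 16 repetitions each, i.e., (9 + 7 + 63) x 16 = 1264 trials in total.
    visHead = -16:4:16;                           % visual headings (deg)
    tacHead = -24:8:24;                           % tactile headings (deg)
    [V, T]  = meshgrid(visHead, tacHead);         % 63 bimodal combinations

    % one row per condition: [visual heading, tactile heading]; NaN = absent
    conds  = [visHead(:), nan(9,1); ...           % visual-only
              nan(7,1),   tacHead(:); ...         % tactile-only
              V(:),       T(:)];                  % bimodal
    trials = repmat(conds, 16, 1);                % 16 repetitions per condition
    trials = trials(randperm(size(trials,1)), :); % pseudorandomized order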

Figure 2.

Sequence of a visual-only trial in the heading perception task. A static dot pattern was presented until stable fixation of the central fixation point was achieved. Then, dots started moving for 500 ms, thereby simulating a forward self-motion in one of nine directions. White arrows in the second panel indicate the moving direction of the dots (simulating straight-ahead heading). Then the dots vanished from the screen and participants indicated perceived heading by placing a dot (green) with the mouse pointer at the appropriate position.

2.5.3. Vection

To quantify the immersiveness of self-motion (vection) as induced by the visual, tactile, or bimodal stimulation, subjects evaluated perceived vection on a 1–10 Likert scale (Weech et al., 2018) where a value of 1 represents ‘no impression of self-motion’ and a value of 10 represents ‘very strong impression of self-motion’. In this experiment, each trial presented visual-only or tactile-only stimuli simulating straight ahead (0°) self-motion for 3 s. In the bimodal condition, visually simulated self-motion directed straight ahead was paired in a given trial with one of seven directions of the tactile flow (ranging from −24° to +24°, in steps of 8°). Trials of each condition were presented twice in randomized order (unimodal visual: 2, unimodal tactile: 2, bimodal: 2×7=14 trials). Eight of ten subjects completed this task.

2.6. Data Processing

For all analyses, a p value of 0.05 or smaller indicated statistical significance. For repeated measurements, we calculated analyses of variance (ANOVAs). Greenhouse–Geisser correction was applied to p values in case of a violated sphericity assumption (Mauchly test, p < 0.05). Effect sizes are reported as η².
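
For readers unfamiliar with this procedure, a generic MATLAB sketch (Statistics and Machine Learning Toolbox; variable names are placeholders, not our analysis script) is:

    % Repeated-measures ANOVA with Greenhouse-Geisser-corrected p values.
    % data: one row per subject, one column per within-subject condition.
    tbl = array2table(data, 'VariableNames', {'c1','c2','c3'});
    wd  = table((1:3)', 'VariableNames', {'Condition'});
    rm  = fitrm(tbl, 'c1-c3 ~ 1', 'WithinDesign', wd);
    mauchly(rm)                           % test of the sphericity assumption
    ra  = ranova(rm);                     % output table includes pValueGG
    eta2 = ra.SumSq(1) / sum(ra.SumSq);   % eta squared from effect and error rows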

2.6.1. Fitting of Psychometric Functions

Cumulative Gaussian functions were fitted to subjects’ responses in the heading discrimination task using the psignifit 3.0 toolbox (Fründ et al., 2011). From these fits we derived the point of subjective equality (PSE) as the heading for which subjects chose ‘rightward of straight ahead’ 50% of the time. Measures of the just noticeable difference (JND) were obtained from the standard deviations of the fitted Gaussian functions.
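
A generic maximum-likelihood fit of a cumulative Gaussian illustrates how PSE and JND follow from the fitted parameters. This is a minimal sketch with hypothetical response counts; the study itself used psignifit, which additionally handles lapses and confidence intervals.

    % 2AFC data: test headings x (deg), rightward choices k out of n trials.
    x = [-8 -6 -4 -2 0 2 4 6 8];           % visual test headings (deg)
    k = [0 1 1 3 5 8 9 10 10];             % hypothetical rightward counts
    n = 10*ones(size(x));                  % 10 trials per test direction

    % cumulative Gaussian, clamped away from 0/1 for the log-likelihood
    F = @(p) min(max(normcdf(x, p(1), p(2)), 1e-9), 1 - 1e-9);
    negLL = @(p) -sum(k.*log(F(p)) + (n - k).*log(1 - F(p)));
    p = fminsearch(negLL, [0 2]);          % p(1) = mu, p(2) = sigma

    PSE = p(1);                            % point of subjective equality (deg)
    JND = p(2);                            % just noticeable difference (deg)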

2.6.2. Influence of Tactile Flow on Visual Heading Perception

In a final step, we aimed to determine a potential modulatory influence of the task-irrelevant tactile flow on visual heading perception. To this end, the condition in which visual and tactile flow were directed in the same direction (congruent condition) served as reference. We did not expect any differences between the effects of tactile flow to the left or to the right of the visually simulated heading (incongruent conditions). Therefore, we determined the effect of the absolute angular separation α between visual and tactile flow. This approach is illustrated in Fig. 3. Here, in the experimental conditions A, B and C, the angular separation α between visual and tactile flow was always the same. Accordingly, results of visual heading perception in all three conditions were averaged (D). Importantly, this approach allowed us to interpret the modulatory effect of tactile flow as being directed toward or away from it.
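
In code, this collapsing step amounts to mirroring the leftward offsets before averaging. A minimal sketch follows (variable names are ours; err is assumed to hold the mean heading error, perceived minus presented visual heading, per visuo-tactile offset):

    % Sign-collapsing analysis relative to the congruent baseline.
    offsets = -40:4:40;                 % tactile minus visual heading (deg)
    base    = err(offsets == 0);        % congruent condition as reference
    absOff  = 4:4:40;
    tacMod  = zeros(size(absOff));
    for i = 1:numel(absOff)
        dR =  (err(offsets ==  absOff(i)) - base);  % tactile right of visual
        dL = -(err(offsets == -absOff(i)) - base);  % mirrored: left of visual
        tacMod(i) = mean([dR dL]);      % > 0: shift toward the tactile flow
    end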

Figure 3.

Effect of the absolute angular separation between visual and tactile flow in bimodal trials. A, B and C show examples of possible offsets between visual (blue arrow) and tactile (red arrow) flow in the bimodal condition. In A and B, visual heading is straight ahead (as if seen from above), while tactile flow is 16° (α) to the left or right of the visual heading. In C, tactile flow is also 16° to the right of visual heading, which itself, however, is 8° to the right. Importantly, the absolute angular difference between tactile and visual flow is always 16°. We did not expect any differences between the effects of tactile flow to the left or to the right of the visually simulated heading. Accordingly, in our data analysis, visual heading performance was averaged over identical angular differences α, regardless of the sign of α, i.e., whether tactile flow was to the left or right of the visual flow. This approach allowed us to quantify effects of tactile stimulation on visual heading perception as being directed away from or toward the tactile flow (D).

Figure 4.

Boxplots for experienced vection of self-motion in the visual-only, tactile-only and bimodal conditions. Perceived vection was rated on a Likert scale (1 = ‘no impression of self-motion’, 10 = ‘very strong impression of self-motion’) (ordinate). (A) The central black dots indicate the median; the bottom and top edges of the boxes mark the 25th and 75th percentiles, respectively. The most extreme data points are indicated by the vertical lines. In the bimodal condition, the offset between visual (always straight ahead) and tactile self-motion direction is indicated under the plots. (B) Experienced vection ratings of each subject, compared between the congruent (0/0) and incongruent (0/−24 and 0/24) conditions (left and middle panels) or between the two incongruent conditions (right panel). Values differed significantly between the congruent (0/0) and both incongruent conditions (0/−24 and 0/24), but not between the two incongruent conditions.

3. Results

3.1. Vection

To investigate how the information from the two modalities (and their spatial congruency) contributes to the vividness of self-motion perception (vection), subjects judged their vection on a Likert scale (1 = ‘no impression of self-motion’, 10 = ‘very strong impression of self-motion’). Figure 4 shows the mean ratings across eight subjects. A Friedman test for repeated measurements revealed a significant difference between conditions (p < 0.001, χ² = 46.459). Stimuli in the bimodal condition induced a stronger sensation of self-motion than in the unimodal conditions. Across bimodal heading stimuli, coherent presentation of the visual and the tactile stimulus (both straight ahead) produced the strongest impression of self-motion. In comparison, experienced vection was rated lowest for the conditions with the most peripheral tactile flow (0/−24, 0/24) [one-way ANOVA, F_GG(1.56) = 9.9, p < 0.01, η²_GG = 0.593]. Experienced vection did not differ significantly between tactile heading offsets of 24° to the left and 24° to the right of the visual heading [paired t-test, t(7) = 0.761, p = 0.47].

3.2. Unimodal Heading Discrimination

In a first step, we collected behavioral data from ten subjects performing a heading discrimination task. In the following, figures show data averaged across participants unless stated otherwise. Subjects compared a test stimulus (visual: 0°, ±2°, ±4°, ±6°, ±8°; tactile: 0°, ±8°, ±16° headings) to a standard stimulus (0° heading, i.e., straight ahead) in a two-alternative forced-choice (2AFC) task. The psychometric functions in Fig. 5 show the proportion of rightward answers (positive values indicate rightward, negative values leftward answers) obtained in the visual (blue curve) and tactile (red curve) conditions, respectively.

Figure 5.

Psychometric functions for straight-ahead (0°) self-motion in unisensory conditions. Blue and red symbols depict data from the visual-only and tactile-only condition, respectively. Solid lines show the cumulative Gaussian fits. Test stimuli (tactile-only: 0°, ±8°, ±16°; visual-only: 0°, ±2°, ±4°, ±6°, ±8°) were compared to straight ahead (0°).

Point of subjective equality values in the unimodal conditions were both close to zero and not significantly different (visual: M = 0.05°; tactile: M = 0.36°; paired-samples t-test, p = 0.37, T = 0.94). JNDs were 2.1° (visual) and 2.8° (tactile) and did not differ significantly either (paired-samples t-test, p = 0.15, T = 1.59). While the exact numerical values derived from the tactile domain must be considered with some care (see section 4. Discussion), overall, perception was similar for visual and tactile stimuli.

Figure 6.

Data and fits for perceived heading as a function of presented heading direction (°) for identical visual and tactile headings. Colored lines indicate linear regression fits for the visual-only (blue line) and tactile-only (red line) conditions for the headings shared by both modalities (0°, ±8°, ±16°). The oblique dashed line (identity) represents veridical perception. Error bars represent standard errors over participants.

3.3. Unimodal Heading Perception

In a second step, we probed unimodal heading perception. Here, subjects reported perceived heading direction after being presented with visual (optic flow) or tactile (air flow) stimuli in separate blocks of trials. Figure 6 shows perceived heading (ordinate) as a function of real heading (abscissa) for both modalities, together with linear regressions (y = mx + b, where y is perceived heading; x, real heading; m, slope; and b, intercept or accuracy) fitted to the data. Here, we only considered headings that could be presented experimentally in both sensory modalities (0°, ±8° and ±16°). Intercept values were both close to 0 (visual: b = 0.06; tactile: b = 0.37), and a paired-samples t-test indicated no statistically significant difference (p = 0.24, T = 1.25). Slope coefficients m differed significantly between conditions (paired-samples t-test, p < 0.01, T = 4.01), with a larger slope for the visual (m = 0.69) than for the tactile condition (m = 0.32), indicating a centripetal bias in both modalities that was stronger for tactile than for visual heading.
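
The centripetal bias can be quantified by the slope of a linear regression of perceived on presented heading, as in the following sketch (the vector perceived is a hypothetical set of per-subject mean responses):

    % Slope m < 1 indicates compression toward straight ahead.
    presented = [-16 -8 0 8 16];             % headings shared by both modalities
    coeff = polyfit(presented, perceived, 1);
    m = coeff(1);                            % slope (1 = veridical)
    b = coeff(2);                            % intercept (accuracy)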

3.4. Bimodal Heading Perception

In bimodal trials, subjects were simultaneously presented with a tactile (air flow) and a visual stimulus (optic flow), either simulating the same heading (congruent condition) or with an offset angle between them (ranging from 4° to 40° in steps of 4°; incongruent condition). Importantly, in this condition the tactile stimulus was behaviorally irrelevant: participants were asked to report the visually perceived heading (VPH) by placing a mouse pointer at the appropriate position on a continuous, horizontal line after stimulus presentation.

Here, we were interested in the modulatory influence of tactile flow on VPH. To this end, we first computed the heading error, i.e., the difference between the visually presented heading and the VPH. In Fig. 7, each panel shows, for a given visually simulated self-motion direction, the group-averaged modulatory influence of the tactile flow on this heading error.

Figure 7.

Tactile modulation of visual heading error. Each panel shows for a given heading (indicated above each panel) the heading error, i.e., the difference between visually presented heading and visually perceived heading (VPH) as a function of tactile heading direction. Red crosses indicate congruent visuo-tactile heading conditions. Error bars represent standard errors over participants.

To investigate the effect of the behaviorally irrelevant tactile stimulus on heading perception independent of the absolute visual heading, data from the incongruent conditions were normalized with respect to the congruent condition (see section 2. Methods for details). Figure 8 shows the modulatory influence as a function of the angular separation of visual and tactile flow. On the ordinate, negative values indicate a modulation of VPH away from the tactile flow while positive values indicate a modulation of VPH toward the tactile flow.

Figure 8.

Modulation of visually perceived heading by the tactile heading as a function of collapsed, absolute visuo-tactile offset angles. Data were point-reflected at 0° and combined for greater statistical power. Error bars represent standard errors over participants. One-sample t-tests: * = p < 0.05.

The modulatory effect differed across angular separations between visual and tactile flow. We found small influences for offset angles up to about 30°, peaking at roughly 12°, which were statistically significant for angular separations between 8° and 20° (one-sample t-tests, p < 0.05).

4. Discussion

In this study, we tested the role of behaviorally irrelevant tactile flow in visual heading perception. We found a small but significant modulatory influence. Importantly, this modulatory influence was tuned, reaching a peak at an angular separation of about 12° and not extending beyond a critical angular separation of about 30° between the two self-motion directions.

4.1. Unimodal Heading Perception

In our study, participants first completed a 2AFC task which served to examine how reliably participants could judge their self-motion direction based solely on visual or tactile flow. JNDs for visual and tactile stimuli were in the same range and not significantly different. It must be noted, however, that due to our experimental setup, the computed JND for the tactile domain should be considered with care. Even for the stimulus directions closest to straight ahead (±8°), discrimination performance saturated, i.e., response rates were either 0% (rightward choices for headings left of straight ahead) or 100% (rightward choices for headings right of straight ahead). Given the spatial resolution of the probed headings (stimulus spacing of 8°), the resulting tactile JND should be considered an upper bound.

In general, perceived headings were compressed toward straight ahead in both sensory modalities (centripetal bias). Undershoots of perceived visual heading have been reported before (e.g., Bremmer et al., 2017; Lich and Bremmer, 2014), but overshoots have also been documented (Crane, 2012; Cuturi and MacNeilage, 2013). This raises the question of the cause of such seemingly contradictory results. We assume the exact experimental setup and conditions to be crucial. In Bremmer et al. (2017), stimuli were presented on a large tangent screen, simulating self-motion across a ground plane, and lasted for only 40 ms. Subjects had to indicate their perceived heading on a ruler stimulus with random numbers presented after stimulus presentation. In the study by Lich and Bremmer (2014), the stimuli were different (a 3D cloud of random dots vs a 2D ground plane) and presented via different means (head-mounted display), but the response task was identical (ruler stimulus). In contrast, in the studies by Crane (2012) and Cuturi and MacNeilage (2013), which both used 3D random-dot cloud stimuli, participants had to indicate their perceived heading with a response dial. Accordingly, in both studies, participants indicated their perceived heading from a bird’s-eye perspective (allocentric frame of reference), while they perceived their self-motion in an egocentric frame of reference. In our current study, as in the two previous studies (Bremmer et al., 2017; Lich and Bremmer, 2014), participants perceived and responded in an egocentric frame of reference. We suggest that the change in reference frame might cause the switch from an under- to an overshoot of perceived heading around straight ahead. Alternatively, the different biases might be related to a general center-screen bias as reported early on by Warren and Kurtz (1992). Hence, further studies are required to resolve this issue.

As in the visual domain, we found an undershoot of perceived heading (centripetal bias) in the tactile domain, where it was even more pronounced. To the best of our knowledge, no comparable data have been reported before. One possible explanation for the centripetal bias might be the response format, as discussed above. When subjects use a (mouse) pointer to indicate heading, perceived heading is limited by the size of the screen (D’Avossa and Kersten, 1996; Li et al., 2002). Since we used a large presentation screen covering the central 81° × 66° of the subject’s visual field, it appears unlikely that this factor caused the observed effects. A potentially more important factor is the range of presented heading directions. From headings presented in previous trials, subjects might have inferred that potential headings were restricted to a rather narrow range, adapting their responses accordingly (Crane, 2012; De Winkel et al., 2015). This, however, does not explain the differences in central bias between visual and tactile stimuli. Accordingly, more experiments are needed to answer this open question.

4.2. Visual Heading Perception in the Unimodal and Bimodal Condition

Participants had to report their perceived visual heading in the purely visual and in the bimodal condition. Accordingly, the tactile flow stimulus was behaviorally irrelevant in the bimodal condition. Nevertheless, we found a small but significant effect on visual heading perception. This finding is somewhat similar to results by Butz and colleagues (2010). These authors presented a group of dots arranged in the form of a hexagon. After 160 ms, the hexagon was rotated, either by half the angle of the separation of the dots or by the full angle. In the latter case, the resulting motion is ambiguous (Lakatos and Shepard, 1997). Six such frames were presented, resulting in coherent clockwise (CW) or counterclockwise (CCW) rotation, or an ambiguous rotation. Rotations of a tactile stimulus, presented to the palm or the back of the participants’ hand, shifted the perceived direction of visual rotation (CW vs CCW). The shift direction also depended on whether the behaviorally irrelevant tactile stimulus was presented to the palm or the back of the hand. While our experiment was quite different from that of Butz and colleagues, both studies show that behaviorally irrelevant tactile stimuli can modulate the perception of visual (self-)motion.

The importance of tactile stimulation for self-motion perception is further underlined by studies with behaviorally relevant tactile stimuli. As shown by Harris and colleagues (2017), in such a case, tactile stimulation can even override a visual percept of self-motion. In their study, the authors presented subjects with real oscillatory sideways self-motion (vestibular stimulation) and tactile flow across the participants’ fingers. This tactile stimulation could be presented either congruently or incongruently by varying the phase and speed of the tactile stimulus relative to the visual stimulus. Remarkably, the results provided clear evidence that tactile flow dominates perceived self-motion. Together with the findings discussed above, this strongly suggests that tactile flow is more tightly linked to self-motion perception than previously thought.

4.3. Neural Mechanisms of Tactile Influences on Visually Perceived Heading

It is well known that visual heading perception is most accurate for self-motion around straight ahead (Cuturi and MacNeilage, 2013; Sun et al., 2020). Neurophysiological recordings in the animal model of human multisensory perception, i.e., the awake behaving macaque monkey, have shown that response properties of neurons in the medial superior temporal area (area MST) and the ventral intraparietal area (area VIP) can account for this behavioral effect (e.g., Bremmer et al., 2017; Gu et al., 2010). Preferred directions of self-motion are non-uniformly distributed in both areas, with more neurons preferring sideways motion. Accordingly, straight-ahead self-motion falls on the steepest part of these neurons’ tuning curves. Hence, small changes in heading direction cause large changes in firing rate, supporting robust and accurate heading perception. Our findings are in line with these previous results.

It is also known from neurophysiological studies in non-human primates that neurons in area VIP respond not only to visually simulated self-motion (Bremmer et al., 2002a; Chen et al., 2011), but also to tactile flow as typically resulting from self-motion (Bremmer et al., 2002b; Guipponi et al., 2015). Importantly, a functional equivalent of macaque area VIP has been identified in the human posterior parietal cortex, i.e., hVIP, suggesting similar processing of multisensory self-motion information in humans and monkeys (Bremmer et al., 2001; Field et al., 2020). Neurons in macaque area VIP have been shown to respond in an action-congruent manner, i.e., neurons preferring visually simulated forward self-motion typically also prefer tactile flow that would result from forward self-motion. Furthermore, visual and tactile receptive fields tend to overlap spatially (Avillac et al., 2004, 2005; Bremmer et al., 2002b; Duhamel et al., 1997, 1998; Sereno and Huang, 2006). A small subset of bimodal VIP neurons has been shown to react to incongruent visual and tactile stimulation as well (Avillac et al., 2007). It has been suggested that such differential encoding could be used to dissociate self- from object motion (Bremmer et al., 2002b). These previous findings might point toward area hVIP being involved in the observed perceptual effects.

Other studies have tested the spatial profile of surround suppression in demanding visual tasks (e.g., Hopf et al., 2006). These authors found that visual input was suppressed in an area surrounding the attended target and recovered at more distant locations. While the exact shape and size of such a suppressive, torus-like region is not known for heading stimuli, it might match the region for which we observed the strongest effect of the tactile distractor stimuli, which would allow the tactile distractor to exert its strongest effect on visual heading perception.

4.4. Limitation of Our Study

A limitation of our approach was the way tactile flow was provided. When asked to judge the perceived vividness of the delivered heading stimuli, participants rated the immersiveness of self-motion delivered by tactile flow alone as lowest across all conditions. Given the nozzle diameter of 3 mm, the tactile flow might have covered too small a part of the forehead to induce a ‘natural’ impression of self-motion. However, vividness judgments of bimodally presented heading stimuli simulating congruent headings (straight ahead) revealed higher scores (stronger impression of vividness) than stimuli simulating incongruent headings. Accordingly, tactile flow contributed significantly to the percept of vection. Further studies could employ tactile flow with a wider range of applicable airstreams in order to provide a more ‘naturalistic’ impression of self-motion.

* Corresponding author; e-mail: rosenblu@staff.uni-marburg.de

Acknowledgements

We thank Alexander Platzner for constructing the experimental apparatus and Oliver Beckert for helping with collection of the data. This work was supported by Deutsche Forschungsgemeinschaft (CRC/TRR 135/A2, project number 222641018 and IRTG-1901-The Brain in Action) and the HMWK cluster project The Adaptive Mind.

References

  • Angelaki, D. E. (2014). How optic flow and inertial cues improve motion perception, Cold Spring Harb. Symp. Quant. Biol. 79, 141–148. DOI:10.1101/sqb.2014.79.024638.
  • Avillac, M., Olivier, E., Denève, S., Ben Hamed, S. and Duhamel, J.-R. (2004). Multisensory integration in multiple reference frames in the posterior parietal cortex, Cogn. Process. 5, 159–166. DOI:10.1007/s10339-004-0021-3.
  • Avillac, M., Ben Hamed, S. and Duhamel, J.-R. (2007). Multisensory integration in the ventral intraparietal area of the macaque monkey, J. Neurosci. 27, 1922–1932. DOI:10.1523/JNEUROSCI.2646-06.2007.
  • Brainard, D. H. (1997). The psychophysics toolbox, Spat. Vis. 10, 433–436. DOI:10.1163/156856897x00357.
  • Bremmer, F., Schlack, A., Shah, N. J., Zafiris, O., Kubischik, M., Hoffmann, K.-P., Zilles, K. and Fink, G. R. (2001). Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys, Neuron 29, 287–296. DOI:10.1016/S0896-6273(01)00198-2.
  • Bremmer, F., Duhamel, J.-R., Ben Hamed, S. and Graf, W. (2002a). Heading encoding in the macaque ventral intraparietal area (VIP), Eur. J. Neurosci. 16, 1554–1568. DOI:10.1046/j.1460-9568.2002.02207.x.
  • Bremmer, F., Klam, F., Duhamel, J.-R., Ben Hamed, S. and Graf, W. (2002b). Visual–vestibular interactive responses in the macaque ventral intraparietal area (VIP), Eur. J. Neurosci. 16, 1569–1586. DOI:10.1046/j.1460-9568.2002.02206.x.
  • Bremmer, F., Churan, J. and Lappe, M. (2017). Heading representations in primates are compressed by saccades, Nat. Commun. 8, 920. DOI:10.1038/s41467-017-01021-5.
  • Butz, M. V., Thomaschke, R., Linhardt, M. J. and Herbort, O. (2010). Remapping motion across modalities: tactile rotations influence visual motion judgments, Exp. Brain Res. 207, 1–11. DOI:10.1007/s00221-010-2420-2.
  • Chen, A., DeAngelis, G. C. and Angelaki, D. E. (2011). Representation of vestibular and visual cues to self-motion in ventral intraparietal cortex, J. Neurosci. 31, 12036–12052. DOI:10.1523/JNEUROSCI.0395-11.2011.
  • Churan, J., Paul, J., Klingenhoefer, S. and Bremmer, F. (2017). Integration of visual and tactile information in reproduction of traveled distance, J. Neurophysiol. 118, 1650–1663. DOI:10.1152/jn.00342.2017.
  • Crane, B. T. (2012). Direction specific biases in human visual and vestibular heading perception, PLoS ONE 7, e51383. DOI:10.1371/journal.pone.0051383.
  • Cuturi, L. F. and MacNeilage, P. R. (2013). Systematic biases in human heading estimation, PLoS ONE 8, e56862. DOI:10.1371/journal.pone.0056862.
  • D’Avossa, G. and Kersten, D. (1996). Evidence in human subjects for independent coding of azimuth and elevation for direction of heading from optic flow, Vision Res. 36, 2915–2924. DOI:10.1016/0042-6989(96)00010-7.
  • de Winkel, K. N., Katliar, M. and Bülthoff, H. H. (2015). Forced fusion in multisensory heading estimation, PLoS ONE 10, e0127104. DOI:10.1371/journal.pone.0127104.
  • Duhamel, J.-R., Bremmer, F., Ben Hamed, S. and Graf, W. (1997). Spatial invariance of visual receptive fields in parietal cortex neurons, Nature 389, 845–848. DOI:10.1038/39865.
  • Duhamel, J.-R., Colby, C. L. and Goldberg, M. E. (1998). Ventral intraparietal area of the macaque: congruent visual and somatic response properties, J. Neurophysiol. 79, 126–136. DOI:10.1152/jn.1998.79.1.126.
  • Fetsch, C. R., Pouget, A., DeAngelis, G. C. and Angelaki, D. E. (2012). Neural correlates of reliability-based cue weighting during multisensory integration, Nat. Neurosci. 15, 146–154. DOI:10.1038/nn.2983.
  • Field, D. T., Biagi, N. and Inman, L. A. (2020). The role of the ventral intraparietal area (VIP/pVIP) in the perception of object-motion and self-motion, NeuroImage 213, 116679. DOI:10.1016/j.neuroimage.2020.116679.
  • Fründ, I., Haenel, N. V. and Wichmann, F. A. (2011). Inference for psychometric functions in the presence of nonstationary behavior, J. Vis. 11(6), 16. DOI:10.1167/11.6.16.
  • Gu, Y., Fetsch, C. R., Adeyemo, B., DeAngelis, G. C. and Angelaki, D. E. (2010). Decoding of MSTd population activity accounts for variations in the precision of heading perception, Neuron 66, 596–609. DOI:10.1016/j.neuron.2010.04.026.
  • Guipponi, O., Cléry, J., Odouard, S., Wardak, C. and Ben Hamed, S. (2015). Whole brain mapping of visual and tactile convergence in the macaque monkey, NeuroImage 117, 93–102. DOI:10.1016/j.neuroimage.2015.05.022.
  • Harris, L. R., Jenkin, M. and Zikovitz, D. C. (2000). Visual and non-visual cues in the perception of linear self motion, Exp. Brain Res. 135, 12–21. DOI:10.1007/s002210000504.
  • Harris, L. R., Sakurai, K. and Beaudot, W. H. A. (2017). Tactile flow overrides other cues to self motion, Sci. Rep. 7, 1059. DOI:10.1038/s41598-017-01111-w.
  • Hopf, J.-M., Boehler, C. N., Luck, S. J., Tsotsos, J. K., Heinze, H.-J. and Schoenfeld, M. A. (2006). Direct neurophysiological evidence for spatial suppression surrounding the focus of attention in vision, Proc. Natl Acad. Sci. USA 103, 1053–1058. DOI:10.1073/pnas.0507746103.
  • Hummel, N., Cuturi, L. F., MacNeilage, P. R. and Flanagin, V. L. (2016). The effect of supine body position on human heading perception, J. Vis. 16, 19. DOI:10.1167/16.3.19.
  • Kleiner, M., Brainard, D. H. and Pelli, D. G. (2007). What is new in Psychtoolbox-3, Perception 36(Suppl.), 14.
  • Lakatos, S. and Shepard, R. N. (1997). Constraints common to apparent motion in visual, tactile, and auditory space, J. Exp. Psychol. Hum. Percept. Perform. 23, 1050–1060. DOI:10.1037/0096-1523.23.4.1050.
  • Lappe, M. and Rauschecker, J. P. (1993). A neural network for the processing of optic flow from ego-motion in man and higher mammals, Neural Comput. 5, 374–391. DOI:10.1162/neco.1993.5.3.374.
  • Lappe, M. and Rauschecker, J. P. (1994). Heading detection from optic flow, Nature 369, 712–713. DOI:10.1038/369712a0.
  • Lappe, M., Bremmer, F. and van den Berg, A. V. (1999). Perception of self-motion from visual flow, Trends Cogn. Sci. 3, 329–336. DOI:10.1016/S1364-6613(99)01364-9.
  • Li, L., Peli, E. and Warren, W. H. (2002). Heading perception in patients with advanced retinitis pigmentosa, Optom. Vis. Sci. 79, 581–589. DOI:10.1097/00006324-200209000-00009.
  • Lich, M. and Bremmer, F. (2014). Self-motion perception in the elderly, Front. Hum. Neurosci. 8, 681. DOI:10.3389/fnhum.2014.00681.
  • Matthis, J. S., Muller, K. S., Bonnen, K. and Hayhoe, M. M. (2021). Retinal optic flow during natural locomotion, bioRxiv 2020.07.23.217893. DOI:10.1101/2020.07.23.217893.
  • Rodriguez, R. and Crane, B. T. (2020). Common causation and offset effects in human visual-inertial heading direction integration, J. Neurophysiol. 123, 1369–1379. DOI:10.1152/jn.00019.2020.
  • Seno, T., Ogawa, M., Ito, H. and Sunaga, S. (2011). Consistent air flow to the face facilitates vection, Perception 40, 1237–1240. DOI:10.1068/p7055.
  • Sereno, M. I. and Huang, R.-S. (2006). A human parietal face area contains aligned head-centered visual and tactile maps, Nat. Neurosci. 9, 1337–1343. DOI:10.1038/nn1777.
  • Stein, B. E. and Stanford, T. R. (2008). Multisensory integration: current issues from the perspective of the single neuron, Nat. Rev. Neurosci. 9, 255–266. DOI:10.1038/nrn2331.
  • Sun, Q., Zhang, H., Alais, D. and Li, L. (2020). Serial dependence and center bias in heading perception from optic flow, J. Vis. 20, 1. DOI:10.1167/jov.20.10.1.
  • Telford, L., Howard, I. P. and Ohmi, M. (1995). Heading judgments during active and passive self-motion, Exp. Brain Res. 104, 502–510. DOI:10.1007/BF00231984.
  • von Hopffgarten, A. and Bremmer, F. (2011). Self-motion reproduction can be affected by associated auditory cues, See. Perceiv. 24, 203–222. DOI:10.1163/187847511X571005.
  • Warren, W. H. and Hannon, D. J. (1988). Direction of self-motion is perceived from optical flow, Nature 336, 162–163. DOI:10.1038/336162a0.
  • Warren, W. H. and Kurtz, K. J. (1992). The role of central and peripheral vision in perceiving the direction of self-motion, Percept. Psychophys. 51, 443–454. DOI:10.3758/BF03211640.
  • Weech, S., Varghese, J. P. and Barnett-Cowan, M. (2018). Estimating the sensorimotor components of cybersickness, J. Neurophysiol. 120, 2201–2217. DOI:10.1152/jn.00477.2018.
