Search Results

Showing 1–10 of 11 items for:

  • Author or Editor: Massimiliano Zampini
  • Search level: All

Audition and touch interact with one another and share a number of similarities; however, little is known about their interplay in the perception of temporal duration. The present study investigated whether the temporal duration of an irrelevant auditory or tactile stimulus could modulate the perceived duration of a target stimulus presented in the other modality (i.e., tactile or auditory), using both a between-participants (Experiment 1) and a within-participants (Experiment 2) design. In a two-alternative forced-choice task, participants decided which of two events in a target modality was longer. Simultaneously presented distractor stimuli had a duration that was either congruent or incongruent with the target’s. Results showed that both the auditory and tactile modalities affected duration judgments in the incongruent condition, decreasing performance in both experiments. Moreover, in Experiment 1, the tactile modality enhanced the perception of auditory stimuli in the congruent condition, but audition did not facilitate performance in the congruent condition in the tactile modality; this tactile enhancement of audition was not found in Experiment 2. To the best of our knowledge, this is the first study documenting audiotactile interactions in the perception of duration, and it suggests that audition and touch might modulate one another in a more balanced manner than audiovisual pairings do. The findings support previous evidence of shared links and reciprocal influences when audition and touch interact with one another.

In: Multisensory Research

Our recent findings have shown that sounds improve visual detection in low vision individuals when the audiovisual pairs are presented simultaneously. The present study aimed to investigate possible temporal aspects of the audiovisual enhancement effect that we previously reported. Low vision participants were asked to detect the presence of a visual stimulus (yes/no task) either presented in isolation or together with an auditory stimulus at different stimulus onset asynchronies (SOAs). In the first experiment, in which the sound always led the visual stimulus, there was a significant visual detection enhancement even when the visual stimulus was temporally delayed by 400 ms. However, the visual detection improvement was reduced in the second experiment, in which the sound could randomly lead or lag the visual stimulus: a significant enhancement was found only when the audiovisual stimuli were synchronized. Taken together, the results of the present study suggest that high-level associations between modalities might modulate audiovisual interactions in low vision individuals.

In: Seeing and Perceiving

A recent study (Tsakiris et al., 2011) suggested that lower interoceptive sensitivity, as assessed by heart-rate estimation, predicts malleability of body representations, as measured by proprioceptive drift and ownership in a rubber hand illusion (RHI) task. The authors suggested that one explanation of their finding is linked to the notion of limited attentional resources: individuals with high interoceptive sensitivity are more aware of internal states and, in turn, have fewer attentional resources available for multisensory processing. If this is the case, the competition between interoceptive and multisensory processing should be strongest when they are concurrent. Here we tested this prediction using a visuo-proprioceptive conflict produced through prismatic goggles, without affecting body ownership (unlike the RHI). In three experiments, participants looked at their own hand while wearing neutral or prismatic goggles (visual field shifted 20° leftwards). Meanwhile, they performed a concurrent counting task on interoceptive (Exp. 1–2: heart-beats; Exp. 3: breaths) or exteroceptive signals (pure tones). A no-task condition was also included. We measured proprioceptive drift in each condition as an indicator of illusion strength. All experiments documented a significant drift of perceived hand position after prism exposure. This bodily illusion, however, was never affected by the concurrent task, regardless of whether it involved interoceptive or exteroceptive signals. These results reveal that the multisensory integration underlying body perception is unaffected by concurrent tasks that capture attentional resources, strongly suggesting a low-level and automatic phenomenon. Furthermore, they indicate that the increased body malleability in individuals with low interoceptive awareness does not originate in competition for attentional resources.

In: Seeing and Perceiving

In two behavioral experiments, we explored effects of long-term musical training on the implicit processing of temporal structures (rhythm, non-rhythm and meter), manipulating deviance detection under different conditions. We used a task that did not require explicit processing of the temporal aspect of the stimuli, as this was irrelevant for the task. In Experiment 1, we investigated whether long-term musical training results in superior processing of auditory rhythm, and thus boosts the detection of auditory deviants inserted within rhythmic compared to non-rhythmic auditory series. In Experiment 2, we focused on the influence of the metrical positions of a rhythmic series, and we compared musicians’ and non-musicians’ responses to deviant sounds inserted on strong versus weak metrical positions. We hypothesized that musicians would show enhanced rhythmic processing compared to non-musicians. Furthermore, we hypothesized that musicians’ expectancy level would differ more across metrical positions than non-musicians’. In both experiments, musicians were faster and more sensitive than non-musicians. Although both groups were overall faster and showed higher sensitivity for the detection of deviants in rhythmic compared to non-rhythmic series (Experiment 1), only musicians were faster in the detection of deviants on strong positions compared to weak ones (Experiment 2). While rhythm modulates deviance processing even in non-musicians, specific effects of long-term musical training arise when a refined comparison of hierarchical metrical positions is considered. This suggests that long-term musical training enhances sensitivity to the metrical structure and improves temporal prediction mechanisms, even during implicit processing of meter.

In: Timing & Time Perception

The label ‘crossmodal correspondences’ has been used to define the nonarbitrary associations that appear to exist between different basic physical stimulus attributes in different sensory modalities. For instance, it has been consistently shown in the neurotypical population that higher pitched sounds are more frequently matched with visual patterns that are brighter, smaller, and sharper than those associated with lower pitched sounds. Some evidence suggests that patients with ASDs tend not to show this crossmodal preferential association pattern (e.g., curvilinear shapes and labial/lingual consonants vs. rectilinear shapes and plosive consonants). In the present study, we compared the performance of children with ASDs (6–15 years) and matched neurotypical controls in a non-verbal crossmodal correspondence task. The participants were asked to indicate which of two bouncing visual patterns was making a centrally located sound. In intermixed trials, the visual patterns varied in either size, surface brightness, or shape, whereas the sound varied in pitch. The results showed that, whereas the neurotypical controls reliably matched the higher pitched sound to a smaller and brighter visual pattern, the performance of participants with ASDs was at chance level. In the condition where the visual patterns differed in shape, no inter-group difference was observed. The children’s matching performance cannot be attributed to intensity matching or to difficulties in understanding the instructions, as both were controlled for. These data suggest that the tendency to associate congruent visual and auditory features varies as a function of the presence of ASDs, possibly pointing to poorer capabilities to integrate auditory and visual inputs in this population.

In: Seeing and Perceiving

Despite the large number of studies on the multisensory aspects of tactile perception, very little is known regarding the effects of the visual and auditory sensory modalities on the tactile hedonic evaluation of textures, especially when the presentation of the stimuli is mediated by a haptic device. In this study, different haptic virtual surfaces were rendered by varying the static and dynamic frictional coefficients of a Geomagic® Touch device. In Experiment 1, the haptic surfaces were paired with pictures representing everyday materials (glass, plastic, rubber and steel); in Experiment 2, the haptic surfaces were paired with sounds resulting from the haptic exploration of paper or sandpaper. In both experiments, participants were required to rate the pleasantness and the roughness of the virtual surfaces explored. Exploration times were also recorded. Both pleasantness and roughness judgments, as well as the durations of exploration, varied as a function of the combinations of the visuo-tactile and the audio-tactile stimuli presented. Taken together, these results suggest that vision and audition modulate haptic perception and hedonic preferences when tactile sensations are provided through a haptic device. Importantly, these results offer interesting suggestions for designing more pleasant, and even more realistic, multisensory virtual surfaces.

In: Multisensory Research

Over the last decade, scientists working on the topic of multisensory integration, as well as designers and marketers involved in trying to understand consumer behavior, have become increasingly interested in the non-arbitrary associations (e.g., sound symbolism) between different sensory attributes of the stimuli they work with. Nevertheless, to date, little research in this area has investigated the presence of these crossmodal correspondences in the tactile evaluation of everyday materials. Here, we explore the presence and nature of the associations between tactile sensations, the sound of non-words, and people’s emotional states. Samples of cotton, satin, tinfoil, sandpaper, and abrasive sponge were stroked along the participants’ forearm at a speed of 5 cm/s. Participants evaluated the materials along several dimensions, comprising scales anchored by pairs of non-words (e.g., Kiki/Bouba) and adjectives (e.g., ugly/beautiful). The results revealed that smoother textures were associated with non-words made up of round-shaped sounds (e.g., Maluma), whereas rougher textures were more strongly associated with sharp-transient sounds (e.g., Takete). The results also revealed the presence of a number of correspondences between tactile surfaces and adjectives related to visual and auditory attributes. For example, smooth textures were associated with features evoked by words such as ‘bright’ and ‘quiet’; by contrast, the rougher textures were associated with adjectives such as ‘dim’ and ‘loud’. The textures were also found to be associated with a number of emotional labels. Taken together, these results further our understanding of crossmodal correspondences involving the tactile modality and provide interesting insights into the applied fields of design and marketing.

In: Multisensory Research

The present study aims to assess the mechanisms involved in the processing of potentially threatening stimuli presented within the peri-head space of humans. Magnetic fields evoked by air-puffs presented at the peri-oral area of fifteen participants were recorded using magnetoencephalography (MEG). Crucially, each air puff was preceded by a sound, which could be perceived either as looming, as stationary and close to the body (i.e., within the peri-head space), or as stationary and far from the body (i.e., in extrapersonal space). The comparison of the time courses of the global field power (GFP) indicated a significant difference between the conditions in the time window ranging from 70 to 170 ms. When the air puff was preceded by a stationary sound located far from the head, stronger somatosensory activity was evoked than in the conditions where the sounds were located close to the head. No difference was found between the looming sound and the stationary prime sound close to the head. Source localization was performed assuming a pair of symmetric dipoles in a spherical head model that was fitted to the MRI images of the individual participants. Results showed sources in primary and secondary somatosensory cortex. Source activities in secondary somatosensory cortex differed between the three conditions, with looming sounds evoking the largest effects, far stationary sounds the smallest, and close stationary sounds intermediate effects. Overall, these findings suggest the existence of a system in humans involved in detecting approaching objects and protecting the body from collisions.

In: Seeing and Perceiving

Abstract

Preliminary evidence showed reduced temporal sensitivity (i.e., a larger temporal binding window) to audiovisual asynchrony in obesity. Our aim was to extend this investigation to visuotactile stimuli, comparing individuals of healthy weight and individuals with obesity in a simultaneity judgment task. We found that individuals with obesity had a larger temporal binding window than healthy-weight individuals, meaning that they tend to integrate visuotactile stimuli over an extended range of stimulus onset asynchronies. We point out that our finding provides evidence for a more pervasive impairment of the temporal discrimination of co-occurring stimuli, which might affect multisensory integration in obesity. We discuss our results with reference to the possible role of atypical oscillatory neural activity and structural anomalies in affecting the perception of simultaneity between multisensory stimuli in obesity. Finally, we highlight the urgency of a deeper understanding of multisensory integration in obesity for at least two reasons. First, multisensory bodily illusions might be used to manipulate body dissatisfaction in obesity. Second, multisensory integration anomalies in obesity might lead to an altered perception of food, encouraging overeating behaviours.

In: Multisensory Research