Search Results

Showing 1–8 of 8 items for Author or Editor: Jamie Ward
There is a structure to the associations found in people with synaesthesia: for instance, high pitch is small, bright, and high in space. The same correspondences exist in normal multi-sensory associations and manifest themselves implicitly as behavioural facilitation/interference on experimental tasks, and overtly when people are asked to ‘freely’ associate using imagery or metaphor. This has been termed weak synaesthesia, the implication being that it lies on a continuum with synaesthesia itself. In this presentation, I summarise evidence comparing multi-sensory associations in synaesthetes versus others. In addition to highlighting similarities, I also consider ways in which they differ. For instance, I propose that whilst the alignment of these associations is normally relative (e.g., 500 Hz could be bright when compared against 200 Hz but dark when compared against 1000 Hz), it is absolute in synaesthesia itself (i.e., 500 Hz has a non-relative level of brightness). Thus, there is similarity but not continuity between synaesthetes and others.
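The relative/absolute distinction drawn here can be made concrete with a minimal sketch. The following Python snippet uses arbitrary toy values (assumptions for exposition, not figures from any study summarised on this page) to contrast a comparative mapping, where a tone's brightness depends on the tone it is compared against, with a fixed mapping in which each frequency carries its own brightness.

    # Illustrative only: toy numbers, not data from the work described above.

    def relative_brightness(freq_hz, comparison_hz):
        """Non-synaesthetic pattern (sketch): a tone is 'bright' or 'dark'
        only relative to the tone it is compared against."""
        return 'bright' if freq_hz > comparison_hz else 'dark'

    def absolute_brightness(freq_hz):
        """Synaesthetic pattern (sketch): each frequency maps to a fixed
        brightness level, regardless of any comparison tone."""
        return min(1.0, freq_hz / 2000.0)  # 0 = dark, 1 = bright; scale is arbitrary

    # 500 Hz is 'bright' next to 200 Hz but 'dark' next to 1000 Hz...
    print(relative_brightness(500, 200), relative_brightness(500, 1000))
    # ...whereas the absolute mapping always returns the same value for 500 Hz.
    print(absolute_brightness(500))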

In: Seeing and Perceiving

People with synaesthesia show enhanced memory relative to demographically matched controls. The most obvious explanation is that the ‘extra’ perceptual experiences lead to richer opportunities for encoding and retrieving material that induces synaesthesia (typically verbal material). Although there is some evidence for this, it is unlikely to be the whole explanation. For instance, not all material that triggers synaesthesia is better remembered (e.g., digit span), and some material that does not trigger synaesthesia is better remembered. In fact, synaesthetes tend to have better visual memory than verbal memory. We suggest that enhanced memory in synaesthesia is linked to wider changes in cognitive systems at the interface of perception and memory, and we link this to recent findings in the neuroscience of memory.

In: Seeing and Perceiving

Visual sensory substitution devices (SSDs) allow visually deprived individuals to navigate and recognise the ‘visual world’; SSDs also provide opportunities for psychologists to study modality-independent theories of perception. At present, most research has focused on encoding greyscale vision. However, at the low spatial resolutions received by SSD users, colour information enhances object–ground segmentation and provides more stable cues for scene and object recognition. Many attempts have been made to encode colour information in tactile or auditory modalities, but these studies have largely existed in isolation. This review brings together a wide variety of tactile and auditory approaches to representing colour. We examine how each device constructs ‘colour’ relative to veridical human colour perception and report previous experiments using these devices. Theoretical approaches to encoding and transferring colour information through sound or touch are discussed for future devices, covering alternative stimulation approaches, perceptually distinct dimensions and intuitive cross-modal correspondences.

In: Multisensory Research

In this study, we present three experiments investigating the influence of visual movement on auditory judgements. In Experiments 1 and 2, two bursts of noise were presented and participants were required to judge which was louder in a forced-choice task. One of the two bursts was accompanied by a moving disc; the other was accompanied either by no visual stimulus (Experiment 1) or by a static disc (Experiment 2). When the two sounds were of identical intensity, participants judged the sound accompanied by the moving disc as louder. The effect was greatest when the two sounds were of the same intensity, but it was still present for mid-to-high intensity differences. In a third, control, experiment participants judged the pitch (rather than the loudness) of a pair of tones. Here the pattern was different: there was no shift of response towards the interval accompanied by the moving disc. Visual motion had no effect for tones of the same pitch, and for mid-to-high pitch differences the effect was reversed relative to that observed for loudness, with the tone accompanied by motion rated as lower in pitch than in the static intervals.

The natural tendency for moving objects to elicit sounds may lead to an automatic perceptual influence of vision over sound, particularly when the latter is ambiguous. This is the first account of this novel audio-visual interaction.

In: Multisensory Research

Visual sensory substitution devices (SSDs) can represent visual characteristics through distinct patterns of sound, allowing a visually impaired user access to visual information. Previous SSDs have either avoided colour or, when they do encode it, have assigned sounds to colours in a largely unprincipled way. This study introduces a new tablet-based SSD termed the ‘Creole’ (so called because it combines tactile scanning with image sonification) and a new algorithm for converting colour to sound that is based on established cross-modal correspondences (intuitive mappings between different sensory dimensions). To test the utility of correspondences, we examined the colour–sound associative memory and object recognition abilities of sighted users whose device was coded either in line with or opposite to sound–colour correspondences. Users given the correspondence-based mappings showed improved colour memory and made fewer colour errors. Interestingly, the colour–sound mappings that produced the largest improvements in the associative memory task also produced the greatest gains in recognising realistic objects featuring those colours, indicating a transfer of abilities from memory to recognition. These users were also marginally better at matching sounds to images varying in luminance, even though luminance was coded identically across the different versions of the device. These findings are discussed in terms of the relevance of both colour and cross-modal correspondences for sensory substitution.
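The abstract does not spell out the Creole's colour-to-sound algorithm, so the Python sketch below is only a hypothetical illustration of a correspondence-based coding. It uses one well-attested correspondence (lighter colours with higher pitch); the parameter ranges and the roles given to saturation and hue are assumptions for exposition, not the device's actual mapping.

    # Hypothetical sketch of a correspondence-based colour-to-sound mapping.
    # Not the Creole's algorithm; ranges and mappings are illustrative choices.
    import colorsys

    def colour_to_sound(r, g, b):
        """Map an RGB colour (components as 0-1 floats) to sound parameters."""
        h, l, s = colorsys.rgb_to_hls(r, g, b)   # colorsys returns hue, lightness, saturation
        pitch_hz = 200 + l * 1800                # lighter colour -> higher pitch (200-2000 Hz)
        loudness = 0.2 + s * 0.8                 # more saturated -> louder (arbitrary choice)
        timbre_index = int(h * 8) % 8            # hue selects one of eight timbres
        return {'pitch_hz': pitch_hz, 'loudness': loudness, 'timbre': timbre_index}

    # Example: a light, saturated red maps to a relatively high-pitched sound.
    print(colour_to_sound(1.0, 0.3, 0.3))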

In: Multisensory Research

Sensory substitution is the representation of information from one sensory modality (e.g., vision) within another modality (e.g., audition). We used a visual-to-auditory sensory substitution device (SSD) to explore the effect of incongruous (true-)visual and substituted-visual signals on visual attention. In our multisensory sensory substitution paradigm, both visual and sonified-visual information were presented. By making small alterations to the sonified image, but not the seen image, we introduced audio–visual mismatch. The alterations consisted of the addition of a small image (for instance, the Wally character from the ‘Where’s Wally?’ books) within the original image. Participants were asked to listen to the sonified image and identify which quadrant contained the alteration. Monitoring eye movements revealed the effect of the audio–visual mismatch on covert visual attention. We found that participants consistently fixated more, and dwelled for longer, in the quadrant corresponding to the target's location in the sonified image. This effect was not contingent on participants reporting the location of the target correctly, which indicates a low-level interaction between an auditory stream and visual attention. We propose that this reflects a shared visual workspace that is accessible to sources of visual information other than the eyes. If this is indeed the case, it would support the development of other, more esoteric, forms of sensory substitution. These could include an expanded field of view (e.g., rear-view cameras), overlaid visual information (e.g., thermal imaging) or compensation for partial visual field loss (e.g., hemianopsia).

In: Seeing and Perceiving

There is a widespread tendency to associate certain properties of sound with those of colour (e.g., higher pitches with lighter colours). Yet it remains an open question how sound influences chroma or hue when lightness is properly controlled. To examine this, we asked participants to adjust physically equiluminant colours until they ‘went best’ with certain sounds. For pure tones, complex sine waves and vocal timbres, increases in frequency were associated with increases in chroma. Increasing the loudness of pure tones also increased chroma. Hue associations varied depending on the type of stimulus. For stimuli involving only limited bands of frequencies (pure tones, vocal timbres), frequency correlated with hue, such that low frequencies gave blue hues, progressing to yellow hues at 800 Hz. Increasing the loudness of a pure tone was also associated with a shift from blue to yellow. However, for complex sounds that share the same bandwidth of frequencies (100–3200 Hz) but vary in which frequencies carry the most power, all stimuli were associated with yellow hues. This suggests that the presence of high frequencies (above 800 Hz) consistently yields yellow hues. Overall, we conclude that while pitch–chroma associations appear to re-apply themselves flexibly across a variety of contexts, frequencies above 800 Hz appear to produce yellow hues irrespective of context. These findings reveal new sound–colour correspondences that were previously obscured by the failure to control for lightness. Findings are discussed in relation to understanding the underlying rules of cross-modal correspondences, synaesthesia, and optimising the sensory substitution of visual information through sound.
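As a purely qualitative sketch of the pattern reported above (thresholds and scaling are illustrative choices, not values estimated in the study), the Python snippet below increases chroma with frequency and shifts the associated hue from blue towards yellow, with yellow dominating above roughly 800 Hz.

    # Qualitative illustration of the reported sound-colour pattern only.

    def frequency_to_colour_association(freq_hz, max_hz=3200.0):
        """Return a (hue label, chroma) pair for a pure tone of freq_hz."""
        chroma = min(1.0, freq_hz / max_hz)      # higher frequency -> more chromatic
        if freq_hz >= 800:                       # above ~800 Hz: yellow hues dominate
            hue = 'yellow'
        elif freq_hz < 400:                      # low frequencies: blue hues
            hue = 'blue'
        else:                                    # intermediate: shifting blue -> yellow
            hue = 'blue-to-yellow'
        return hue, chroma

    for f in (200, 600, 800, 1600):
        print(f, frequency_to_colour_association(f))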

In: Multisensory Research

Savant syndrome is a condition in which prodigious talent co-occurs with developmental difficulties such as autism spectrum conditions (ASC). To better understand savant skills, we previously proposed a link with synaesthesia: that savant syndrome may arise in ASC individuals who also happen to have synaesthesia. A second, unrelated claim is that people with autism may have higher rates of synaesthesia. Here we ask whether synaesthesia is indeed found more often in autism per se, or only in cases where autism co-occurs with savant skills. In previous studies, people with autism who were tested for synaesthesia were not differentiated into those with and without savant abilities. Here we tested three groups: people with autism who also have savant skills (n = 40), people with autism without savant skills (n = 34), and controls without autism (n = 29). We used a validated test to diagnose grapheme–colour synaesthesia. Results show a significantly higher prevalence of synaesthesia in people with ASC, but only in those who also have savant skills. This suggests that synaesthesia in autism is linked to savant abilities rather than to autism per se. We discuss the role of synaesthesia in the development of prodigious talent.

Open Access
In: Multisensory Research