The last 50 years or so have seen great optimism concerning the potential of sensory substitution and augmentation devices to enhance the lives of those with (or even those without) some form of sensory loss (in practice, this has typically meant those who are blind or have low vision). One commonly discussed solution for individuals who are blind has been to use one of a range of tactile–visual sensory substitution systems that represent objects captured by a camera as outline images on the skin surface in real time (what Loomis, Klatzky and Giudice, 2012, term general-purpose sensory substitution devices). However, despite the fact that touch, like vision, initially codes information spatiotopically, I would like to argue that a number of fundamental perceptual, attentional, and cognitive limitations constraining the processing of tactile information mean that the skin surface is unlikely ever to provide such general-purpose sensory substitution capabilities. At present, there is little evidence to suggest that the extensive cortical plasticity that has been demonstrated in those who have lost (or never had) a sense can do much to overcome the limitations associated with trying to perceive high rates of spatiotemporally varying information presented via the skin surface (no matter whether that surface be the back, stomach, forehead, or tongue). Instead, the use of the skin will likely be restricted to various special-purpose devices that enable specific activities such as navigation, the control of locomotion, pattern perception, etc.
Charles Spence reviews Rob DeSalle’s (2018) new book Our Senses: Gateways to Consciousness, a popular science look at the neuroscience behind the senses set in an evolutionary context.
Klemens Knöferle and Charles Spence
Searching for a particular product in a supermarket can be a challenging business. The question therefore arises as to whether cues from the shopper’s other senses can be used to facilitate, guide, or bias visual search toward a particular product or product type. Prior research suggests that characteristic sounds can facilitate visual object localization (Iordanescu et al., 2008, 2010). Extending these findings to an applied setting, we investigated whether product-related sounds would facilitate visual search for products from different categories (e.g., champagne, potato crisps, deodorant) when arranged on a virtual shelf.
On each trial, participants were visually presented with the name of a target product and then located the target within a virtual shelf display containing pictures of four different products (randomly selected from a set of nine). The visual display was randomly accompanied by a target-congruent, a target-incongruent, an unrelated, or no sound. Congruent sounds were semantically related to the target (e.g., uncorking a champagne bottle), incongruent sounds were related to the product shown in the corner opposite to the target, and unrelated sounds did not correspond to any of the products shown in the display.
Participants found the target product significantly faster when the sound was congruent rather than incongruent with the target. All other pairwise comparisons were non-significant.
These results extend the facilitatory crossmodal effect of characteristic sounds on visual search performance described earlier to the more realistic context of a virtual shelf display, showing that characteristic sounds can crossmodally enhance the visual processing of actual products.
Vanessa Harrar and Charles Spence
When deciding on a product’s quality, we often pick it up to gauge its weight. If it’s heavy enough, we tend to think that it is good quality. We have recently shown that the weight of a dish can affect the taste and quality perception of the food it contains. Here, we varied the weight of spoons in order to determine whether the weight or size of the cutlery might influence taste perception. Teaspoons and tablespoons were tested, with one spoon of each size artificially weighted with lead hidden in the handle (teaspoons: 2.35 and 5.67 g; tablespoons: 3.73 and 10.84 g). Participants tasted yoghurt from each spoon and rated the yoghurt’s perceived density, price, sweetness, and pleasantness. Four within-participant ANOVAs were used to test the effects of spoon size and spoon weight on each attribute. The perceived density of the yoghurt was affected by the spoon’s weight, with yoghurt from the light spoons being perceived as thicker than yoghurt sampled from the heavy spoons. The perceived price of the yoghurt also varied with spoon weight, such that the lighter spoons made the yoghurt seem more expensive. The most reliable effect was an interaction between spoon weight and spoon size on sweetness perception: heavy teaspoons and light tablespoons made the yoghurt appear sweeter. These data support the growing body of research demonstrating that tableware (and silverware) can affect consumers’ judgements without their being aware of it.
Argiro Vatakis and Charles Spence
Research has revealed different temporal integration windows between and within different speech-tokens. The limited range of speech-tokens tested to date has not allowed for a proper evaluation of whether such differences are task- or stimulus-driven. We conducted a series of experiments to investigate how the physical differences associated with speech articulation affect the temporal aspects of audiovisual speech perception. Videos of consonants and vowels uttered by three speakers were presented. Participants made temporal order judgments (TOJs) regarding which speech-stream had been presented first. The sensitivity of participants’ TOJs and the point of subjective simultaneity (PSS) were analyzed as a function of the place and manner of articulation and voicing for consonants, and the height/backness of the tongue and lip-roundedness for vowels. The results demonstrated that, in the case of place of articulation/roundedness, participants were more sensitive to the temporal order of highly-salient speech-signals, with smaller visual-leads at the PSS. This was not the case when manner of articulation/height was evaluated. These findings suggest that the visual-speech signal provides substantial cues to the auditory-signal that modulate the relative processing times required for the perception of the speech-stream. A subsequent experiment explored how the presentation of different sources of visual-information modulated these findings. Videos of three consonants were presented under natural and point-light (PL) viewing conditions revealing parts, or the whole, of the face. Preliminary analysis revealed no differences in TOJ accuracy across viewing conditions. However, the PSS data revealed significant differences between viewing conditions depending on the speech token uttered (e.g., larger visual-leads for PL-lip/teeth/tongue-only views).
Ophelia Deroy and Charles Spence
The renewed interest that has emerged around the topic of crossmodal correspondences in recent years has demonstrated that crossmodal matchings and mappings exist between the majority of sensory dimensions, and across all combinations of sensory modalities. This renewed interest also offers a rapidly-growing list of ways in which correspondences affect — or interact with — metaphorical understanding, feelings of ‘knowing’, behavioral tasks, learning, mental imagery, and perceptual experiences. Here we highlight why, more generally, crossmodal correspondences matter to theories of multisensory interactions.
Qian (Janice) Wang and Charles Spence
We explored the putative existence of crossmodal correspondences between sound attributes and beverage temperature. An online pre-study was conducted first, in order to determine whether people would associate the auditory parameters of pitch and tempo with different imagined beverage temperatures. The same melody was manipulated to create a matrix of 25 variants with five different levels of both pitch and tempo. The participants were instructed to imagine consuming hot, room-temperature, or cold water, then to choose the melody that best matched the imagined drinking experience. The results revealed that imagining drinking cold water was associated with a significantly higher pitch than imagining drinking either room-temperature or hot water, and with a significantly faster tempo than room-temperature water. Next, the online study was replicated with participants in the lab tasting samples of hot, room-temperature, and cold water while choosing the melody that best matched the actual tasting experience. The results confirmed that, compared to room-temperature and hot water, the experience of cold water was associated with both a significantly higher pitch and a faster tempo. Possible mechanisms and potential applications of these results are discussed.
Betina Piqueras-Fiszman and Charles Spence
Most of the published research on the perception of food and drink has focused on what happens in-mouth during consumption. It is, however, important to note that people’s judgments are also profoundly influenced by other sensory cues, such as haptic input, be it their direct (oral-somatosensory) contact with the food itself, or their indirect contact with the product packaging, plateware, or cutlery. A series of experiments are reported which together demonstrate that people also evaluate the sensory characteristics, and even the quality and estimated price, of foods and beverages based on attributes, such as the weight or the texture, of the items used during consumption (be it the cutlery, the tableware, or the product packaging). For instance, yoghurt samples were rated as being significantly denser and more satiating when consumed from a heavy bowl than when exactly the same yoghurt was consumed from an identical bowl that was somewhat lighter. In another study, the texture of a yoghurt pot was shown to influence participants’ ratings of certain textural attributes of the food. We have also investigated the effect of the weight of the cutlery. These results suggest that the haptic cues associated with the consumption of food and drink can also influence the in-mouth perception of their textural properties. Given that the participants did not touch the food directly with their hands, the phenomenon observed might reflect an example of ‘sensation transference’ between what participants feel in their hands and what they perceive in their mouths.
Anne-Sylvie Crisinel and Charles Spence
We report a series of experiments investigating crossmodal correspondences between various food-related stimuli (water-based solutions, milk-based flavoured solutions, crisps, chocolate and odours) and sounds varying in pitch and played by four different types of musical instruments. Participants tasted or smelled stimuli before matching them to a musical note. Our results demonstrate that participants preferentially match certain stimuli to specific pitches and instrument types. Through participants’ ratings of the stimuli along a number of dimensions (e.g., pleasantness, complexity, familiarity or sweetness), we explore the psychological dimensions involved in these crossmodal correspondences, using principal components analysis (PCA). While pleasantness seems to play an important role in the choice of instrument associated with chemosensory stimuli, the pitch seems to also depend on the quality of the taste (bitter, salty, sour or sweet). The level at which such crossmodal correspondences might occur, as well as the potential applications of such results, will be discussed.