Search Results

Vanessa Harrar and Charles Spence

When deciding on a product’s quality, we often pick it up to gauge its weight. If it’s heavy enough, we tend to think that it is of good quality. We have recently shown that the weight of a dish can affect the taste and quality perception of the food it contains. Here, we varied the weight of spoons in order to determine whether the weight or size of the cutlery might influence taste perception. Teaspoons and tablespoons were tested, with one of each spoon size artificially weighted with lead hidden in the handle (teaspoons: 2.35 and 5.67 g; tablespoons: 3.73 and 10.84 g). Participants tasted yoghurt from each spoon and rated the yoghurt’s perceived density, price, sweetness, and pleasantness. Four within-participant ANOVAs were used to test the effects of spoon size and spoon weight on each attribute. The perceived density of the yoghurt was affected by the spoon’s weight, with yoghurt sampled from the light spoons being perceived as thicker than yoghurt sampled from the heavy spoons. The perceived price of the yoghurt also varied with spoon weight, such that the lighter spoons made the yoghurt taste more expensive. The most reliable effect was an interaction between spoon weight and spoon size on sweetness perception: heavy teaspoons and light tablespoons made the yoghurt appear sweeter. These data add to the growing body of research demonstrating that tableware (and silverware) can affect consumers’ judgements without their being aware of it.
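
As a rough illustration of the analysis described above (a sketch, not the authors’ code), each rated attribute could be submitted to a 2 (spoon size) x 2 (spoon weight) repeated-measures ANOVA in Python; the data file, column names, and factor labels below are assumptions.

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Hypothetical long-format data: one row per participant x spoon condition,
    # with one column per rated attribute (file and column names are assumed).
    ratings = pd.read_csv("spoon_ratings.csv")

    # One 2 x 2 within-participant ANOVA per dependent variable,
    # mirroring the four ANOVAs described in the abstract.
    for dv in ["density", "price", "sweetness", "pleasantness"]:
        res = AnovaRM(
            data=ratings,
            depvar=dv,
            subject="participant",
            within=["spoon_size", "spoon_weight"],
        ).fit()
        print(dv)
        print(res)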

Edited by Vanessa Harrar (Liverpool University) and Charles Spence

Vanessa Harrar, Charles Spence and Tamar R. Makin

The body is represented in a somatotopic framework such that adjacent body parts are represented next to each other in the brain. We utilised this organisation of the somatosensory cortex to study the generalisation pattern of tactile perceptual learning. Perceptual learning refers to the process of long-lasting improvement in the performance of a perceptual task following persistent sensory exposure. To test whether perceptual learning generalises to neighbouring brain/body areas, 12 participants were trained on a tactile discrimination task on one fingertip (using oriented tactile gratings) over the course of four days. Thresholds for tactile acuity were estimated prior to, and following, the training for the ‘trained’ finger and three additional fingers: ‘adjacent’, ‘homologous’ (the same finger as the trained one but on the opposite hand) and ‘other’ (neither adjacent nor homologous to the trained finger). Identical threshold estimation, without any training, was also carried out for a control group. Following training, tactile thresholds improved (compared with those of the control group). Importantly, the improved performance was not exclusive to the trained finger; it generalised to the adjacent and homologous fingers, but not to the ‘other’ finger. We found that perceptual learning indeed generalises in a way that can be predicted by the topography of the somatosensory cortex, suggesting that direct sensory experience at a given site is not necessary for perceptual learning there. These findings may be translated into rehabilitation procedures that exploit similar principles of perceptual learning generalisation to train partially-deprived cortex, such as following amputation or blindness in adults.
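
The abstract does not specify how the tactile acuity thresholds were estimated; as one common approach (an assumption, not necessarily the authors’ procedure), a threshold for each finger can be read off a psychometric function fitted to proportion-correct data, for example:

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Hypothetical grating data for one finger: groove widths (mm) and the
    # proportion of correct responses at each width (made-up values).
    groove_width = np.array([0.5, 0.75, 1.0, 1.25, 1.5, 2.0])
    p_correct = np.array([0.52, 0.60, 0.71, 0.83, 0.90, 0.97])

    def psychometric(x, mu, sigma):
        # Cumulative-Gaussian psychometric function for a two-alternative task
        # (chance level 0.5, asymptote 1.0); mu is the 75%-correct point.
        return 0.5 + 0.5 * norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(psychometric, groove_width, p_correct, p0=[1.0, 0.5])
    print(f"Estimated 75%-correct threshold: {mu:.2f} mm")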

Liesbet Goubert, Sophie Vandenbroucke and Vanessa Harrar

Laurence R. Harris, Charles Spence and Vanessa Harrar

Cesare V. Parise, Vanessa Harrar, Marc O. Ernst and Charles Spence

Humans are equipped with multiple sensory channels that provide both redundant and complementary information about the objects and events in the world around them. A primary challenge for the brain is therefore to solve the ‘correspondence problem’, that is, to bind those signals that likely originate from the same environmental source, while keeping separate those unisensory inputs that likely belong to different objects/events. Whether multiple signals have a common origin or not must, however, be inferred from the signals themselves through a causal inference process.

Recent studies have demonstrated that cross-correlation, that is, the similarity in temporal structure between unimodal signals, represents a powerful cue for solving the correspondence problem in humans. Here we provide further evidence for the role of the temporal correlation between auditory and visual signals in multisensory integration. Capitalizing on the well-known fact that sensitivity to crossmodal conflict is inversely related to the strength of coupling between the signals, we measured sensitivity to crossmodal spatial conflicts as a function of the cross-correlation between the temporal structures of the audiovisual signals. Observers’ performance was systematically modulated by the cross-correlation, with lower sensitivity to crossmodal conflict being measured for correlated as compared to uncorrelated audiovisual signals. These results therefore provide support for the claim that cross-correlation promotes multisensory integration. A Bayesian framework is proposed to interpret the present results, whereby stimulus correlation is represented in the prior distribution of expected crossmodal co-occurrence.
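
To make the cross-correlation cue concrete, a minimal sketch (illustrative only; the stimuli and measure used in the study may differ) computes the zero-lag correlation between the temporal envelopes of an auditory and a visual signal:

    import numpy as np

    def zero_lag_correlation(a, v):
        # Pearson correlation between two equal-length signal envelopes at zero
        # lag: a simple stand-in for 'similarity in temporal structure'.
        a = (a - a.mean()) / a.std()
        v = (v - v.mean()) / v.std()
        return float(np.mean(a * v))

    rng = np.random.default_rng(0)
    shared = rng.normal(size=500)                  # common temporal structure
    audio = shared + 0.3 * rng.normal(size=500)    # correlated audiovisual pair
    visual = shared + 0.3 * rng.normal(size=500)
    print(zero_lag_correlation(audio, visual))                  # high (~0.9)
    print(zero_lag_correlation(audio, rng.normal(size=500)))    # near zero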

John Stein, Vanessa Harrar, Alexis Pérez-Bellido, Jonathan Tammam, Anna Pitt, Rachel Hulatt and Charles Spence

Vanessa Harrar, Sarah D’Amour, Michael J. Carnevale, Laurence R. Harris and Lisa Pritchett

Sophie Vandenbroucke, Geert Crombez, Dimitri Van Ryckeghem, Vanessa Harrar, Liesbet Goubert, Charles Spence, Wouter Durnez and Stefaan van Damme

Introduction: There is preliminary evidence that viewing touch or pain can modulate the experience of tactile stimulation. The aim of this study was to develop an experimental paradigm to investigate whether the observation of needle pricks to another person’s hand facilitates the detection of subtle somatic sensations. Furthermore, differences between control participants and persons reporting synaesthesia for pain (i.e., experiencing observed pain as if it were their own) were examined.

Method: Synaesthetes (n=15) and controls (n=20) were presented with a series of videos showing left or right hands being pricked, as well as control videos (e.g., a sponge being pricked), whilst occasionally receiving subtle, near-threshold sensations on their own hand, either in the same spatial location as the visual stimuli (congruent trials) or in the opposite location (incongruent trials). Participants were asked to detect the sensory stimulus. Signal detection theory was used to test whether sensitivity differed between the two groups and between the two categories of visual stimuli.
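
For reference, sensitivity in such a detection task is conventionally computed as d′ = z(hit rate) − z(false-alarm rate); the sketch below uses hypothetical counts and a standard correction for extreme rates, which may differ from the exact procedure used in the study.

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # d' = z(hit rate) - z(false-alarm rate), with rates of 0 or 1 nudged
        # by 1/(2N) (an assumed correction, not necessarily the one used here).
        n_signal = hits + misses
        n_noise = false_alarms + correct_rejections
        hit_rate = min(max(hits / n_signal, 1 / (2 * n_signal)), 1 - 1 / (2 * n_signal))
        fa_rate = min(max(false_alarms / n_noise, 1 / (2 * n_noise)), 1 - 1 / (2 * n_noise))
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Hypothetical trial counts for one participant in one video condition:
    print(d_prime(hits=28, misses=12, false_alarms=6, correct_rejections=34))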

Results: Overall, perceptual sensitivity (d′) was significantly higher when the visual stimuli involved a painful situation (e.g., a needle pricking another’s hand) compared to the control videos, and was significantly lower in synaesthetes compared to control participants. When no sensory stimulus was administered, participants reported significantly more illusory sensations when a painful situation was depicted than when a non-painful situation was depicted.

Discussion: This study suggests that the detection of somatic sensations can be facilitated or inhibited by observing visual stimuli. Synaesthetes were generally less sensitive, suggesting that they have more difficulty disentangling somatic and visual stimuli.