Visual and Haptic Representations of Material Properties
in Multisensory Research

Research on material perception has recently received increasing attention. Both the visual and the haptic sense clearly play important roles in the perception of materials, yet it remains unclear how the two senses compare in material perception tasks. Here, we investigated the degree of correspondence between the visual and the haptic representations of different materials. We asked participants to categorize 84 different materials and to rate them on several material properties. In the haptic condition, participants were blindfolded and assessed the materials through haptic exploration alone; in the visual condition, they assessed the stimuli based on their visual impressions only. While categorization performance was less consistent in the haptic condition than in the visual one, ratings correlated highly between the two modalities. A principal component analysis (PCA) revealed that the material samples were organized similarly within the perceptual spaces of both modalities. Moreover, in both senses the first two principal components were dominated by hardness and roughness, two material features that are fundamental for the haptic sense. We conclude that although the haptic sense appears to be crucial for material perception, the information it can gather on its own may not be fine-grained and rich enough for perfect material recognition.
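The PCA mentioned above can be illustrated with a minimal sketch: ratings are treated as a matrix of 84 material samples by a handful of property scales per modality, each property scale is z-standardized, and the first two principal components span the perceptual space. The array shapes, the number of property scales, and the placeholder data below are assumptions for illustration only; this is not the authors' code or data.

    # Minimal sketch of a PCA on property ratings (placeholder data, not the study's ratings).
    import numpy as np
    from scipy.stats import zscore
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    visual_ratings = rng.normal(size=(84, 6))   # 84 samples x 6 hypothetical property scales
    haptic_ratings = rng.normal(size=(84, 6))

    def perceptual_space(ratings, n_components=2):
        """z-standardize each property scale, then project the samples onto the first PCs."""
        z = zscore(ratings, axis=0)
        pca = PCA(n_components=n_components)
        scores = pca.fit_transform(z)           # (84, 2): coordinates in the perceptual space
        return scores, pca.explained_variance_ratio_

    vis_scores, vis_var = perceptual_space(visual_ratings)
    hap_scores, hap_var = perceptual_space(haptic_ratings)
    print(vis_var, hap_var)                     # variance explained per component (cf. the scree plots)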

Figures

  • Example images taken from our seven material categories. Top row, from left to right: metal, stone, leather, fur. Bottom row, from left to right: fabric, paper, plastic, and wood. This figure is published in colour in the online version.

  • Histogram of the correlation coefficients between all 12 participants across all materials and dimensions tested. Data for the visual and the haptic modality are shown on the left and the right side, respectively. Each histogram consists of 66 correlation coefficients. For both modalities, all of these correlations are highly significant (p < 0.001).

  • Correlation matrices between material properties, calculated across the different material classes and participants. Ratings on each property dimension were averaged over all 12 participants for each stimulus separately. The left side shows data for the visual modality, the right side shows data for the haptic modality. Significant correlations are indicated by a dot. White numbers indicate negative correlation coefficients, black numbers indicate positive correlation coefficients.

  • Correlation matrices for the different material classes, calculated across material properties and participants. Ratings for each stimulus on each property dimension were averaged across observers. The left side shows data for the visual modality, the right side shows data for the haptic modality. Significant correlations are indicated by a dot. White numbers indicate negative correlation coefficients, black numbers indicate positive correlation coefficients.

  • Representation of the different material classes based on the visual and haptic material property ratings within a two-dimensional PCA space. PCAs were performed on the z-standardized property ratings for each stimulus, averaged across participants.

  • Scree plots of the PCAs we performed on the visual and haptic material ratings.

  • Correlations between the property ratings of each material sample in the visual and the haptic condition for each material quality. Correlations for each material property were calculated on the ratings given to each stimulus by each participant in both modalities. Significant correlations are indicated by an asterisk (p < 0.001).

  • Procrustes analysis between the visual and the remapped haptic data. Lines indicate the distance between the stimulus locations in the visual space and the remapped haptic space (a schematic sketch of this alignment follows this list).

  • Procrustes analysis between the material category cluster centers of the PCA solutions for the visual (filled symbols) and the remapped haptic (open symbols) data. This figure is published in colour in the online version.

  • Procrustes analyses of the PCAs derived from all participants’ data from the first half (six participants’ visual data and six participants’ haptic data) and the second half of the rating procedure (six participants’ haptic data [open symbols] and six participants’ visual data [filled symbols]). This figure is published in colour in the online version.

  • Mean correlation coefficients for each property rated visually. Data are aggregated across material samples and split into two groups: participants who started with the visual modality (black) and participants who started with the haptic modality (grey).

  • Mean correlation coefficients for each property rated haptically. Data are aggregated across material samples and split into two groups: participants who started with the haptic modality (black) and participants who started with the visual modality (grey).

  • Categorization and classification results for the haptic modality: the plots in the left column show the categorization data collapsed across five observers, as well as the mean categorization accuracy for each material class separately. The plots in the right column show the classifications of the machine learning algorithm based on participants’ property ratings of our stimuli.

  • Sensitivity measures (d′) for visual and haptic categorization. Small symbols represent a single participant’s d′, large symbols represent the mean d′ of five participants. Significant differences between the visual and the haptic condition are indicated by an asterisk (p < 0.05, Bonferroni corrected).

  • Feature weights for the classification of four participants’ categorizations (averaged across participants) and ‘true’ categories.
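The Procrustes comparisons referred to in the captions above, which rotate, scale, and translate the haptic PCA solution onto the visual one and measure the residual distances per stimulus, could look roughly like the following sketch. The coordinates are placeholders and the variable names are assumptions; this is not the authors' analysis code.

    # Minimal sketch of a Procrustes alignment of the haptic PCA space onto the visual one.
    import numpy as np
    from scipy.spatial import procrustes

    rng = np.random.default_rng(1)
    visual_space = rng.normal(size=(84, 2))                              # placeholder visual PCA coordinates
    haptic_space = visual_space + rng.normal(scale=0.3, size=(84, 2))    # placeholder haptic PCA coordinates

    # procrustes() standardizes both configurations and finds the rotation/scaling of the
    # second that best matches the first; 'disparity' is the summed squared residual.
    vis_std, haptic_remapped, disparity = procrustes(visual_space, haptic_space)

    # Per-stimulus residuals correspond to the connecting lines drawn in the figure.
    residuals = np.linalg.norm(vis_std - haptic_remapped, axis=1)
    print(disparity, residuals.mean())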
