
Research has revealed different temporal integration windows between and within different speech tokens. The limited set of speech tokens tested to date, however, has not allowed for a proper evaluation of whether such differences are task- or stimulus-driven. We conducted a series of experiments to investigate how the physical differences associated with speech articulation affect the temporal aspects of audiovisual speech perception. Videos of consonants and vowels uttered by three speakers were presented. Participants made temporal order judgments (TOJs) regarding which speech stream (auditory or visual) had been presented first. The sensitivity of participants’ TOJs and the point of subjective simultaneity (PSS) were analyzed as a function of place of articulation, manner of articulation, and voicing for consonants, and of tongue height/backness and lip roundedness for vowels. The results demonstrated that, for place of articulation/roundedness, participants were more sensitive to the temporal order of highly salient speech signals, with smaller visual leads at the PSS. This was not the case when manner of articulation/height was evaluated. These findings suggest that the visual speech signal provides substantial cues to the auditory signal that modulate the relative processing times required for the perception of the speech stream. A subsequent experiment explored how the presentation of different sources of visual information modulated these findings. Videos of three consonants were presented under natural and point-light (PL) viewing conditions revealing parts, or the whole, of the face. Preliminary analysis revealed no differences in TOJ accuracy across viewing conditions. However, the PSS data revealed significant differences between viewing conditions depending on the speech token uttered (e.g., larger visual leads for PL lip/teeth/tongue-only views).

In: Seeing and Perceiving
In: Timing & Time Perception
Temporal Processing in Clinical Populations
Volume Editors: Argiro Vatakis and Melissa Allman
Time Distortions in Mind brings together current research on aspects of temporal processing in clinical populations, in the ultimate hope of elucidating the interdependence between perturbations in timing and disturbances in the mind and brain. Such research may inform not only our understanding of typical psychological functioning but also the psychological consequences of any pathophysiological differences in temporal processing.
This collection of current knowledge on temporal processing in clinical populations is an excellent reference for students and scientists interested in the topic, and it also serves as a stepping stone for sharing ideas and advancing our understanding of how distorted timing can lead to a disturbed brain and mind, or vice versa.

Contributors to this volume: Ryan D. Ward, Billur Avlar, Peter D Balsam, Deana B. Davalos, Jamie Opper, Yvonne Delevoye-Turrell, Hélène Wilquin, Mariama Dione, Anne Giersch, Laurence Lalanne, Mitsouko van Assche, Patrick E. Poncelet, Mark A. Elliott, Deborah L. Harrington, Stephen M. Rao, Catherine R.G. Jones, Marjan Jahanshahi, Bon-Mi Gu, Anita J. Jurkowski, Jessica I. Lake, Chara Malapani, Warren H. Meck, Rebecca M. C. Spencer, Dawn Wimpory, Brad Nicholas, Elzbieta Szelag, Aneta Szymaszek, Anna Oron, Melissa J. Allman, Christine M. Falter, Argiro Vatakis, Alexandra Elissavet Bakou
In: Timing and Time Perception: Procedures, Measures, & Applications
In: Time Distortions in Mind
Editors-in-Chief: Argiro Vatakis, Hedderik van Rijn, and Fuat Balcı
Timing is ever-present in our everyday life – from the ringing sounds of the alarm clock to our ability to walk, dance, remember, and communicate with others. This intimate relationship has led scientists from different disciplines to investigate time and to explore how individuals perceive, process, and effectively use timing in their daily activities.
Timing & Time Perception aims to be the forum for all psychophysical, neuroimaging, pharmacological, computational, and theoretical advances on the topic of timing and time perception in humans and other animals. We envision a multidisciplinary approach to the topics covered, including the synergy of: Neuroscience and Philosophy for understanding the concept of time, Cognitive Science and Artificial Intelligence for adapting basic research to artificial agents, Psychiatry, Neurology, Behavioral and Computational Sciences for neuro-rehabilitation and modeling of the disordered brain, to name just a few.
Given the ubiquity of interval timing, this journal will host all basic studies, including interdisciplinary and multidisciplinary works on timing and time perception and serve as a forum for discussion and extension of current knowledge on the topic.


We often use tactile input to recognize familiar objects and to acquire information about unfamiliar ones. We also use our hands to manipulate objects and to utilize them as tools. However, research on object affordances has mainly focused on visual input, thus limiting the level of detail one can obtain about object features and uses. In addition to the limited multisensory input, data on object affordances have also been hindered by limited participant input (e.g., naming tasks). To address the above-mentioned limitations, we aimed to identify a new methodology for obtaining undirected, rich information regarding people’s perception of a given object and the uses it can afford, without necessarily viewing the particular object. Specifically, 40 participants were video-recorded in a three-block experiment. During the experiment, participants were exposed to pictures of objects, pictures of someone holding the objects, and the actual objects, and they were allowed to provide unconstrained verbal responses describing the stimuli presented and their possible uses. The stimuli were lithic tools, chosen for their novelty, man-made design, design for a specific use/action, and absence of functional knowledge and movement associations. The experiment resulted in a large linguistic database, which was linguistically analyzed following a response-based specification. Analysis of the data revealed a significant contribution of visual and tactile input to the naming and definition of object attributes (color/condition/shape/size/texture/weight), while no significant tactile information was obtained for the object features of material, visual pattern, and volume. Overall, this new approach highlights the importance of multisensory input in the study of object affordances.

In: Seeing and Perceiving

Our timing estimates are often prone to distortions from non-temporal attributes such as the direction of motion. Motion direction has been reported to lead to interval dilation when movement is toward the viewer (i.e., looming) as compared to away from the viewer (i.e., receding). This perceptual asymmetry has been interpreted in terms of the contextual salience and prioritization of looming stimuli, which allow for timely reactions to approaching objects. The asymmetry has mainly been studied with abstract stimuli of minimal social relevance. To address this, we utilized naturalistic displays of biological motion and examined the aforementioned perceptual asymmetry in the temporal domain. In Experiment 1, we tested visual looming and receding human movement at various intervals in a reproduction task and found no differences in participants’ timing estimates as a function of motion direction. Given the superiority of audition in timing, in Experiment 2 we combined the looming and receding visual stimulation with sound conveying congruent, incongruent, or no direction information. The analysis showed an overestimation of the looming as compared to the receding visual stimulation when the sound was of congruent or no direction, while no such difference was noted in the incongruent condition. Both looming and receding conditions (congruent and control) led to underestimations relative to the physical durations tested. Thus, the asymmetry obtained could be attributed to the potential perceptual negligibility of the receding stimuli rather than to the often-reported salience of looming motion. The results are also discussed in terms of the optimality of sound in the temporal domain.

In: Timing & Time Perception

Our research project aimed to investigate multisensory temporal integration in synesthesia and to explore whether there are commonalities in the sensory experiences of synesthetes and non-synesthetes. Specifically, we investigated whether synesthetes are better integrators than non-synesthetes by examining the strength of multisensory binding (i.e., the unity effect) using an unspeeded temporal order judgment task. We used audiovisual stimuli based on grapheme-colour synesthetic associations (Experiment 1) and on crossmodal correspondences (e.g., high pitch with light colours; Experiment 2), presented at various stimulus onset asynchronies (SOAs) with the method of constant stimuli. Presenting these stimuli in congruent and incongruent formats allowed us to examine whether congruent stimuli lead to a stronger unity effect than incongruent ones in synesthetes and non-synesthetes and, thus, whether synesthetes experience enhanced multisensory integration compared to non-synesthetes. Preliminary data support the hypothesis that congruent crossmodal correspondences lead to a stronger unity effect than incongruent ones in both groups, with this effect being stronger in synesthetes than in non-synesthetes. We also found that synesthetes experience a stronger unity effect when presented with idiosyncratically congruent grapheme-colour associations than with incongruent ones, as compared to non-synesthetes trained on certain grapheme-colour associations. Currently, we are investigating (Experiment 3) whether trained non-synesthetes exhibit enhanced integration when presented with synesthetic associations that occur frequently among synesthetes. Using this design, we will provide psychophysical evidence of multisensory integration in synesthesia and of possible common processing mechanisms in synesthetes and non-synesthetes.

In: Seeing and Perceiving