Moving to the Beat: Studying Entrainment to Micro-Rhythmic Changes in Pulse by Motion Capture

in Timing & Time Perception

Pulse is a fundamental reference for the production and perception of rhythm. In this paper, we study entrainment to changes in the micro-rhythmic design of the basic pulse of the groove in ‘Left & Right’ by D’Angelo. In part 1 of the groove, the beats have one specific position; in part 2, by contrast, the different rhythmic layers specify two simultaneous but alternative beat positions that are approximately 50–80 ms apart. We first anticipate listeners’ perceptual responses, taking the theories of entrainment and dynamic attending as points of departure. We then report on a motion capture experiment aimed at capturing listeners’ motion patterns in response to the two parts of the tune. The results show that when multiple onsets are introduced in part 2, the half note becomes a significant additional level of entrainment, and the temporal locations of the perceived beats are drawn towards the added onsets.


D’Angelo (2000). Voodoo [album]. Virgin Records.


Figure 1. Waveform display (amplitude/time) of bar 14 of ‘Left & Right’. Beat onsets are indicated by black vertical lines. The virtual beat position at beat 2 is placed one sixteenth note after the actual onset of the syncopated guitar, which is indicated by a stippled line. The time axis refers to the placement of the clip within the sound file prepared for the motion capture experiments (see below).

Figure 2. Basic rhythmic structure of the guitar layer in part 1.

Figure 3. Basic rhythmic structures of the guitar layer and the drum kit layer in part 2. The quarter-note pulse implied by the guitar is located 50–80 ms later in time than the pulse implied by the bass drum and snare drum.

Figure 4. Transition to part 2: from the mismatch between a point-like expectation and actual rhythmic events (left) to a widened attentional focus, a ‘beat bin’ that encompasses the multiple onsets (right).

Figure 5. The experimental setup in the motion capture lab, with the four participants standing back-to-back with sticks in their hands (left), and a close-up of a stick with reflective markers attached (right). This figure is published in color in the online version.

Figure 6. Waveform representation of the musical examples (amplitude/time): (i) test clip (looped excerpt of a different track from the Voodoo album); (ii) looped four-bar excerpt of part 1 of the original groove; (iii) original groove (thirty seconds from the beginning of the song, including the transition from part 1 to part 2); (iv) looped four-bar excerpt of part 2 of the original groove; (v) metronome clicks at the same tempo as the original groove.

Figure 7. Combined plot of the motion of the stick markers for the five sound clips, shown for each of the 20 subjects individually (gray lines) together with the median across subjects (black line), calculated as the first derivative of the vector length (norm) of the motion. The entrance of the drum kit layer in the original groove (stimulus iii) is marked by the dotted line.
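The speed measure described in this caption, the first derivative of the vector length (norm) of a marker’s position, can be sketched as follows. This is a minimal NumPy illustration of that computation, not the authors’ actual analysis pipeline; the function name `motion_speed` and the array layout are assumptions.

```python
import numpy as np

def motion_speed(positions, fs):
    """First derivative of the per-frame norm of a marker's position.

    positions: array of shape (n_frames, 3) holding x, y, z coordinates.
    fs: motion capture frame rate in Hz.
    """
    norms = np.linalg.norm(positions, axis=1)  # vector length per frame
    return np.gradient(norms, 1.0 / fs)        # central differences, d|p|/dt
```

For a marker moving away from the origin at a constant rate, this yields a constant speed; oscillatory movement to the beat yields a periodic speed curve whose peaks can be aligned with beat onsets.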

Figure 8. Examples of spectra showing typical cases for the different periodicity categories: clear frequency peaks (excellent), partly visible peaks (marginal), and no obvious peaks (poor).
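The periodicity categories in this caption rest on whether the movement spectrum shows clear frequency peaks. A minimal sketch of locating such a peak with an FFT is given below; the helper `dominant_frequency` is illustrative and not taken from the paper’s analysis code.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the strongest peak in the magnitude spectrum
    of a mean-removed motion signal sampled at fs Hz."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                           # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))          # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return freqs[np.argmax(spectrum)]
```

Entrainment at the half-note level, as reported for part 2, would appear as an additional spectral peak at half the quarter-note frequency of the groove.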

Figure 9. Examples of regular (subject 1) and irregular (subject 2) vertical motion. Estimated beat positions corresponding to beat onsets in the music are indicated by stippled lines.
