Frédéric Bevilacqua, leader of the Real Time Musical Interactions team at IRCAM — the Paris-based Institute for Research and Coordination in Acoustics/Music — will share his current research in gesture analysis and software development, including real-time demonstrations of potential applications in music and dance.
Bevilacqua and his colleagues are working on new paradigms for music performance and the interaction between gesture and sound processes. His most recent research involves the development of new interfaces for musical expression, gesture analysis for the performing arts, and music pedagogy.
Bevilacqua works at the intersection of the scientific analysis of movement, the engineering of creative interfaces, and artistic collaboration. He is regularly invited to participate in the development of artistic projects that use real-time motion capture data to generate other elements of a piece, such as a dancer's movement driving the sound. His team created a software system called the “gesture follower,” which can learn specific gestures performed live by a dancer and then recognize and react to a learned sequence of movement. The “gesture follower” system will be demonstrated live with a dancer.
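The idea of following a live gesture along a previously recorded template can be illustrated with a toy sketch. The code below is purely hypothetical and greatly simplified — IRCAM's actual gesture follower uses probabilistic (HMM-based) real-time alignment — but it shows the core notion: record a template as a sequence of feature vectors, then, as live frames arrive, track how far along the template the performer is.

```python
import math

class GestureFollower:
    """Toy gesture follower: stores a recorded template gesture and,
    during live input, reports progress through it as a value in [0, 1].
    Illustrative only; not IRCAM's actual (HMM-based) implementation."""

    def __init__(self, template):
        self.template = template   # list of feature vectors (tuples)
        self.position = 0          # current index into the template

    def follow(self, frame):
        """Advance monotonically to the nearest upcoming template point
        and return the fraction of the gesture completed so far."""
        # Search a small look-ahead window so alignment only moves forward.
        window = self.template[self.position:self.position + 5]
        best = min(range(len(window)),
                   key=lambda i: math.dist(window[i], frame))
        self.position += best
        return self.position / (len(self.template) - 1)

# Record a template: a straight diagonal gesture in 2-D space.
template = [(i / 10, i / 10) for i in range(11)]
follower = GestureFollower(template)

# Feed in live frames that roughly retrace the recorded gesture.
for frame in [(0.05, 0.0), (0.32, 0.28), (0.61, 0.65), (1.0, 1.0)]:
    progress = follower.follow(frame)
```

In a performance setting, the progress value could drive sound parameters in real time — for instance, scrubbing through a sound file as the dancer moves through the learned gesture.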
The workshop will also feature violinist/composer Mari Kimura, who will demonstrate different features of the “augmented violin.” Kimura will wear a custom-fitted “augmented violin glove,” designed for easier wearability and elasticity in performance. The presenters will explain how the “gesture follower” can be used in this scenario, and Kimura will play excerpts from pieces she composed for the “augmented violin.”
Frédéric Bevilacqua is the leader of the Real Time Musical Interactions team at IRCAM, the Institute for Research and Coordination in Acoustics/Music in Paris. He obtained a degree in physics in 1991 and a PhD in biomedical optics in 1998 from the École Polytechnique Fédérale de Lausanne, Switzerland. He also studied music at the Berklee College of Music in Boston (1992–1993) and has collaborated extensively with artists working at the intersection of media, installation, and performance. From 1999 to 2003 he conducted research at the Beckman Laser Institute at the University of California, Irvine, including software design to map motion capture data to sound generation. Since October 2003 he has been in charge of research on gesture analysis at IRCAM as a member of the Real Time Musical Interactions team, working also with the Performing Arts Technology Research team.