Robohub.org
 

Robotic musicianship


by Sabine Hauert
30 November 2011



Shimon is an interactive robotic marimba player that can improvise both music and choreography in real time to the melody of a human pianist.

Playing an instrument does not make you a musician. To become a musician you need to listen, analyze, improvise, and interact through the sound you produce and your body language.

With this in mind, Hoffman et al. explore robotic musicianship. Unlike robots that simply perform a sequence of notes, Shimon’s performances are composed of a sequence of gestures that may or may not produce sound. Using gestures as the building blocks of musical expression is particularly appropriate for robotic musicianship, and nicely fits with our embodied view of human-robot interaction.
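One way to picture this gesture-centric design is a data structure in which the unit of performance is a movement, and sound is an optional by-product of that movement. The following Python sketch is a hypothetical illustration of the idea, not Shimon's actual implementation (the class and function names are assumptions):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Gesture:
    """One movement in a performance; sound is optional."""
    duration_s: float            # how long the movement takes, in seconds
    note: Optional[int] = None   # MIDI pitch struck, or None for silent motion

def notes_played(performance):
    """Extract only the sounding notes from a sequence of gestures."""
    return [g.note for g in performance if g.note is not None]

# A performance mixes silent, expressive movements with note-producing strikes.
performance = [
    Gesture(0.5),            # silent preparatory sway
    Gesture(0.2, note=60),   # strike middle C
    Gesture(0.2, note=64),   # strike E
    Gesture(1.0),            # expressive silent flourish
]
```

Here the silent gestures carry expressive weight even though `notes_played` ignores them, mirroring the idea that a performance is more than the notes it produces.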

The robot is able to improvise by following basic conventions of standard joint Jazz improvisation, and can anticipate gestures to synchronize easily with duet partners. Building on this, the human and robot performed three types of interactions. In the first, the robot and human played two distinct musical phrases, the second phrase acting as a commentary on the first. The second interaction centered on the choreographic aspect of movement, with the notes appearing as a "side-effect" of the performance. The third was a rhythmic phrase-matching improvisation.
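Anticipation of this kind can be sketched with simple timing arithmetic: if the robot knows how long its arm takes to travel to a marimba bar, it can begin the gesture early so the strike lands on the predicted beat. The Python sketch below is purely illustrative; the function names and the constant-tempo beat predictor are assumptions, not Shimon's actual controller:

```python
def predict_next_beat(last_beat_s: float, tempo_bpm: float) -> float:
    """Predict the next beat time (seconds) assuming a steady tempo."""
    return last_beat_s + 60.0 / tempo_bpm

def motion_start_time(predicted_beat_s: float, travel_s: float) -> float:
    """Return when to begin moving so the mallet arrives on the beat.

    The arm must start `travel_s` seconds before the beat it aims to hit.
    """
    return predicted_beat_s - travel_s

# Example: last beat at t=10.0 s, tempo 120 BPM, arm travel time 0.3 s.
beat = predict_next_beat(10.0, 120.0)     # next beat at t=10.5 s
start = motion_start_time(beat, 0.3)      # begin moving at t=10.2 s
```

Starting the motion before the beat, rather than reacting to it, is what lets an embodied player stay in sync despite actuation delays.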

Using this improvisation system, the pair gave full-length performances of nearly seven minutes in front of live audiences and more than 70,000 online viewers.

After the live performances, additional experiments investigated the importance of physical embodiment and visual contact in robotic musicianship. Results show that visual contact helps the robot and musician synchronize when the tempo is slow or uncertain. In addition, audiences perceived Shimon as playing better, more like a human, more responsive, and even more inspired than a "computer musician". Shimon was also rated as better synchronized, more coherent, more communicative, and more coordinated; and the human partner as more inspired and more responsive.

In the future, Hoffman et al. hope to explore robotic musicianship further by giving Shimon a socially expressive robot head, vision, and new gestures.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory
