
Tag: music

November 9, 2012


The Gocen is a device that scans and plays handwritten sheet music in real time. It is being developed by a group at Tokyo Metropolitan University led by Assistant Professor Tetsuaki Baba.

“First, the system looks at the stave, then at the notes, then at the position of the notes, to determine their pitch. In addition, it directly reads words such as piano or guitar. The computer automatically recognizes them and changes the instrument. Also, for example, if the melody is in F minor rather than C major, the system reads the letters Fm and adds four flats accordingly.”
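To make the quoted pipeline concrete, here is a minimal Python sketch of the last two steps: mapping a note head’s position on the stave to a pitch, and applying the flats implied by a key label such as Fm. All names and values are hypothetical; Gocen’s actual implementation has not been published.

```python
# Illustrative sketch only; the real Gocen pipeline is not public.

# Staff positions (bottom line upward, lines and spaces) in the treble clef.
TREBLE_POSITIONS = ["E4", "F4", "G4", "A4", "B4", "C5", "D5", "E5", "F5"]

# Key labels mapped to the letters they flatten; "Fm" implies four flats.
KEY_SIGNATURES = {
    "C":  [],
    "Fm": ["B", "E", "A", "D"],  # B-flat, E-flat, A-flat, D-flat
}

def pitch_at(staff_position: int, key: str = "C") -> str:
    """Return the pitch for a note head at the given staff position."""
    note = TREBLE_POSITIONS[staff_position]
    letter, octave = note[0], note[1:]
    if letter in KEY_SIGNATURES.get(key, []):
        return letter + "b" + octave  # flat added by the key signature
    return note

print(pitch_at(4))        # B4 when the key defaults to C major
print(pitch_at(4, "Fm"))  # Bb4 once the four flats of F minor apply
```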

October 10, 2012

This robotic installation plays the entire song 40°42’48.46 N 73°58’18.38 by German rock/electronic artist Bonaparte. Filmed by the French team Jul & Mat, the installation is the brainchild of Peter Cocteau, who has already shown the world that LEGO’s Mindstorms platform can become a fantastic drum machine with his brilliant NXT-606. [From an article by Peter Kirn]

October 3, 2012

It’s always interesting to see robot creations from many years ago. You can learn a lot from old ideas, and from what people did when they didn’t have the technology (or access to it) that we have today. The robot shown here is a flute-playing robot. The video points out all of its degrees of freedom; its lungs are also worthy of note.

June 29, 2012

Tovbot’s Shimi made its first public appearance two days ago at Google I/O, where not just one but three Shimis performed in perfect coordination. Tovbot was formed earlier this year by a group of robotics researchers and entrepreneurs hailing from Georgia Tech, IDC in Israel, and the MIT Media Lab. Their goal is to foster a new paradigm of personal robots: robots that don’t just clean your floors or your pool, but also interact with you on a personal, almost human level. According to a news item on Georgia Tech’s website, Shimi, a musical companion developed by Georgia Tech’s Center for Music Technology, recommends songs, dances to the beat, and keeps the music pumping based on listener feedback. Automaton has more details.

November 30, 2011

Shimon is an interactive robotic marimba player that can improvise both music and choreography in real time to the melody of a human pianist.

Playing an instrument does not make you a musician. To become a musician, you need to listen, analyze, improvise, and interact through the sound you produce and your body language.

With this in mind, Hoffman et al. explore robotic musicianship. Unlike robots that simply perform a sequence of notes, Shimon’s performances are composed of a sequence of gestures that may or may not produce sound. Using gestures as the building blocks of musical expression is particularly appropriate for robotic musicianship, and nicely fits with our embodied view of human-robot interaction.
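As a rough illustration of what gestures as building blocks might look like in code, here is a small Python sketch of a gesture record whose strike flag decides whether the motion produces sound. The names are hypothetical and do not come from Hoffman et al.’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    """One unit of performance: a motion that may or may not strike a bar."""
    arm: int                # which of the robot's arms moves
    target_bar: int         # marimba bar the arm moves toward
    duration_beats: float   # how long the motion takes, in beats
    strike: bool = True     # False = purely choreographic, silent motion

# A short fragment: two sounding strokes followed by a silent sweep.
phrase = [
    Gesture(arm=0, target_bar=12, duration_beats=0.5),
    Gesture(arm=1, target_bar=19, duration_beats=0.5),
    Gesture(arm=0, target_bar=24, duration_beats=1.0, strike=False),
]
```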

The robot improvises by following basic conventions of joint jazz improvisation and can anticipate gestures in order to synchronize easily with duet partners. Building on this, the human and robot performed three types of interaction. In the first, the robot and human played two distinct musical phrases, the second phrase serving as a commentary on the first. The second interaction centered on the choreographic aspect of movement, with the notes appearing as a “side-effect” of the performance. The third was a rhythmic phrase-matching improvisation.
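One way to read the anticipation idea is that the robot starts each stroke early enough for the arm’s travel time to be absorbed before the beat rather than added after it as latency. Here is a minimal Python sketch under that assumption, with purely illustrative timing values:

```python
import time

TRAVEL_TIME = 0.15  # seconds an arm needs to reach a bar (illustrative)

def schedule_strokes(beat_times, now=None):
    """Start each stroke TRAVEL_TIME early so it lands on its beat."""
    now = time.monotonic() if now is None else now
    return [max(now, t - TRAVEL_TIME) for t in beat_times]

# Beats predicted from the pianist's tempo; each stroke begins 150 ms early.
beats = [1.0, 1.5, 2.0, 2.5]
print(schedule_strokes(beats, now=0.0))  # [0.85, 1.35, 1.85, 2.35]
```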

Using this improvisation system, the pair gave full-length performances of nearly 7 minutes in front of live public audiences and more than 70,000 online viewers.

After the live performances, additional experiments were conducted to investigate the importance of physical embodiment and visual contact in robotic musicianship. Results show that visual contact helps the robot and musician synchronize when the tempo is slow or uncertain. In addition, audiences perceived Shimon as playing better, more like a human, more responsive, and even more inspired than a “computer musician”. Shimon was also rated as better synchronized, more coherent, more communicative, and better coordinated, and the human partner as more inspired and more responsive.

In the future, Hoffman et al. hope to explore robot musicianship further by giving Shimon a socially expressive robot head, vision, and new gestures.