Robohub.org
 

Translating music into light and motion with robots


25 February 2026




Image taken from the YouTube video created by the authors.

A system developed by researchers at the University of Waterloo lets people collaborate with groups of robots to create works of art inspired by music.

The new technology features multiple wheeled robots about the size of soccer balls that trail coloured light as they move within a fixed area on the floor in response to key features of music including tempo and chord progression.

A camera records the co-ordinated light trails as they snake within that area, which serves as the canvas for the creation of a “painting,” or visual representation of the emotional content of a particular piece of music.

“Basically, we programmed a swarm of robots to paint based on musical input,” said Dr Gennaro Notomista, a professor of electrical and computer engineering at Waterloo.

“The result is a cohesive system that not only processes musical input, but also co-ordinates multiple painting robots to create adaptive, expressive art that reflects the emotional essence of the music being played.”

As they “listen” to music, the robots represent its emotion through the colours, intensity and width of their light trails, as well as their position on the canvas and the speed at which they move across it.
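The article does not detail the exact mapping the researchers used, but the idea of translating musical features into trail parameters can be sketched roughly as follows. Everything here — the feature set, the warm/cool hue choice, and the clamping ranges — is an illustrative assumption, not the paper's method:

```python
import colorsys

def trail_params(tempo_bpm, is_major, loudness):
    """Map simple musical features to hypothetical trail parameters.

    tempo_bpm: beats per minute (e.g. 60-180)
    is_major:  True for a major chord, False for minor
    loudness:  normalised to the range 0..1
    """
    # Faster tempo -> faster robot motion, clamped to a safe range.
    speed = min(1.0, max(0.1, (tempo_bpm - 40) / 160))
    # Major chords -> warm hues, minor -> cool hues (an arbitrary choice).
    hue = 0.08 if is_major else 0.6
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 0.4 + 0.6 * loudness)
    # Louder passages -> wider, brighter trails.
    width = 1.0 + 4.0 * loudness
    return {"speed": speed, "rgb": (r, g, b), "width": width}
```

A feature extractor running on the audio stream would feed such a function once per analysis window, and each robot would adopt the resulting speed, colour, and trail width.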

People can simultaneously influence a painting in progress using controls to change the width of light trails and their location on the virtual canvas.

“We included the human control input to allow people and robots to work together,” said Notomista, whose interests include the intersection of art and technology. “The human painter should complement and be complemented by what the robots do.”
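One simple way to realise this kind of shared control is to blend the robot's music-driven velocity with the human's command. The convex combination below is a minimal sketch of that idea under assumed (vx, vy) velocity inputs; the paper's actual control-sharing scheme may differ:

```python
def blended_velocity(v_auto, v_human, alpha=0.5):
    """Combine a robot's autonomous, music-driven velocity with a
    human operator's command.

    v_auto, v_human: (vx, vy) tuples in canvas coordinates.
    alpha: weight on the human input; 0 is fully autonomous,
    1 is fully human-driven.
    """
    ax, ay = v_auto
    hx, hy = v_human
    return ((1 - alpha) * ax + alpha * hx,
            (1 - alpha) * ay + alpha * hy)
```

With alpha around 0.5 the human nudges trails without overriding the music-driven behaviour, which matches the stated goal of the painter complementing, rather than replacing, the robots.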

The first challenge for researchers was developing an algorithm to control multiple robots within a given area. They tested the system with up to 12 robots, but it can be scaled to handle any number.
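Keeping many robots inside a fixed canvas without collisions is the core of that challenge. The toy integration step below illustrates the two basic ingredients — boundary clamping and pairwise separation — for any number of robots; the coordinates, separation distance, and repulsion gain are invented for illustration, and the actual coordination algorithm is certainly more sophisticated:

```python
import math

def bounded_step(positions, velocities, dt=0.05,
                 xmin=0.0, xmax=2.0, ymin=0.0, ymax=2.0, min_sep=0.25):
    """One integration step for a swarm confined to a rectangular canvas.

    positions, velocities: lists of (x, y) tuples, one per robot.
    Robots follow their commanded velocity, push apart when closer
    than min_sep, and are clamped to the canvas boundary.
    """
    new_pos = []
    for i, ((x, y), (vx, vy)) in enumerate(zip(positions, velocities)):
        # Pairwise repulsion so nearby robots do not collide.
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            dx, dy = x - ox, y - oy
            d = math.hypot(dx, dy)
            if 0 < d < min_sep:
                vx += (dx / d) * (min_sep - d) / dt * 0.5
                vy += (dy / d) * (min_sep - d) / dt * 0.5
        x, y = x + vx * dt, y + vy * dt
        # Clamp to the canvas so trails stay on the "painting" area.
        x = min(max(x, xmin), xmax)
        y = min(max(y, ymin), ymax)
        new_pos.append((x, y))
    return new_pos
```

Because the loop is written over arbitrary lists, the same step works for 2 robots or 12, consistent with the claim that the approach scales with swarm size.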

Step two involved creating technology to extract and analyze musical features that express emotion so they can then be translated into light trails that appropriately represent them.

Lessons learned during the project have potential applications in other areas requiring the control and co-ordination of multiple robots working in unison, such as environmental monitoring, precision agriculture, search and rescue missions, and planetary exploration.

The research also reflects the University of Waterloo’s Global Futures initiative, which advances interdisciplinary work that considers how emerging technologies can shape society, culture and the human experience.

Next, Notomista plans to enlist professional painters and musicians for user studies exploring the new tool’s possibilities, and to stage public exhibitions.

A paper on the system, “Music-driven Robot Swarm Painting,” by Notomista and Jingde Cheng, a former Waterloo graduate student, was presented at the 2025 IEEE International Conference on Advanced Robotics and its Social Impacts.




University of Waterloo








©2026.02 - Association for the Understanding of Artificial Intelligence