Robohub.org
 

Katsushi Ikeuchi: e-Intangible Heritage | CMU RI Seminar


by John Payne
26 February 2017




Link to video on YouTube

Abstract: "Tangible heritage, such as temples and statues, is disappearing day by day due to human and natural disasters. Intangible heritage, such as folk dances, local songs, and dialects, faces the same fate due to a lack of inheritors and the mixing of cultures. We have been developing methods to preserve such tangible and intangible heritage in digital form. This project, which we refer to as e-Heritage, aims not only to record heritage, but also to analyze the recorded data for better understanding and to display the data in new forms for promotion and education.

This talk consists of three parts. The first part briefly covers e-Tangible heritage, in particular our projects in Cambodia and Kyushu. Here I emphasize not only the challenges in data acquisition but also the importance of creating a new branch of science, Cyber-archaeology, which allows us to make new archaeological findings based on the obtained digital data.

The second part covers how to display a Japanese folk dance through the performance of a humanoid robot. Here we follow the learning-from-observation paradigm, in which a robot learns how to perform a dance by observing a human dance performance. Due to the physical differences between a human and a robot, the robot cannot exactly mimic the human's actions. Instead, the robot first extracts the important actions of the dance, referred to as key poses, and then describes them symbolically using Labanotation, which the dance community has long used for recording dances. Finally, this Labanotation is mapped to each robot's particular hardware to reconstruct the original dance performance.

The third part addresses the question of what the merit is of preserving folk dances through robot performance; the answer is that the symbolic representations used for robot performance provide new understandings of those dances. To demonstrate this point, we focus on the folk dances of the indigenous peoples of Taiwan, who comprise 14 different tribes. We have converted those folk dances into Labanotation for robot performance. Further, by analyzing the resulting Labanotation scores, we can clarify the social relations among these 14 tribes."
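The learning-from-observation pipeline the abstract describes (observed motion, then key poses, then hardware-independent Labanotation-like symbols, then robot-specific joint targets) can be illustrated with a minimal sketch. The Python below is purely illustrative and not the speaker's implementation: every name and threshold in it (Frame, extract_key_poses, to_symbol, map_to_robot, velocity_threshold, the elbow-angle bins) is a hypothetical stand-in for the ideas in the abstract.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class Frame:
    time: float                      # seconds into the recording
    joint_angles: Dict[str, float]   # e.g. {"left_elbow": 1.2}, in radians

def extract_key_poses(frames, velocity_threshold=0.05):
    """Keep frames where joint motion nearly stops -- a simple key-pose cue."""
    key_poses = []
    for prev, curr in zip(frames, frames[1:]):
        dt = (curr.time - prev.time) or 1e-6
        speed = max(abs(curr.joint_angles[j] - prev.joint_angles[j]) / dt
                    for j in curr.joint_angles)
        if speed < velocity_threshold:
            key_poses.append(curr)
    return key_poses

def to_symbol(pose):
    """Coarsely bin one joint into a Labanotation-like level (high/middle/low)."""
    angle = pose.joint_angles.get("left_elbow", 0.0)
    level = "high" if angle > 1.5 else "middle" if angle > 0.5 else "low"
    return {"time": pose.time, "left_arm": level}

def map_to_robot(symbol, limits):
    """Turn a symbolic level back into a joint target within a robot's limits."""
    lo, hi = limits["left_elbow"]
    target = {"low": lo, "middle": (lo + hi) / 2.0, "high": hi}[symbol["left_arm"]]
    return {"time": symbol["time"], "left_elbow": target}

# Example: two robots with different elbow ranges reproduce the same symbolic score.
frames = [Frame(0.0, {"left_elbow": 0.20}), Frame(0.5, {"left_elbow": 0.21}),
          Frame(1.0, {"left_elbow": 1.80}), Frame(1.5, {"left_elbow": 1.79})]
score = [to_symbol(p) for p in extract_key_poses(frames)]
print([map_to_robot(s, {"left_elbow": (0.0, 2.0)}) for s in score])
print([map_to_robot(s, {"left_elbow": (0.1, 1.4)}) for s in score])
```

The point of the sketch is the intermediate symbolic layer: because the score is hardware-independent, the same dance maps onto robots with different joint ranges, which is also what makes the notation usable for the cross-tribe analysis mentioned in the third part of the talk.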




John Payne








 
