Robohub.org
 

C. Karen Liu: Modeling Human Movements for Robotics | CMU RI Seminar


by John Payne
28 October 2017




Link to video on YouTube

Abstract: “Creating realistic virtual humans has traditionally been considered a research problem in Computer Animation primarily for entertainment applications. With the recent breakthrough in collaborative robots and deep reinforcement learning, accurately modeling human movements and behaviors has become a common challenge faced by researchers in robotics, artificial intelligence, as well as Computer Animation. In this talk, I will focus on two different yet highly relevant problems: how to teach robots to move like humans and how to teach robots to interact with humans.
While Computer Animation research has shown that it is possible to teach a virtual human to mimic human athletes’ movements, transferring such complex controllers to robot hardware in the real world is perhaps even more challenging than learning the controllers themselves. In this talk, I will focus on two strategies to transfer highly dynamic skills from character animation to robots: teaching robots basic self-preservation motor skills and developing data-driven algorithms on transfer learning between simulation and the real world.
The second part of the talk will focus on robotic assistance with dressing, which is one of the activities of daily living (ADLs) most commonly requested by older adults. To safely train a robot to physically interact with humans, one can design a generative model of human motion based on prior knowledge or recorded motion data. Although this approach has been successful in Computer Animation, such as generating locomotion, designing procedures for a loosely defined task, such as “being dressed”, is likely to be biased toward the specific data or assumptions. I will describe a new approach to modeling human motion without being biased toward specific situations presented in the dataset.”
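The abstract's second paragraph mentions data-driven algorithms for transferring controllers from simulation to real hardware. Purely as a generic, hedged illustration of that idea, and not the method described in the talk, the sketch below shows dynamics randomization: simulator parameters such as mass, friction, and control latency are drawn from assumed ranges so a single controller is evaluated (and, in practice, trained) across a distribution of dynamics rather than a single fixed simulator. All names and ranges are illustrative assumptions.

```python
# Hypothetical sketch of dynamics randomization for sim-to-real transfer.
# Not the speaker's method; parameter names and ranges are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sample_dynamics():
    """Sample simulator parameters from broad ranges so a controller
    tuned across them is less likely to overfit one set of dynamics."""
    return {
        "mass": rng.uniform(0.8, 1.2),       # kg, assumed range
        "friction": rng.uniform(0.05, 0.3),  # viscous coefficient, assumed
        "latency": int(rng.integers(0, 3)),  # control delay in steps, assumed
    }

def rollout(policy, dyn, horizon=200, dt=0.02):
    """Simulate a 1-D point mass under the sampled dynamics and return
    the cost of driving it to the origin (lower is better)."""
    pos, vel, cost = 1.0, 0.0, 0.0
    queued = [0.0] * (dyn["latency"] + 1)    # models actuation delay
    for _ in range(horizon):
        queued.append(policy(pos, vel))
        force = queued.pop(0)                # apply the delayed action
        acc = (force - dyn["friction"] * vel) / dyn["mass"]
        vel += acc * dt
        pos += vel * dt
        cost += pos**2 + 0.01 * force**2
    return cost

# Toy linear feedback controller; in practice the gains (or a neural
# network policy) would be learned, e.g. by reinforcement learning.
policy = lambda pos, vel: -8.0 * pos - 2.0 * vel

# Evaluate the same controller over many randomized dynamics instances.
costs = [rollout(policy, sample_dynamics()) for _ in range(100)]
print(f"mean cost {np.mean(costs):.2f} +/- {np.std(costs):.2f}")
```

A controller that keeps the cost low across all sampled instances is more likely to survive the mismatch between the simulator and the physical robot than one tuned to a single parameter setting.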
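The final paragraph contrasts the new approach with generative models of human motion built from prior knowledge or recorded motion data. As a hypothetical illustration of the latter, purely data-driven kind of model, again not the approach described in the talk, the sketch below fits a first-order autoregressive model to stand-in "recorded" joint-angle trajectories and samples new motion from it; the data, dimensions, and noise model are all made up for the example.

```python
# Hypothetical sketch of a data-driven generative motion model: a
# first-order autoregressive model fit to recorded joint angles, then
# sampled to synthesize new motion. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "recorded" data: T frames of D joint angles (real data would
# come from motion capture).
T, D = 500, 4
t = np.linspace(0, 10 * np.pi, T)
data = np.stack([np.sin(t + k) for k in range(D)], axis=1)
data += 0.01 * rng.standard_normal(data.shape)

# Fit x_{t+1} ~ A x_t + b by least squares on consecutive frame pairs.
X, Y = data[:-1], data[1:]
X_aug = np.hstack([X, np.ones((T - 1, 1))])
W, *_ = np.linalg.lstsq(X_aug, Y, rcond=None)
A, b = W[:D].T, W[D]
noise_std = (Y - X_aug @ W).std(axis=0)      # residual scale per joint

def sample_motion(x0, steps=200):
    """Roll the fitted model forward with Gaussian noise to generate
    a new joint-angle trajectory."""
    frames = [x0]
    for _ in range(steps):
        frames.append(A @ frames[-1] + b + noise_std * rng.standard_normal(D))
    return np.array(frames)

synth = sample_motion(data[0])
print("synthesized motion shape:", synth.shape)   # (201, 4)
```

The abstract's point is that such a model inherits the biases of whatever data or assumptions it was built from, which is what the proposed approach to modeling human motion tries to avoid for loosely defined tasks like being dressed.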










 
