
Henny Admoni: Understanding Human Behavior for Robotic Assistance and Collaboration | CMU RI Seminar


by John Payne
01 May 2019




Link to video on YouTube

“Human-robot collaboration has the potential to transform the way people work and live. Researchers are currently developing robots that assist people in public spaces, on the job, and in their homes. To be effective assistants, these robots must be able to recognize aspects of their human partners such as what their goals are, what their next action will be, and when they need help—in short, their task-relevant mental states. A large part of communication about mental states occurs nonverbally, through eye gaze, gestures, and other behaviors that provide implicit information. Therefore, to be effective collaborators, robots must understand nonverbal human communication as well as generate sufficiently expressive nonverbal behaviors that are understandable by their human partners. Developing effective human-robot interactions requires a multidisciplinary approach that involves fundamental robotics algorithms, insights from human psychology, and techniques from artificial intelligence, machine learning, and computer vision. In this talk, I will describe my work on robots that collaborate with and assist humans on complex tasks, such as eating a meal. I will show how robots can guide human action using nonverbal behaviors, and how natural, intuitive human behaviors can reveal human mental states that robots must respond to. Throughout the talk, I will describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.”












