Stefanie Tellex: Learning Models of Language, Action and Perception for HRC | CMU RI Seminar

by John Payne
10 October 2016




Link to video of seminar on YouTube

Abstract: “Robots can act as a force multiplier for people, whether a robot assisting an astronaut with a repair on the International Space Station, a UAV taking flight over our cities, or an autonomous vehicle driving through our streets. To achieve complex tasks, it is essential for robots to move beyond merely interacting with people and toward collaboration, so that one person can easily and flexibly work with many autonomous robots. The aim of my research program is to create autonomous robots that collaborate with people to meet their needs by learning decision-theoretic models for communication, action, and perception. Communication for collaboration requires models of language that map between sentences and aspects of the external world. My work enables a robot to learn compositional models for word meanings that allow a robot to explicitly reason and communicate about its own uncertainty, increasing the speed and accuracy of human-robot communication. Action for collaboration requires models that match how people think and talk, because people communicate about all aspects of a robot’s behavior, from low-level motion preferences (e.g., “Please fly up a few feet”) to high-level requests (e.g., “Please inspect the building”). I am creating new methods for learning how to plan in very large, uncertain state-action spaces by using hierarchical abstraction. Perception for collaboration requires the robot to detect, localize, and manipulate the objects in its environment that are most important to its human collaborator. I am creating new methods for autonomously acquiring perceptual models in situ so the robot can perceive the objects most relevant to the human’s goals. My unified decision-theoretic framework supports data-driven training and robust, feedback-driven human-robot collaboration.”
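To make the first thread of the abstract concrete, here is a minimal sketch of decision-theoretic language grounding with explicit reasoning about uncertainty. This is not the system presented in the seminar: the lexicon, candidate objects, weights, and entropy threshold below are all invented for illustration. The idea is to map the words of a command to a distribution over candidate groundings, then have the robot act when that distribution is peaked and ask a clarifying question when it is not.

```python
import math

# Toy lexicon standing in for learned compositional word meanings:
# each word adds weight to the candidate groundings it could refer to.
# All names and weights here are hypothetical, for illustration only.
LEXICON = {
    "red":   {"red_block": 2.0, "red_cup": 2.0},
    "block": {"red_block": 2.0, "blue_block": 2.0},
    "cup":   {"red_cup": 2.0},
}
CANDIDATES = ["red_block", "blue_block", "red_cup"]

def grounding_distribution(utterance):
    """Softmax over summed word-to-grounding weights."""
    scores = {c: 0.0 for c in CANDIDATES}
    for word in utterance.lower().split():
        for cand, weight in LEXICON.get(word, {}).items():
            scores[cand] += weight
    z = sum(math.exp(s) for s in scores.values())
    return {c: math.exp(s) / z for c, s in scores.items()}

def entropy(dist):
    """Shannon entropy in nats; high entropy means an uncertain grounding."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

def respond(utterance, threshold=0.8):
    """Act when the grounding is confident; otherwise ask for clarification."""
    dist = grounding_distribution(utterance)
    if entropy(dist) > threshold:
        top_two = sorted(dist, key=dist.get, reverse=True)[:2]
        return f"Did you mean the {top_two[0]} or the {top_two[1]}?"
    return f"Fetching the {max(dist, key=dist.get)}."

print(respond("pick up the red block"))  # peaked distribution -> acts
print(respond("pick up the red one"))    # ambiguous -> asks a question
```

The hand-written scoring is only a stand-in for the learned compositional word-meaning models the abstract describes; the point of the sketch is the decision rule, which lets the robot communicate about its own uncertainty instead of guessing.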








