
Stefanie Tellex: Learning Models of Language, Action and Perception for HRC | CMU RI Seminar

by John Payne
10 October 2016




Link to video of seminar on YouTube

Abstract: “Robots can act as a force multiplier for people, whether a robot assisting an astronaut with a repair on the International Space Station, a UAV taking flight over our cities, or an autonomous vehicle driving through our streets. To achieve complex tasks, it is essential for robots to move beyond merely interacting with people and toward collaboration, so that one person can easily and flexibly work with many autonomous robots. The aim of my research program is to create autonomous robots that collaborate with people to meet their needs by learning decision-theoretic models for communication, action, and perception. Communication for collaboration requires models of language that map between sentences and aspects of the external world. My work enables a robot to learn compositional models for word meanings that allow a robot to explicitly reason and communicate about its own uncertainty, increasing the speed and accuracy of human-robot communication. Action for collaboration requires models that match how people think and talk, because people communicate about all aspects of a robot’s behavior, from low-level motion preferences (e.g., “Please fly up a few feet”) to high-level requests (e.g., “Please inspect the building”). I am creating new methods for learning how to plan in very large, uncertain state-action spaces by using hierarchical abstraction. Perception for collaboration requires the robot to detect, localize, and manipulate the objects in its environment that are most important to its human collaborator. I am creating new methods for autonomously acquiring perceptual models in situ so the robot can perceive the objects most relevant to the human’s goals. My unified decision-theoretic framework supports data-driven training and robust, feedback-driven human-robot collaboration.”
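The abstract’s idea of a robot that “explicitly reasons and communicates about its own uncertainty” can be illustrated with a toy decision-theoretic sketch. The Python below is not the seminar’s actual system: the object list, bag-of-words word likelihoods, and entropy threshold are all hypothetical stand-ins. It simply shows a robot maintaining a Bayesian belief over which object a command refers to, acting when that belief is confident, and asking a clarifying question when it is not.

```python
import math

# A toy sketch of decision-theoretic language grounding. Everything here
# (object list, bag-of-words likelihoods, entropy threshold) is a
# hypothetical stand-in, not the seminar's actual model.

OBJECTS = ["red mug", "blue mug", "red bowl"]
THRESHOLD = 0.7  # arbitrary entropy cutoff between acting and asking

def word_likelihood(word, obj):
    """P(word | object): crude, hand-set likelihoods for illustration."""
    return 0.8 if word in obj.split() else 0.1

def update_belief(belief, utterance):
    """Bayesian update of P(object | words heard so far)."""
    posterior = {}
    for obj, prior in belief.items():
        likelihood = 1.0
        for word in utterance.lower().split():
            likelihood *= word_likelihood(word, obj)
        posterior[obj] = prior * likelihood
    total = sum(posterior.values())
    return {obj: p / total for obj, p in posterior.items()}

def entropy(belief):
    """Shannon entropy of the belief, in nats."""
    return -sum(p * math.log(p) for p in belief.values() if p > 0)

def act_or_ask(belief):
    """Act on the most likely grounding if confident, else ask to clarify."""
    if entropy(belief) < THRESHOLD:
        print("Fetching the", max(belief, key=belief.get))
    else:
        print("Asking: which one do you mean?")

# Start from a uniform belief over candidate objects.
belief = {obj: 1.0 / len(OBJECTS) for obj in OBJECTS}

belief = update_belief(belief, "hand me the red one")
act_or_ask(belief)  # ambiguous between red mug and red bowl -> asks

belief = update_belief(belief, "the mug")
act_or_ask(belief)  # now confident -> fetches the red mug
```

The entropy test is what makes the loop decision-theoretic in spirit: the robot weighs acting immediately against the cost of one more clarifying question, which is the feedback-driven behavior the abstract describes.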









