“Human-robot collaboration has the potential to transform the way people work and live. Researchers are currently developing robots that assist people in public spaces, on the job, and in their homes. To be effective assistants, these robots must be able to recognize aspects of their human partners such as what their goals are, what their next action will be, and when they need help; in short, their task-relevant mental states. A large part of communication about mental states occurs nonverbally, through eye gaze, gestures, and other behaviors that convey implicit information. Therefore, to be effective collaborators, robots must both understand nonverbal human communication and generate expressive nonverbal behaviors that their human partners can readily understand. Developing effective human-robot interactions requires a multidisciplinary approach that combines fundamental robotics algorithms with insights from human psychology and techniques from artificial intelligence, machine learning, and computer vision. In this talk, I will describe my work on robots that collaborate with and assist humans on complex tasks, such as eating a meal. I will show how robots can guide human action using nonverbal behaviors, and how natural, intuitive human behaviors can reveal mental states that robots must recognize and respond to. Throughout the talk, I will describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.”