
Tag: human-robot interaction

Robohub is an online platform that brings together leading communicators in robotics research, start-ups, business, and education from around the world.
by   -   January 5, 2014

Tijn van der Zant helped found RoboCup@Home in 2006, and since then the league has spread to a number of new locations, from Brazil to Thailand. As a professor in the Cognitive Robotics Lab at the University of Groningen, and the founder of both a robotics startup and a machine learning startup, he's pretty "involved" when it comes to robots, which made me eager to pick his brain about the future of home robotics.

by   -   January 3, 2014

Mission to Mars

I hope everyone had a wonderful holiday break!  As always, remember you can see more at Nate’s Website.

by   -   December 23, 2013


Aisoy, a Spanish robotics startup, is motivated by the goal of building intelligent, personal, "social" robots that make our lives easier and more fun. Their robot, the Aisoy1, is the first step toward that vision. Robohub recently caught up with the team to talk about social robotics, the Aisoy1, and startup culture in Spain.

by   -   November 29, 2013

In this episode, AJung Moon talks to Julie Carpenter, a recent graduate of the University of Washington who interviewed 23 U.S. Military Explosive Ordnance Disposal personnel to find out how they interact with everyday field robots. Julie is currently writing a book on the topic that is scheduled to be published next year.

by   -   November 8, 2013

Brian David Johnson: Guest talk in the ShanghAI Lectures, 2009-10-29

How can science fiction help prototype emerging science theory and experimentation? Expanding on the framework of consumer experience architecture, this talk explores how a fictional story, based specifically on current works of scientific research, can expand and drive further experimentation on a dramatically new approach to artificial intelligence and domestic robots.

by   -   November 8, 2013


As always, remember you can see more at Nate’s Website.

by   -   August 27, 2013

Well Played, Internet

As always, remember you can see more at Nate’s Website.

by   -   August 15, 2013

As a robot animator, I can attest to the fact that robots don't "need" heads to be treated as social entities. Research has shown that people will befriend a stick as long as it moves properly [1].

We have a long-standing habit of anthropomorphizing things that aren’t human by attributing to them human-level personality traits or internal motivations based on cognitive-affective architectures that just aren’t there. Animators have relied on the audience’s willingness to suspend disbelief and, in essence, co-animate things into existence: from a sack of flour to a magic broom. It’s possible to incorporate the user’s willingness to bring a robot to life by appropriately setting expectations and being acutely aware of how the context of interaction affects possible outcomes.

In humans, the head circumscribes the face, whereas on a robot a face can be placed anywhere. High degree-of-freedom (DOF) robot heads, though wonderfully complex, can be challenging to orchestrate with sufficient timing precision. If your robot design facilitates expression through careful control of the quality of its motion, a head isn't necessary to communicate essential non-verbal cues. As long as you provide some means of revealing the robot's internal state, a head simply isn't needed: a robot's intentions can be conveyed through expressive motion and sound regardless of its form or body configuration.
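The idea of externalizing internal state through motion quality alone can be sketched as a simple parameter mapping. This is purely illustrative and not from the article: the state names, parameter values, and function names are hypothetical assumptions.

```python
# Illustrative sketch (hypothetical, not the author's system): conveying a
# robot's internal state through motion and sound parameters -- no head
# or face required.

MOTION_PROFILES = {
    # state:    (speed_scale, jerkiness, sound_pitch_hz)
    "curious":  (0.6, 0.1, 440),
    "alarmed":  (1.0, 0.8, 880),
    "content":  (0.3, 0.0, 220),
}

def expressive_params(state: str) -> dict:
    """Map an internal state to motion/sound parameters that reveal it."""
    # Unknown states fall back to a calm, low-energy default.
    speed, jerk, pitch = MOTION_PROFILES.get(state, (0.3, 0.0, 220))
    return {"speed_scale": speed, "jerkiness": jerk, "sound_pitch_hz": pitch}
```

A motion controller could then scale trajectory velocity and inject small perturbations from these parameters, so an observer reads "alarm" from fast, jerky movement rather than from a facial expression.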

[1] Harris, J., & Sharlin, E. (2011, July). Exploring the affect of abstract motion in social human-robot interaction. In RO-MAN, 2011 IEEE (pp. 441-448). IEEE.


by   -   August 15, 2013

The obvious answer to this question is "No: there are lots of robots without heads." It's not even clear that social robots necessarily require a head, as even mundane robots like the Roomba are anthropomorphized (given human-like qualities) without one. A follow-up question might be, "How are heads useful?" For humans, the reasons are apparent: food intake, a vessel for our brain, a locus for sensors (eyes and ears), and high-bandwidth communication via expression. What about for robots …?

  • Food intake: Probably not.
  • Computational storage: Again, probably not.
  • Location for sensors: Indeed, the apex of a robot is a natural, obstacle-free vantage point for non-contact sensors. But a “head” form factor is not a strict requirement.
  • Emotion and expression: Ah, the real meat of this question… “Do robots need to express emotion?”

This is a funny question to ask someone who once (in)famously advocated for either (A) extremely utilitarian designs: “I want my eventual home robot to be as unobtrusive as a trashcan or dishwasher”, or (B) designs unconstrained by the human form factor: “Why not give robots lots of arms (or only one)? Why impose human-like joint limits, arm configurations, and sensing? We can design our own mechanical lifeforms!”

My views have softened a bit over time. Early (expensive) general-purpose home robots will almost certainly have humanoid characteristics and heads with the ability to express emotions (i.e. be social) — if nothing else, to appeal to the paying masses. And these robots will be useful: doing my laundry, cleaning my dishes, and cooking my meals. In the early models, I will still find their shallow attempts at emotion mundane, and I will probably detest the sales pitches about "AI" and "robots that feel." But as the emotional expressions become more natural and nuanced, and the robots become more capable, I will probably warm up to the idea myself.

TL;DR: No, many robots do not need heads. Even social robots may not need heads, but (whether I want them to or not) they probably will, because paying consumers will expect it.

by   -   July 26, 2013

High Reward

As always, remember you can see more at Nate’s Website.

by   -   June 4, 2013

Researchers from the University of Minnesota have developed a non-invasive brain/computer interface that allows humans to remotely control a robot (in this case, a quadrotor) using only their thoughts. The research team, led by Bin He, Professor of Biomedical Engineering, hopes this technology can one day be used to help people with speech and mobility problems.

According to research team member Karl LaFleur, “If you imagine making a fist with your right hand, it turns the robot to the right. And if you imagine making a fist with both hands, it moves the robot up.”

The beauty of this research is that no implants are required to interface with the system.
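The mapping LaFleur describes, from imagined movements to flight commands, amounts to a dispatch from decoded motor-imagery classes to control actions. The following is a minimal sketch of that idea; the class labels, command names, and function are illustrative assumptions, not the Minnesota team's actual interface.

```python
# Hypothetical sketch of a motor-imagery-to-command dispatch, based on the
# examples quoted in the article. Labels and commands are assumptions.

COMMANDS = {
    "imagine_right_fist": "turn_right",   # "making a fist with your right hand"
    "imagine_left_fist":  "turn_left",
    "imagine_both_fists": "ascend",       # "a fist with both hands ... moves the robot up"
    "rest":               "hover",
}

def decode_to_command(imagery_class: str) -> str:
    """Translate a decoded EEG motor-imagery class into a flight command."""
    # Unrecognized classes default to a safe hover.
    return COMMANDS.get(imagery_class, "hover")
```

In a real pipeline, the `imagery_class` string would come from an EEG classifier running on sensorimotor-rhythm features; the dispatch step itself is deliberately this simple, which is part of what makes the interface usable.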
