This article returns to the thread of the last few months by looking at how robots can measure our emotions and body language.
My aunt, a Tennessee tobacco grower, used to remind me that God gave me two ears and one mouth for a reason. What she meant is that a good conversationalist is not so much someone who can talk as someone who can listen.
Robots can take a cue from my aunt.
This article discusses how body language is a part of natural language, personality, and NLP design. The article covers various methods for approaching this problem and makes recommendations for the real-time generation of animation to accompany natural language for avatars and robots.
It’s hard to communicate with words alone. Some researchers claim that almost half of our communication relies on things that aren’t words: body language, tone of voice, and other signals that just aren’t conveyed by text. These include prosody (the tone, pitch, and speed of words), facial expression, hand gesture, stance, and posture. This probably explains why about 40% of emails are misunderstood.
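To make the idea concrete, here is a minimal sketch of what pairing an utterance with nonverbal cues might look like. Every name in it (the function, the mood labels, the cue vocabulary) is illustrative, invented for this example rather than taken from any real animation or speech API:

```python
# Hypothetical sketch: attaching prosody and gesture hints to dialogue.
# The cue vocabulary below is made up for illustration.

def annotate_utterance(text, mood):
    """Return the text plus simple nonverbal cues for the given mood."""
    cues = {"prosody": {"pitch": "neutral", "rate": "normal"},
            "gesture": None,
            "face": None}
    if mood == "excited":
        cues["prosody"] = {"pitch": "high", "rate": "fast"}
        cues["gesture"] = "open_arms"
        cues["face"] = "smile"
    elif mood == "apologetic":
        cues["prosody"] = {"pitch": "low", "rate": "slow"}
        cues["gesture"] = "head_tilt"
        cues["face"] = "soft_frown"
    return {"text": text, "cues": cues}

line = annotate_utterance("I'm so glad you're home!", "excited")
```

The point of the sketch is only that the words are one channel among several: the same sentence rendered with a different mood would carry different prosody, gesture, and facial cues, which is exactly the information a text-only medium like email throws away.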
Kirobo, a 13” humanoid robot, was launched today and is on its way to the International Space Station.
A mashup of the Japanese word for hope, “Kibo”, and “robot”, Kirobo is designed to be a companion and communicator for Japanese astronaut Koichi Wakata who is expected to arrive at the space station later this year.
This article considers privacy in robotic systems (such as personal service robotics) as being of greater importance than privacy in telecommunications (such as the Internet). We will return to our regularly scheduled program – about gestures and body language – next month.
Let’s say a white box shows up on your doorstep, and you open it to find a little humanoid robot made by Google. A GoogleBot! The brightly colored pamphlet says that the little device will vacuum your floor, all for the same cost as your Gmail account: free.
This article discusses personality design and how proper natural language interface design includes body language. The article is about the design of hearts and minds for robots. It argues that psychology must be graphically represented, that body language is a means to do that, and points out why this is kind of funny.
Comrades, we live in a bleak and humourless world. Here we are thirteen years into the twenty-first century, and we all carry around Star Trek-style tricorders, we have access to almost all human opinions via this awesome global computer network, we have thousands and thousands of channels we can flip through on television, we have something like 48,000 people signed up to go colonize Mars, and we even have robots roaming around up there, taking samples of that planet. But we still don’t have robots that can tell a good joke.
This article outlines the problems of today’s phone and online help systems and offers solutions for the conversational systems of tomorrow. The article is about the design of hearts and minds for robots, considers the virtual voice as a legitimate robot, and takes a fast pass at the psychology of robot-human interaction.
This article looks at how the robotics industry of today is following in the footsteps of the personal computer industry of yesterday, and why Natural Language Processing, like the Graphical User Interface, plays a key role in this industry-wide evolution.