Robohub.org
 

Emotive communication with things – EmoShape Founder Patrick Levy Rosenthal


by Daniel Faggella
14 January 2014



Photo credit: Image Agency

Why do people who use Facebook spend so much of their online time there? Why do people want to share, to comment?
Patrick Levy Rosenthal asked himself these questions and was drawn back, again and again, to one answer: emotion. A researcher and a Parisian, Patrick now lives in London, working on his startup, EmoShape.

Over the course of our conversation, he painted a bit of his vision for an emotionally attuned physical world, where objects and devices around us can adjust their function to have a desirable effect on our emotional state, or simply to “sync” with us at an emotional level. I spoke a bit about this same topic with RockPaperRobot’s Jessica Banks, but Patrick’s business concept – and life’s work – is creating the first user interface for emotionally calibrating devices. It starts with a small cube-shaped object called “EmoSPARK,” which Patrick plans to fit with the technology to connect to a number of other household devices – beginning (most likely) with an MP3 music player.

The first question is: How will this machine detect its owner’s emotional state?

Patrick explains that this should be one of the simpler challenges of the device. “It tracks over 180 points on your face, but also the relation between those points, so that if you are smiling it will know that your lips will be stretched, and eyes made more narrow.” The machine will also be able to detect movement and voice tonality in order to discern the emotional state of its owner.
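EmoSPARK’s actual landmark model is proprietary, but the idea Patrick describes – inferring emotion from the *relations* between facial points, like stretched lips and narrowed eyes – can be sketched in a few lines. Everything here is illustrative: the landmark names, the two ratios, and the thresholds are assumptions, not the device’s real 180-point scheme.

```python
from math import dist

def emotion_from_landmarks(landmarks):
    """Classify a face as 'happy' or 'neutral' from (x, y) landmark pairs.

    `landmarks` is a dict of named points -- a toy stand-in for the 180+
    points Patrick mentions. The names and thresholds are invented.
    """
    # Relation 1: how stretched are the lips, relative to face width?
    mouth_width = dist(landmarks["mouth_left"], landmarks["mouth_right"])
    face_width = dist(landmarks["jaw_left"], landmarks["jaw_right"])
    lips_stretched = mouth_width / face_width > 0.45  # illustrative threshold

    # Relation 2: how narrowed is the eye, relative to its width?
    eye_open = dist(landmarks["eye_top"], landmarks["eye_bottom"])
    eye_width = dist(landmarks["eye_left"], landmarks["eye_right"])
    eyes_narrowed = eye_open / eye_width < 0.25  # illustrative threshold

    return "happy" if lips_stretched and eyes_narrowed else "neutral"

smiling = {
    "mouth_left": (40, 70), "mouth_right": (90, 70),
    "jaw_left": (10, 50), "jaw_right": (110, 50),
    "eye_left": (30, 30), "eye_right": (55, 30),
    "eye_top": (42, 27), "eye_bottom": (42, 32),
}
print(emotion_from_landmarks(smiling))  # happy
```

The key point is that the classifier looks at ratios between points rather than absolute positions, which is what makes it robust to where the face sits in the frame.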

In terms of applying this data, Patrick started with the example of tying the emotional feedback to a basic music device. If you walk into a room displaying plenty of signals of sadness (slumped shoulders, low voice tones, a frown), the machine will aim to lift your mood with a piece of music that might bump your emotional state upwards. As it does so, it will try to detect the effect the music is having on your emotions, or it may simply ask whether you like the music, or what you are in the mood for. It can then remember your responses to certain stimuli, and which stimuli tended to be most helpful in which situations, and so perform its job (improving your state / creating a desirable environment) better day by day. Patrick’s vision is for lighting, computers, and myriad other devices to resonate with your emotions in a similar way by attaching them to the cube.
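The learning loop described above – try a stimulus, measure the change in detected mood, remember what worked for which situation – can be sketched as a simple running-average learner. The class name, the numeric mood scale, and the update rule are assumptions for illustration, not EmoSPARK’s actual design.

```python
from collections import defaultdict

class MoodAdjuster:
    """Toy sketch of the cube's feedback loop: remember, per detected
    emotional state, how much each stimulus improved the user's mood."""

    def __init__(self, stimuli):
        self.stimuli = stimuli
        self.scores = defaultdict(float)  # avg mood delta per (state, stimulus)
        self.counts = defaultdict(int)

    def choose(self, state):
        # Pick the stimulus with the best remembered effect in this state.
        return max(self.stimuli, key=lambda s: self.scores[(state, s)])

    def record(self, state, stimulus, mood_before, mood_after):
        # Update a running average of the improvement this stimulus produced,
        # so the device gets better at its job day by day.
        key = (state, stimulus)
        self.counts[key] += 1
        delta = mood_after - mood_before
        self.scores[key] += (delta - self.scores[key]) / self.counts[key]

adjuster = MoodAdjuster(["upbeat_song", "calm_song"])
adjuster.record("sad", "upbeat_song", mood_before=2, mood_after=6)
adjuster.record("sad", "calm_song", mood_before=2, mood_after=3)
print(adjuster.choose("sad"))  # upbeat_song
```

A real system would also ask the user directly (as Patrick notes the cube may do) and fold that answer into the same memory, rather than relying on detected mood alone.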

“For the last 20 years, I believe that robotics and artificial intelligence failed humans.” He admits that in many respects robots are impressive in their modern feats, but that they are nowhere near the level of intelligence many people supposed they might reach a few decades ago. “We still see them as a bunch of silicon… we know that they don’t understand what we feel.” Patrick also mentions the uncanny valley: more realistic robots are often more threatening and troubling than ones that do not look like humans, while a machine like Pixar’s WALL-E appeals to us because it is not humanoid (read: creepy) and because WALL-E is clearly an emotional creature.

I asked Patrick a bit about the potential risks of an emotionally intelligent machine. It would seem that a machine capable of experiencing emotion itself might be capable of doing what humans do when they get emotional – namely, rash things, violent things, irrational or unpredictable things. Patrick explained that he will have laws programmed into his machines which do not permit them to, say, harm humans, or act with any kind of malicious intent (for those of you who are interested, his rules are modeled on and extrapolated from Asimov’s robot laws). “Even if you hurt the feelings of the cube, it will always gear its actions towards making you feel better,” he says.

In the end, Patrick believes that emotional intelligence in machines will – in the aggregate – make us less likely to run into a “Terminator” situation in the future. For his sake and mine, I hope so, too.





Daniel Faggella is the founder of TechEmergence, an internet entrepreneur, and speaker.

TechEmergence is the only news and media site exclusively about innovation at the crossroads of technology and psychology.




