Robohub.org
 

Emotive communication with things – EmoShape Founder Patrick Levy Rosenthal


by Daniel Faggella
14 January 2014



Photo credit: Image Agency

Why do people who use Facebook spend so much of their online time there? Why do people want to share, to comment?
Patrick Levy Rosenthal asked himself these questions, and was drawn back again and again to the same answer: emotion. A researcher and a Parisian, Patrick now lives in London, where he is working on his startup, EmoShape.

Over the course of our conversation, he painted a bit of his vision for an emotionally attuned physical world, where the objects and devices around us can adjust their function to have a desirable effect on our emotional state, or simply to “sync” with us at an emotional level. I spoke a bit about this same topic with RockPaperRobot’s Jessica Banks, but Patrick’s business concept – and life’s work – is creating the first user interface for emotionally calibrated devices. It starts with a small cube-shaped object called “EmoSPARK,” which Patrick plans to fit with the technology to connect to a number of other household devices – beginning (most likely) with an mp3 music player.

The first question is: How will this machine detect its owner’s emotional state?

Patrick explains that this should be one of the simpler challenges for the device. “It tracks over 180 points on your face, but also the relation between those points, so that if you are smiling it will know that your lips will be stretched, and eyes made more narrow.” The machine will also be able to detect movement and voice tonality in order to discern the emotional state of its owner.
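The “relation between points” idea Patrick describes can be illustrated with a toy heuristic. This is a minimal sketch, not EmoSPARK’s actual model: the landmark names, the two ratios, and the 50/50 weighting are all illustrative assumptions, standing in for a system that tracks over 180 points.

```python
import math

def smile_score(pts):
    """Crude smile heuristic from named facial landmarks.

    A smile stretches the mouth and narrows the eyes, so we combine
    two relations: mouth width and eye openness, each normalized by
    face width. Point names and weights are illustrative, not real.
    """
    face_width = math.dist(pts["face_left"], pts["face_right"])
    mouth_stretch = math.dist(pts["mouth_left"], pts["mouth_right"]) / face_width
    eye_openness = math.dist(pts["eye_top"], pts["eye_bottom"]) / face_width
    # A wider mouth and narrower eyes both push the score up.
    return 0.5 * mouth_stretch + 0.5 * (1 - eye_openness)

neutral = {"face_left": (0, 0), "face_right": (100, 0),
           "mouth_left": (30, 50), "mouth_right": (70, 50),
           "eye_top": (40, 20), "eye_bottom": (40, 26)}
smiling = dict(neutral, mouth_left=(25, 50), mouth_right=(75, 50),
               eye_bottom=(40, 24))
print(smile_score(smiling) > smile_score(neutral))  # True
```

The point is that no single landmark signals a smile; it is the ratios between landmarks, which stay meaningful regardless of how close the face is to the camera.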

In terms of applying this data, Patrick started with the example of tying the emotional feedback to a basic music device. If you walk into a room displaying plenty of signals of sadness (slumped shoulders, low voice tones, a frown), the machine will aim to lift your emotional state with a piece of music. As the music plays, the machine will try to detect the effect it is having on your emotions, or it may simply ask whether you like the music, or what you’re in the mood for. It can remember your responses to certain stimuli, and which stimuli tended to be most helpful in which situations, and so perform its job (improving your state / creating a desirable environment) better day by day. Patrick’s vision is for lighting, computers, and myriad other devices to resonate with your emotions in a similar way by attaching them to the cube.
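The try-a-stimulus, observe-the-effect, remember-what-worked loop described above can be sketched in a few lines. Everything here is a hypothetical illustration – the stimulus names, the scoring scheme, and the class itself are inventions, not part of EmoSPARK’s design.

```python
import random
from collections import defaultdict

class EmotionFeedbackLoop:
    """Toy model of the adapt-and-remember loop described above."""

    def __init__(self, stimuli):
        self.stimuli = stimuli
        # Observed mood changes per (detected_state, stimulus) pair.
        self.scores = defaultdict(lambda: defaultdict(list))

    def choose(self, detected_state):
        # Prefer the stimulus with the best remembered average effect
        # for this state; fall back to a random pick when nothing has
        # been tried yet (the machine's "ask what you're in the mood for").
        tried = self.scores[detected_state]
        if tried:
            return max(tried, key=lambda s: sum(tried[s]) / len(tried[s]))
        return random.choice(self.stimuli)

    def record(self, detected_state, stimulus, mood_change):
        # Remember how this stimulus affected the owner's mood.
        self.scores[detected_state][stimulus].append(mood_change)

loop = EmotionFeedbackLoop(["upbeat_song", "calm_song", "silence"])
loop.record("sad", "upbeat_song", +2)
loop.record("sad", "calm_song", -1)
print(loop.choose("sad"))  # prints "upbeat_song"
```

This is why the device improves “day by day”: each observed reaction becomes training data for the next choice, so the cube’s behavior converges on whatever its particular owner actually responds to.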

“For the last 20 years, I believe that robotics and artificial intelligence failed humans.” He admits that in many respects, robots are impressive in their modern feats, but that they are nowhere near the level of intelligence that many people supposed they might be a few decades ago. “We still see them as a bunch of silicon… we know that they don’t understand what we feel.” Patrick also mentions the uncanny valley, and how more realistic robots are often more threatening and troubling than ones which do not look like humans, and how a machine like Disney’s WALL-E appeals to us because it is not humanoid (read: creepy), and because WALL-E is clearly an emotional creature.

I asked Patrick a bit about the potential risks of an emotionally intelligent machine. It would seem that a machine capable of experiencing emotion itself might be capable of doing what humans do when they get emotional. Namely: rash things, violent things, irrational or unpredictable things. Patrick explained that his machines will have laws programmed into them which do not permit them to – say – harm humans, or act with any kind of malicious intent (for those of you who are interested, his rules are modeled on and extrapolated from Asimov’s laws of robotics). “Even if you hurt the feelings of the cube, it will always gear its actions towards making you feel better,” he says.

In the end, Patrick believes that emotional intelligence in machines will – in the aggregate – make us less likely to run into a “Terminator” situation in the future. For his sake and mine, I hope so, too.





Daniel Faggella is the founder of TechEmergence, an internet entrepreneur, and speaker.

TechEmergence is the only news and media site exclusively about innovation at the crossroads of technology and psychology.





©2026.02 - Association for the Understanding of Artificial Intelligence