Robohub.org
 

IBM, Intel describe experiential future for human-machine interaction


by Frank Tobe
21 January 2016




In keynote presentations at CES in Las Vegas last week, the CEOs of IBM and Intel talked about disruptive changes in how consumers and businesses transact and interact with their purchases.

Intel CEO Brian Krzanich during his keynote at CES 2016. Source: Consumer Technology Association


Intel’s Brian Krzanich described the use of embedded chips and cloud services to enable experience-based transactions. While many of these services are about driving sales, others are about enhancing user experience. Krzanich provided many examples, but four stood out as he made his case:

  1. In a new Guinness World Record for the most unmanned aerial vehicles airborne simultaneously, 100 drones flew with their colored lights synchronized to an orchestra playing Beethoven’s Fifth Symphony. Because the display was so colorful and timed to the music, Krzanich characterized it as a potentially safer, reusable alternative to fireworks shows.
     https://youtu.be/mOBQXuu_5Zw
  2. Using augmented reality, the ModiFace mirror enables consumers to try out different makeup looks (blush, eye shadow, lipstick color, foundation, glosses, etc.) and then buy their favorite color and product combinations. Sampling makeup, typically done in public in department stores, can be an embarrassing process for some. An augmented reality product such as this one disrupts that process and provides a better customer experience.
  3. Krzanich also demonstrated a Yuneec Typhoon H with an onboard RealSense 3D camera and chip, enabling a follow-me collision-avoiding drone that will be available later this year as a consumer product.
  4. A pair of Oakley sunglasses carried the sensors and communication needed to track the wearer’s progress, compare it against set goals, and deliver spoken coaching and responsive reports along the way – the result being an experience rather than just a pair of sunglasses.

(As an aside, Intel recently announced its acquisition of Ascending Technologies, the German developer of the collision-avoidant autopilot system used in the video above. Last year Intel also made big investments in Yuneec (whose drones now include onboard Intel RealSense cameras and chips) and Airware (a competing autopilot developer). And this week Intel invested in robotics startup Savioke, maker of the Relay robot, which autonomously navigates hotels delivering toothpaste, towels, Starbucks coffee and other items.)


IBM CEO Ginni Rometty onstage at CES 2016. Source: Consumer Technology Association

IBM CEO Ginni Rometty told a packed CES audience that Watson and IBM were changing the nature of data processing from transactional to cognitive. Last year Robert High (CTO of IBM’s Watson Group) said that the emerging era of embodied cognitive computing is leading to “cognition as a service” — for example, a welcome third hand for a lab technician, a concierge or office assistant such as Jibo or Echo, or an assistant in field settings like search and rescue. The cognition process involves machines interacting with humans in a variety of ways: via text, speech, tactile and visual cues, and gestures. Cognitive algorithms and cloud computing are the keys to Watson’s feats, and they also happen to be where Rometty has pushed IBM since her tenure began.

Rometty described Watson’s progress thus far and demonstrated on stage many areas where applying Watson is changing the nature of the experience. It’s not enough for a system to hear what a user says; Watson must understand what the user wants, make it happen, and then respond in as conversational a tone as possible. SoftBank’s Pepper robot is an example of how this works. In this excerpt from Rometty’s presentation, she and SoftBank Robotics’ Kenichi Yoshida announced that IBM will provide global distribution and support for SoftBank’s Watson-powered Pepper robot as the companies scale up to begin selling into China and the US. Pepper has already sold 7,000 units in Japan and is deployed in 300 bank branches and 100 stores.

 

Bottom line

These two CES keynotes illustrate how artificial intelligence and data synthesis will provide the backbone for meaningful, productive interaction between humans and machines: not only on screens, but through gestures, visual cues, and understandable spoken communication to and from smart devices and robots of all types. And it’s not just a near-term future they foretell; both gave examples of where it is happening already.





Frank Tobe is the owner and publisher of The Robot Report, and is also a panel member for Robohub's Robotics by Invitation series.








 



©2025.05 - Association for the Understanding of Artificial Intelligence


 











