Robohub.org
 

Robots can successfully imitate human motions in the operating room


26 September 2016



The human-like and the non-human-like trajectories were performed in a random order (10 human-like and 10 non-human-like). Photo: Courtesy of Dr. Elena De Momi, Politecnico di Milano.


By: Marcus Banks

The nursing assistant on your next trip to the hospital might be a robot. This is the implication of research recently published by Dr. Elena De Momi and colleagues in the open-access journal Frontiers in Robotics and AI.

Dr. De Momi, of the Politecnico di Milano (Italy), led an international team that trained a robot to imitate natural human actions. De Momi’s work indicates that humans and robots can effectively coordinate their actions during high-stakes events such as surgeries.

Over time this should lead to improvements in surgical safety because, unlike their human counterparts, robots do not tire and can complete an endless series of precise movements. The goal is not to remove human expertise from the operating room, but to complement it with a robot's particular skills and benefits.

“As a roboticist, I am convinced that robotic (co)workers and collaborators will definitely change the work market, but they won’t steal job opportunities. They will just allow us to decrease workload and achieve better performances in several tasks, from medicine to industrial applications,” De Momi explains.

To conduct their experiment, De Momi’s team photographed a human subject performing numerous reaching motions, similar to handing instruments to a surgeon. These camera captures were fed into the neural network that controls the robotic arm’s movements. Next, a human operator guided the robotic arm to imitate the reaching motions that the human subject had initially performed. Although the robotic and human actions did not overlap perfectly, they were broadly similar.
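The article does not specify the team's network architecture, but the idea of learning to reproduce a demonstrated, human-like reaching motion can be sketched in miniature. The sketch below is purely illustrative: it generates a "human-like" demonstration using the classic minimum-jerk model of reaching (a standard model of natural human arm motion, not necessarily the one used in the paper) and fits a simple polynomial regressor, standing in for the neural network, to imitate it.

```python
import numpy as np

def minimum_jerk(t, x0=0.0, xf=1.0):
    """Minimum-jerk position profile for t in [0, 1].

    This is the classic model of smooth, human-like reaching:
    position follows x0 + (xf - x0) * (10t^3 - 15t^4 + 6t^5).
    """
    s = 10 * t**3 - 15 * t**4 + 6 * t**5
    return x0 + (xf - x0) * s

# "Record" a demonstration trajectory (stand-in for the camera captures).
t = np.linspace(0.0, 1.0, 100)
demo = minimum_jerk(t)

# Fit a degree-5 polynomial to imitate the demonstrated trajectory
# (a stand-in learner for the paper's neural network).
coeffs = np.polyfit(t, demo, deg=5)
imitated = np.polyval(coeffs, t)

# The imitation should closely track the demonstration.
max_err = np.max(np.abs(imitated - demo))
print(f"max tracking error: {max_err:.2e}")
```

Because the demonstration here is itself a degree-5 polynomial, the fit is essentially exact; real camera-captured motions would be noisier, and the learned trajectory would only approximate the human one, as the article describes.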

Finally, several human observers watched the robotic arm perform numerous motions. The observers judged whether the arm’s movements were “biologically inspired,” which would indicate that its neural network had effectively learned to imitate human motion. About 70% of the time, this is exactly what they concluded.

These results are promising, although further research is necessary to validate or refine De Momi’s conclusions. If robotic arms can indeed imitate human motion, it should become possible to create conditions in which humans and robots cooperate effectively in high-stress environments like operating rooms.

This future may not be as far away as we think. De Momi’s work is part of the growing field of healthcare robotics, which has the potential to change the way we receive health care sooner rather than later.

Read the research paper here.





Frontiers in Robotics and AI is an open-access journal that provides thematic coverage on areas across robotics and artificial intelligence.




