Robohub.org
 

Robots can successfully imitate human motions in the operating room

26 September 2016



The human-like and the non-human-like trajectories were performed in a random order (10 human-like and 10 non-human-like). Photo: Courtesy of Dr. Elena De Momi, Politecnico di Milano.


By: Marcus Banks

The nursing assistant for your next trip to the hospital might be a robot. This is the implication of research recently published by Dr. Elena De Momi and colleagues in the open-access journal Frontiers in Robotics and AI (Artificial Intelligence).

Dr. De Momi, of the Politecnico di Milano (Italy), led an international team that trained a robot to imitate natural human actions. De Momi’s work indicates that humans and robots can effectively coordinate their actions during high-stakes events such as surgeries.

Over time, this should lead to improvements in surgical safety because, unlike their human counterparts, robots do not tire and can complete an endless series of precise movements. The goal is not to remove human expertise from the operating room, but to complement it with a robot’s particular skills and benefits.

“As a roboticist, I am convinced that robotic (co)workers and collaborators will definitely change the work market, but they won’t steal job opportunities. They will just allow us to decrease workload and achieve better performances in several tasks, from medicine to industrial applications,” De Momi explains.

To conduct their experiment, De Momi’s team filmed a human subject performing numerous reaching motions, similar to handing instruments to a surgeon. These camera captures were fed into the neural network that controls the robotic arm’s movements. Next, a human operator guided the robotic arm in imitating the reaching motions the human subject had initially performed. Although the robotic and human actions did not overlap perfectly, they were broadly similar.
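To make this training step concrete, here is a minimal sketch of the general idea of learning human-like reaching motions from recorded examples. It is not the authors’ code: the trajectory format, the minimum-jerk stand-in for recorded motions, and the use of scikit-learn’s MLPRegressor are all assumptions made purely for illustration.

```python
# Minimal sketch (assumptions, not the paper's method): fit a small
# feed-forward network to sampled human reaching trajectories so that,
# given a normalized time and a target, it outputs an end-effector
# position along a human-like path.
import numpy as np
from sklearn.neural_network import MLPRegressor

def minimum_jerk(start, goal, n_samples=50):
    """Stand-in for a recorded human reach (human reaching motions are
    commonly approximated by minimum-jerk profiles)."""
    t = np.linspace(0.0, 1.0, n_samples)
    s = 10 * t**3 - 15 * t**4 + 6 * t**5          # smooth 0 -> 1 profile
    return start + np.outer(s, goal - start), t

# Build a training set from several "recorded" reaches toward random targets.
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(20):
    start = np.zeros(3)
    goal = rng.uniform(-0.5, 0.5, size=3)
    traj, t = minimum_jerk(start, goal)
    for ti, pos in zip(t, traj):
        X.append(np.concatenate(([ti], goal)))    # input: time + target
        y.append(pos)                             # output: hand position
X, y = np.array(X), np.array(y)

# Train the network that would drive the arm's motion generator.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X, y)

# Query a new target: the predicted positions trace a human-like path
# that an arm controller could follow.
new_goal = np.array([0.3, -0.2, 0.4])
query = np.array([np.concatenate(([ti], new_goal)) for ti in np.linspace(0, 1, 10)])
print(net.predict(query))
```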

Finally, several human observers watched the robotic arm make numerous motions and judged whether each was “biologically inspired,” which would indicate that the neural network had effectively learned to imitate human behavior. That was exactly what the observers concluded about 70% of the time.

These results are promising, although further research is needed to validate or refine De Momi’s conclusions. If robotic arms can indeed imitate human behavior, the next step would be to create conditions in which humans and robots cooperate effectively in high-stress environments such as operating rooms.

This future may not be as far away as we think. De Momi’s work is part of the growing field of healthcare robotics, which has the potential to change the way we receive health care sooner rather than later.

Read the research paper here.







Frontiers in Robotics and AI is an open-access journal that provides thematic coverage on areas across robotics and artificial intelligence.






©2021 - ROBOTS Association