Robohub.org
 

Remote internal imaging robot helps doctors in emergency situations


12 July 2013




This remote medical care robot, intended for use in emergency situations, is under development by a research group at Waseda University led by Dr. Hiroyasu Iwata.

“If a person receives an impact in an accident, there is a possibility that they could have internal bleeding. In emergency rooms, there’s a diagnostic method called FAST, which uses ultrasound imaging to check for internal bleeding. But that can’t be done until the patient reaches the hospital. So our idea is that this robot can be put on the patient in an ambulance, and while on the way to the hospital, it can be controlled by a doctor in a remote location. As there is an ultrasound probe attached, this robot can be used to check for internal bleeding.”

The robot, which weighs 2.2 kg, attaches to the chest area with a belt and can be used anywhere there is a network connection, so it could also be used in the home or in remote areas.

To enable a physician at a remote location to operate the robot intuitively, it is operated from an iPhone: the robot’s rotation and the angle of the ultrasound probe are controlled by touch.

“The ultrasound probe is attached here, and as it moves, the ultrasound image appears like this. If there’s bleeding, that appears as black shadows like this. If the patient has internal bleeding, they’re in danger unless they get to a hospital. This system lets the physician know that.”

“One point about this robot is that you can change the probe angle freely while keeping the probe in contact with the body. So, even if the patient is moved, the robot moves with them. This means images can be sent continuously to the physician at a remote location.”

“Before this robot can be used in emergency care, legal barriers must be overcome. So, what we’d like to do initially is use it for pregnancy check-ups. By doing that, and by building one more prototype version, we think the robot can be made practical. In that case, this system could be in practical use within three years.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting-edge technology, research and products from Japan.






 













©2025.05 - Association for the Understanding of Artificial Intelligence