
Rehabilitation support robot “R-cloud” makes muscle movement visible

05 December 2013




Associate Professor Toshiaki Tsuji’s Laboratory at Saitama University has developed R-cloud, a rehabilitation support robot that enables users to view how their own muscles move during rehabilitation and training.

“This rehabilitation support robot is used for strengthening the arms. Its moving parts use pneumatic muscles, and it provides support with gentle movements so it is very safe. Another distinguishing feature is haptic signal processing, a technique that estimates muscular force during training and makes this information visible. It also has a feature that quantifies and evaluates the effect of training.”

R-cloud measures subtle muscle movements and converts them into quantitative data, enabling physical therapists to give precise guidance on movement and patients to check their own movements for themselves.

“This robot has a force sensor and a sensor to measure the arm angle. Based on data collected from these sensors, calculations are made on the force of muscle contraction within the arm, as well as on the amount of calories consumed by each muscle during training. So the robot is equipped with technology that quantifies the degree of effectiveness of training. In addition, we use augmented reality technology to make these results visible.”
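The article does not disclose the lab's actual algorithms, but the pipeline described above (a force sensor and an arm-angle sensor feeding estimates of muscle contraction force and energy expenditure) can be illustrated with a minimal Python sketch. The function names, the two-link arm model, the moment-arm value, and the muscular-efficiency constant below are all illustrative assumptions, not details of R-cloud itself.

```python
import numpy as np

def joint_torques_from_endpoint_force(force_xy, q1, q2, l1=0.30, l2=0.25):
    """Map a measured endpoint force to shoulder/elbow torques using the
    planar two-link Jacobian transpose: tau = J^T f.
    q1, q2 are shoulder and elbow angles [rad]; l1, l2 are assumed
    segment lengths [m]."""
    J = np.array([
        [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
        [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
    ])
    return J.T @ np.asarray(force_xy, dtype=float)

def muscle_force_from_torque(torque, moment_arm=0.04):
    """Very rough single-muscle-equivalent contraction force, assuming a
    constant 4 cm moment arm: tau = r * F  ->  F = tau / r."""
    return torque / moment_arm

def kcal_from_training(torques, joint_velocities, dt, efficiency=0.25):
    """Estimate energy expenditure from positive mechanical joint work,
    assuming a fixed muscular efficiency; returns kilocalories."""
    power = np.maximum(np.sum(torques * joint_velocities, axis=1), 0.0)
    metabolic_joules = float(np.sum(power) * dt) / efficiency
    return metabolic_joules / 4184.0  # joules per kilocalorie

# Example with one synthetic sample: 5 N horizontal, 2 N vertical at the handle
tau = joint_torques_from_endpoint_force([5.0, 2.0],
                                         np.deg2rad(40), np.deg2rad(60))
print(muscle_force_from_torque(tau))  # equivalent muscle force per joint [N]
```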

By compiling the measurement data collected during training into a database, the Tsuji Lab is building a rehabilitation cloud system intended to make rehabilitation more efficient.
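The article gives no details of the cloud system's design; as a purely hypothetical illustration of the kind of per-session record such a system might store, here is a small sketch using Python's standard-library sqlite3 as a stand-in for whatever storage the lab actually uses.

```python
import json
import sqlite3
import time

def log_training_session(db_path, patient_id, muscle_forces_n, kcal):
    """Append one training-session summary to a local database that could
    later be synchronised with a cloud service. The schema is a guess for
    illustration, not the Tsuji Lab's actual design."""
    con = sqlite3.connect(db_path)
    con.execute(
        """CREATE TABLE IF NOT EXISTS training_sessions (
               patient_id          TEXT,
               recorded_at         REAL,
               muscle_forces_json  TEXT,  -- per-muscle contraction forces [N]
               kcal                REAL   -- estimated energy expenditure
           )"""
    )
    con.execute(
        "INSERT INTO training_sessions VALUES (?, ?, ?, ?)",
        (patient_id, time.time(), json.dumps(muscle_forces_n), kcal),
    )
    con.commit()
    con.close()
```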

“We would like to see these robots used for rehabilitation training in hospitals, nursing homes, and private homes after an injury, or for use as a preventive measure prior to injury.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.







