Laboratory animal management robot can care for 30,000 mice

17 July 2013

This robot, under development by Nikkyo Technos and Yaskawa Electric, is designed to automate the management of laboratory animal colonies used by pharmaceutical companies and research institutions, primarily those raising 10,000 to 30,000 mice or rats.

“The biggest problem in raising animals is that diseases can spread from people to the animals. If that happens, all the animals have to be killed and replaced with new ones. So, infection by people must be prevented. By managing animals using robots in an enclosed space, it’s basically possible to eliminate the spread of diseases from animals to people or from people to animals.”

This six-axis vertical articulated robot can mimic the motions of a human. It can change cage sheets, top up the food, and change the water. Taking out cages, changing sheets, and topping up food are each done with a separate tool, which the robot picks up in turn. The current model doesn't take the amount of remaining food into account, but the next model will use a camera to see how much food is left, so the robot can add the right amount.
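The article does not describe the robot's control software, but the tool-per-task workflow it describes can be sketched as a simple task sequence. The tool and task names below are illustrative assumptions, not details from Nikkyo Technos or Yaskawa:

```python
# Illustrative sketch only: models the described workflow, in which the robot
# picks up a dedicated tool for each task before performing it.
# Tool/task names are hypothetical.

TASKS = [
    ("cage_gripper", "remove cage from rack"),
    ("sheet_tool", "replace cage sheet"),
    ("feed_scoop", "top up food"),
    ("water_tool", "change water"),
]

def service_cage(cage_id: int) -> list[str]:
    """Return the ordered action log for servicing one cage."""
    log = []
    for tool, action in TASKS:
        log.append(f"pick up {tool}")            # swap to this task's tool
        log.append(f"{action} (cage {cage_id})")  # perform the task
        log.append(f"return {tool}")              # put the tool back
    return log
```

A planned camera-based check of remaining food would slot in before the "top up food" step, adjusting the amount dispensed.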

“Mice, especially, are nervous animals, so the robot handles the cages gently. These tasks account for about 80% of the work involved with lab animals. So, our aim is to automate the hard, dirty, and dangerous task of dealing with so much dust and droppings.”

This robot can also work in coordination with a robot that carries cages from the rack to the workbench, and a monitoring system for the animal facility. In this way, it can automate all aspects of animal raising, from surveillance to care.

“The animals can be monitored with cameras 24/7. So, people can check their own cages from the monitoring station. The animals’ body temperature can also be managed. It takes about two hours for the cages to come back from the lab, but data can be viewed directly from a PC in the monitoring room. So, people can see the cages they want right away, wherever they are.”

“Doing this work with robots makes it much faster, so lots of cages can be handled in a short time. Also, using cameras to monitor food and water is safer and more reliable than having people do it. We’d like to complete the system this year, and next year, we’d like to produce several sets, so we can make at least a provisional start.”

DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.

©2021 - ROBOTS Association
