Robohub.org
 

Laboratory animal management robot can care for 30,000 mice


by DigInfo TV
17 July 2013




This robot, under development by Nikkyo Technos and Yaskawa Electric, is designed to automate the management of laboratory animal colonies at pharmaceutical companies and research institutions, primarily facilities that raise 10,000 to 30,000 mice or rats.

“The biggest problem in raising animals is that diseases can spread from people to the animals. If that happens, all the animals have to be killed and replaced with new ones. So, infection by people must be prevented. By managing animals using robots in an enclosed space, it’s basically possible to eliminate the spread of diseases from animals to people or from people to animals.”

This six-axis vertical articulated robot can mimic the motions of a human. It can change cage sheets, top up the food, and change the water. Removing cages, changing sheets, and topping up food are each done with a separate tool, which the robot picks up in turn. The current model doesn't account for how much food remains, but the next model will use a camera to gauge the amount of food left so the robot can add just the right amount.

“Mice, especially, are nervous animals, so the robot handles the cages gently. These tasks account for about 80% of the work involved with lab animals. So, our aim is to automate the hard, dirty, and dangerous task of dealing with so much dust and droppings.”

This robot can also work in coordination with a robot that carries cages from the rack to the workbench, and with a monitoring system for the animal facility. Together, they can automate every aspect of animal raising, from surveillance to care.

“The animals can be monitored with cameras 24/7. So, people can check their own cages from the monitoring station. The animals’ body temperature can also be managed. It takes about two hours for the cages to come back from the lab, but data can be viewed directly from a PC in the monitoring room. So, people can see the cages they want right away, wherever they are.”

“Doing this work with robots makes it much faster, so lots of cages can be handled in a short time. Also, using cameras to monitor food and water is safer and more reliable than having people do it. We’d like to complete the system this year, and next year, we’d like to produce several sets, so we can make at least a provisional start.”



DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.

