Robohub.org
 

Extreme-environment robotics under development at Keio University


by
20 May 2014




At Keio University in Japan, the Ishigami Laboratory, in the Faculty of Science and Technology, Department of Mechanical Engineering, is investigating robotic mobility systems. The main mission of this group is to perform fundamental and applied research for application to extreme environments, notably lunar and planetary rovers.

“In our lab, we focus on field robotics that works for extreme environments.”

“For example, we investigate the interaction mechanics between robots and sandy surfaces, taking “off-the-road locomotion” into account. Also, because such robots would be deployed in unknown environments, we work on vision systems such as cameras and laser rangefinders.”

In this research, there are three key concepts: vehicle-terrain interaction mechanics, autonomous mobility systems, and robotic device development.

In vehicle-terrain interaction mechanics, the researchers analyze vehicle behavior using a dynamic simulator. They’re also developing vehicle-slip compensation systems and in-wheel-sensor systems.

“In the study of interaction mechanics, we first focus on a wheel itself using a “single-wheel testbed.” We put just one wheel on the testbed, and perform experimental runs to obtain wheel force data under different sets of slip parameters. Meanwhile, we numerically calculate wheel force based on a wheel-sand interaction model we developed. Then, we compare the experimental results with the numerical ones, so we can evaluate how valid the interaction model is. Applying this approach to a whole robot-vehicle system, it is possible to simulate how the robot behaves dynamically in an unknown environment. That’s the key approach in this research.”
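The testbed procedure described above boils down to sweeping slip conditions, measuring wheel force, and scoring a model against the data. The sketch below illustrates that workflow under stated assumptions: the slip-ratio definition is the standard one for driven wheels, but the comparison metric and function names are illustrative placeholders, not the lab's actual wheel-sand interaction model.

```python
# Hedged sketch of the single-wheel-testbed workflow: compute the
# slip ratio for each run, then quantify how well a candidate
# interaction model reproduces the measured wheel forces.
# (Function names and the RMS metric are assumptions for illustration.)

def slip_ratio(wheel_radius, angular_velocity, forward_velocity):
    """Standard driving slip ratio s = (r*omega - v) / (r*omega)."""
    circumferential = wheel_radius * angular_velocity
    if circumferential == 0:
        return 0.0
    return (circumferential - forward_velocity) / circumferential

def rms_error(measured_forces, model_forces):
    """Root-mean-square disagreement between testbed data and a model,
    one value per slip condition."""
    n = len(measured_forces)
    return (sum((m - p) ** 2 for m, p in zip(measured_forces, model_forces)) / n) ** 0.5
```

A low RMS error across the swept slip ratios is what would justify trusting the same model inside a full-vehicle dynamic simulation.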

“Sand flow has received especially close attention in recent years. In our lab, we’ve recently adopted such an approach, called particle image velocimetry, or PIV, which has been widely used in fluid mechanics. PIV enables us to clearly determine the sand flow, helping us develop a well-defined interaction model.”
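At its core, PIV estimates how a patch of particles moved between two successive images by finding the peak of their cross-correlation. The sketch below shows that principle for a single interrogation window using an FFT-based correlation; a real PIV pipeline (sub-pixel peak fitting, window overlap, outlier rejection) is considerably more involved, and this is not the lab's implementation.

```python
import numpy as np

def piv_displacement(window_a, window_b):
    """Estimate the integer (dy, dx) displacement of window_b relative
    to window_a by locating the peak of their circular cross-correlation.
    A minimal illustration of the PIV principle, not a full PIV pipeline."""
    a = window_a - window_a.mean()
    b = window_b - window_b.mean()
    # Cross-correlate via the frequency domain: peak lies at the shift
    # that best aligns b with a.
    corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Convert wrapped peak indices into signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

Applied over a grid of windows, such displacement estimates yield the sand-flow velocity field that feeds back into the interaction model.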

In the area of autonomous mobility systems, the Ishigami Lab is working on environment recognition using laser rangefinders and camera images, as well as robot localization, path planning, teleoperation, and integrated sensory processing systems.

“For example, in an unknown environment there aren’t any road signs saying ‘there’s an obstacle here’ or ‘turn right at the next intersection.’ Such obstacles must be detected by onboard cameras or laser rangefinders, which operate on the time-of-flight principle (measuring the time from laser emission to detection of the reflected pulse). In our research, we use such devices to obtain 3D distance data, or 3D environment information. Based on these data, the robot itself decides how to travel. Such systems are called autonomous mobility systems.”
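The time-of-flight principle mentioned above is a one-line calculation: the laser pulse travels to the target and back, so the range is the speed of light times the measured round-trip time, halved. A minimal sketch:

```python
# Range from a laser time-of-flight measurement. The pulse travels
# out and back, so the round-trip time is divided by two.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s):
    """Distance in metres from a round-trip time in seconds."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For example, a 200-nanosecond round trip corresponds to a target roughly 30 m away, which is why rangefinder electronics must resolve timing at the nanosecond scale.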

“One distinctive point of our lab is, I would say, that we focus on mechanics as well as autonomous mobility, applying both hardware and software approaches. In general, a lab has one specific research interest and looks deeply into that, but in our lab we work on mechanics and also on autonomous mobility systems, so we pursue several topics in parallel. Robots consist of integrated technology, so we consider them as total systems.
In addition, another feature of our research is that we consider field tests extremely important. We actually take our robots to outdoor environments, such as the volcanic regions on Izu Oshima and Mt. Aso, and operate them in rough terrain to test how they behave in actual environments.”

“The field of robotics comprises a variety of technologies. So, rather than sticking to a single academic discipline, we’d like students to do research from a broad perspective.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.

