
Touch sensing: An important tool for mobile robot navigation

by Kshitij Tiwari
29 November 2022




In mammals, the touch modality develops earlier than the other senses, yet it remains less studied than its visual and auditory counterparts. Touch not only enables interaction with the environment but also serves as an effective defense mechanism.

Figure 1: A rat using its whiskers to interact with the environment via touch

The role of touch in mobile robot navigation has not been explored in detail, yet it appears to play an important part in obstacle avoidance and pathfinding. Proximal sensing is often a blind spot for long-range sensors such as cameras and lidars, and touch sensors could serve as a complementary modality that covers exactly this gap.
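As a minimal, hypothetical sketch of what "complementary" could mean in practice, the snippet below fuses a simulated lidar scan with a binary whisker-contact flag: the lidar handles obstacles at range, while a contact event forces an immediate back-off even when the obstacle sits inside the lidar's blind zone. All sensor interfaces, names, and thresholds here are illustrative assumptions, not part of any specific robot stack.

```python
from dataclasses import dataclass

@dataclass
class Command:
    linear: float   # forward velocity (m/s)
    angular: float  # turn rate (rad/s)

def avoid_obstacles(lidar_ranges, whisker_contact,
                    min_range=0.05, stop_dist=0.4):
    """Toy reactive controller: lidar covers far obstacles,
    a whisker/bumper covers the near-field blind zone.

    lidar_ranges    -- list of range readings (m); values below
                       `min_range` fall in the sensor's blind zone
    whisker_contact -- True if any whisker reports contact
    """
    # Touch has priority: a contact means something is already
    # inside the lidar blind zone, so back off and turn in place.
    if whisker_contact:
        return Command(linear=-0.1, angular=0.8)

    # Ignore readings inside the blind zone; they are unreliable.
    valid = [r for r in lidar_ranges if r >= min_range]
    nearest = min(valid) if valid else float("inf")

    if nearest < stop_dist:
        # Something seen at range: slow down and steer away.
        return Command(linear=0.05, angular=0.5)
    return Command(linear=0.3, angular=0.0)

# Example: clear lidar scan, but a whisker is touching something.
print(avoid_obstacles([1.2, 0.9, 2.5], whisker_contact=True))
```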

Overall, touch appears to be a promising modality for mobile robot navigation, but more research is needed to understand its role fully.

Role of touch in nature

The touch modality is paramount for many organisms, playing an important role in perception, exploration, and navigation. Rodents, pinnipeds, cats, dogs, and fish all use touch, but differently than humans do: while humans primarily use touch for prehensile manipulation, mammals such as rats and shrews, which have poor vision, rely on their vibrissae (whiskers) for exploration and navigation. This vibrissal system provides short-range sensing that works in tandem with the visual system.

Artificial touch sensors for robots

Artificial touch sensor design has evolved over the last four decades, yet these sensors are still not as widely used on mobile robots as cameras and lidars. Mobile robots usually rely on such long-range sensors, while short-range sensing receives comparatively little attention.
When designing artificial touch sensors for mobile robot navigation, we typically draw inspiration from nature, i.e., from biological whiskers, to derive bio-inspired artificial whiskers. One such early prototype is shown in the figure below.

Figure 2: Bioinspired artificial rat whisker array prototype V1.0
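For intuition on what such an array can measure, here is a minimal, purely illustrative calculation (not the processing used in the prototype above): for a quasi-static point contact on a rigid, straight whisker, the bending moment at the base equals the contact force times its radial distance, so sensing both at the base gives a rough estimate of where along the whisker the contact occurred.

```python
def radial_contact_distance(base_moment, base_force, eps=1e-9):
    """Estimate how far along a whisker a point contact occurred.

    Assumes a quasi-static point contact and a rigid, straight whisker,
    so that base_moment = base_force * distance. Real (flexible, tapered)
    whiskers require a beam-bending model instead.

    base_moment -- bending moment sensed at the whisker base (N*m)
    base_force  -- transverse force sensed at the whisker base (N)
    """
    if abs(base_force) < eps:
        return None  # no measurable contact
    return base_moment / base_force

# Example: 0.002 N*m of moment and 0.05 N of force suggest a
# contact roughly 0.04 m (4 cm) from the whisker base.
print(radial_contact_distance(0.002, 0.05))
```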

However, there is no reason to limit design innovation to faithfully mimicking biological whisker-like touch sensors. While some researchers are working to perfect the tapering of whiskers [1], we are currently investigating abstract mathematical models that can inspire a whole family of touch sensors [2].
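To give a flavour of such an abstraction, the toy model below (not the model from [2], just an illustrative sketch in the same spirit) treats a touch sensor as a short-range "visibility" region: the set of nearby points falling inside a small radius and angular sector around the robot, independent of any particular whisker geometry.

```python
import math

def touched_points(robot_pose, points, max_range=0.3, fov=math.pi):
    """Toy visibility-style touch model: return the obstacle points that
    fall inside a short-range angular sector around the robot.

    robot_pose -- (x, y, heading) of the robot
    points     -- iterable of (x, y) obstacle points
    max_range  -- sensing radius of the touch region (m)
    fov        -- angular width of the region, centred on the heading
    """
    x, y, heading = robot_pose
    hits = []
    for px, py in points:
        dx, dy = px - x, py - y
        dist = math.hypot(dx, dy)
        if dist > max_range:
            continue  # beyond the reach of the touch sensor
        bearing = math.atan2(dy, dx) - heading
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap
        if abs(bearing) <= fov / 2:
            hits.append((px, py))
    return hits

# A point 20 cm ahead is "felt"; one 2 m away is not.
print(touched_points((0.0, 0.0, 0.0), [(0.2, 0.05), (2.0, 0.0)]))
```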

Challenges with designing touch sensors for robots

There are many challenges in designing touch sensors for mobile robots. One key challenge is the trade-off between weight, size, and power consumption: the power draw of the sensors can be significant, which limits their applicability on mobile platforms.

Another challenge is finding the right trade-off between touch sensitivity and robustness. The sensors need to be sensitive enough to detect small changes in the environment, yet robust enough to withstand the dynamic and harsh conditions of most mobile robot applications.
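One common way to soften this trade-off in software (shown here as a purely illustrative sketch, with made-up threshold values) is hysteresis thresholding: contact is declared only once the deflection exceeds an upper threshold and released only once it drops below a lower one, so noise hovering around a single threshold cannot make the contact signal chatter.

```python
class HysteresisContactDetector:
    """Debounce a noisy whisker-deflection signal with two thresholds.

    Contact is declared when the deflection rises above `on_threshold`
    and released only when it falls below `off_threshold`, preventing
    rapid toggling when the signal sits near a single threshold.
    """

    def __init__(self, on_threshold=0.05, off_threshold=0.02):
        assert off_threshold < on_threshold
        self.on_threshold = on_threshold
        self.off_threshold = off_threshold
        self.in_contact = False

    def update(self, deflection):
        if self.in_contact:
            if abs(deflection) < self.off_threshold:
                self.in_contact = False
        elif abs(deflection) > self.on_threshold:
            self.in_contact = True
        return self.in_contact

# Noisy readings around the thresholds yield one clean contact interval.
detector = HysteresisContactDetector()
for d in [0.01, 0.06, 0.04, 0.03, 0.06, 0.01]:
    print(detector.update(d))
```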

Future directions

There is a need for more systematic studies to understand the role of touch in mobile robot navigation. Current studies are mostly limited to specific applications and scenarios geared towards dexterous manipulation and grasping. We need to understand the challenges and limitations of using touch sensors for navigation, and to develop touch sensors that are more robust and power-efficient.
Logistically, another limiting factor is the lack of openly available, off-the-shelf touch sensors. A few research groups around the world are developing their own touch sensor prototypes, biomimetic or otherwise, but these designs are closed and extremely hard to replicate or improve upon.

References

  1. Williams, Christopher M., and Eric M. Kramer. "The advantages of a tapered whisker." PLoS ONE 5.1 (2010): e8806.
  2. Tiwari, Kshitij, et al. "Visibility-Inspired Models of Touch Sensors for Navigation." 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2022.



Kshitij Tiwari is a Postdoctoral Researcher interested in path planning and SLAM for mobile robots and multi-agent systems.