
navigation

by   -   September 29, 2014

A few weeks ago, Pi Robot and I joined the Silicon Valley ROS Users Group (SV-ROS) to help with the effort (already under way) to prepare for a challenging robot navigation contest held at the end of this year’s IROS Conference in Chicago.

by   -   April 25, 2013


In the 9th part of the ShanghAI Lecture series, we look at ontogenetic development as Rolf Pfeifer talks about the path from locomotion to cognition. This is followed by two guest lectures: the first by Ning Lan (Shanghai Jiao Tong University, China) on cortico-muscular communication in the nervous system, and the second by Roland Siegwart (ETH Zurich) on the design and navigation of robots with diverse locomotion abilities.

by   -   August 1, 2012


Last week, SenseFly and Pix4D announced deals with drone maker Parrot, under which Parrot will invest in both companies: 5 million Swiss francs in SenseFly and 2.4 million in Pix4D. Both EPFL spin-offs, SenseFly and Pix4D have a history of cooperation: SenseFly provides camera-equipped UAVs, along with navigation software that lets them fly complete missions autonomously, while Pix4D provides the software that transforms the thousands of images produced by the drones into unified geographical information. (Kudos to Engadget for their prompt reporting.)

by   -   August 24, 2010

How can a robot, using vision, go back to a previously visited location?

Möller et al. tackle this research question, tagged "Local Visual Homing", in an intuitive manner inspired by social insects returning to their nest. The idea is that a robot, when somewhere important, takes a snapshot of the surrounding visual information. To return to that location later (homing), it compares its current view of the world with the stored snapshot.

A technique called "image warping" is used to guide the robot back to the snapshot location. Simply put, the robot imagines all the movements it could make and simulates their effect on its current view of the world. It then selects the movement that would bring its view closest to the stored snapshot. The method outputs a homing vector the robot should follow and a measure of how much its orientation has changed.
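The search-over-candidate-motions idea can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a panoramic view reduced to a 1-D ring of pixel intensities, considers only candidate rotations (the orientation-recovery part of warping), and scores each candidate by its squared difference to the snapshot.

```python
import numpy as np

def best_rotation(current, snapshot):
    """Try every candidate rotation of the current panoramic view and
    return the pixel shift that best matches the stored snapshot,
    together with its matching error (sum of squared differences)."""
    n = len(current)
    errors = [np.sum((np.roll(current, s) - snapshot) ** 2) for s in range(n)]
    best = int(np.argmin(errors))
    return best, errors[best]

# Toy panorama: 360 pixels covering 360 degrees; the robot has turned
# 90 degrees since taking the snapshot, so its view is shifted by 90 pixels.
snapshot = np.sin(np.linspace(0, 2 * np.pi, 360, endpoint=False))
current = np.roll(snapshot, -90)

shift, err = best_rotation(current, snapshot)
print(shift)  # recovered rotation: 90 pixels, i.e. 90 degrees
```

A full warping implementation additionally simulates candidate translations, whose effect on the view depends on assumptions about scene distances; the selection principle, however, is the same brute-force comparison shown here.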

Using three different implementations of image warping, Möller et al. show how a robot equipped with a panoramic camera could effectively home with reasonable computational effort. Experiments were conducted on a database of real-world images taken by a robot (see example images below).

In the future, robots could use visual homing to go from snapshot to snapshot, thereby navigating through large environments.

Finally, don’t miss the author’s website for an extensive overview of visual navigation techniques.

by   -   July 20, 2010

Robots often need to know where they are in the world to navigate efficiently. One of the cheapest ways to localize is to mount a camera on-board and extract visual features from the environment. However, challenges arise when robots move fast enough to cause motion blur: blurry images degrade localization accuracy, so robots that move too fast might get lost or need to stop and re-localize.

Instead, Hornung et al. propose using reinforcement learning to determine a policy that lets the robot travel at speeds appropriate for reliable localization while still reaching its destination as fast as possible. The actual implementation models the navigation task as an augmented Markov decision process (MDP).
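The trade-off can be sketched with a toy MDP, solved by value iteration. This is our simplification, not the paper's model: states are positions along a corridor, moving slow always keeps the robot localized, and moving fast is quicker but risks a blur-induced localization failure that costs extra time. The probabilities and costs below are made up for illustration.

```python
import numpy as np

N = 10                # corridor of positions 0..N, goal at state N
p_blur = 0.1          # chance that moving fast blurs images and loses the pose
relocalize_cost = 10  # extra time to stop and re-localize after getting lost

V = np.zeros(N + 1)   # expected time-to-goal per state; V[N] = 0 at the goal
for _ in range(200):  # value iteration: minimize expected travel time
    for i in range(N - 1, -1, -1):
        # slow: 2 time units, always advances one cell
        q_slow = 2.0 + V[min(i + 1, N)]
        # fast: 1 time unit, advances two cells, but may lose localization
        q_fast = 1.0 + p_blur * (relocalize_cost + V[i]) \
                     + (1 - p_blur) * V[min(i + 2, N)]
        V[i] = min(q_slow, q_fast)

# Read off the learned speed policy for each position.
policy = []
for i in range(N):
    q_slow = 2.0 + V[min(i + 1, N)]
    q_fast = 1.0 + p_blur * (relocalize_cost + V[i]) \
                 + (1 - p_blur) * V[min(i + 2, N)]
    policy.append("fast" if q_fast < q_slow else "slow")
print(policy)
```

Even in this toy setup the optimal policy mixes speeds: the robot drives fast where the risk pays off and slows down where a localization failure would be costly relative to the remaining distance.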

The learned policy is then compressed using a clustering technique to avoid being memory-hungry, which would be a major limitation for robots with low storage capacity.

Experiments were successfully conducted with two different robots in indoor and outdoor scenarios (see video), and the robots reached their goals faster than if they had navigated at constant speed. In the future, Hornung et al. hope to implement their system on fast-moving robots, such as unmanned aerial vehicles!




