
Self-driving cars for country roads


07 May 2018





A team of MIT researchers tested MapLite on a Toyota Prius outfitted with a range of LIDAR and IMU sensors.
Photo courtesy of CSAIL.


By Adam Conner-Simons | Rachel Gordon

Uber’s recent self-driving car fatality underscores the fact that the technology is still not ready for widespread adoption. The reality is that there aren’t many places where today’s self-driving cars can actually reliably drive. Companies like Google only test their fleets in major cities, where they’ve spent countless hours meticulously labeling the exact 3-D positions of lanes, curbs, and stop signs.

“The cars use these maps to know where they are and what to do in the presence of new obstacles like pedestrians and other cars,” says Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). “The need for dense 3-D maps limits the places where self-driving cars can operate.”

Indeed, if you live along the millions of miles of U.S. roads that are unpaved, unlit, or unreliably marked, you’re out of luck. Such roads are often much more complicated to map and carry far less traffic, so companies have little incentive to develop 3-D maps for them anytime soon. From California’s Mojave Desert to Vermont’s Green Mountains, there are huge swaths of America that self-driving cars simply aren’t ready for.

One way around this is to create systems advanced enough to navigate without these maps. In an important first step, Rus and colleagues at CSAIL have developed MapLite, a framework that allows self-driving cars to drive on roads they’ve never been on before without 3-D maps.

MapLite combines simple GPS data of the kind you’d find on Google Maps with a series of sensors that observe the road conditions. In tandem, these two elements allowed the vehicle to drive autonomously on multiple unpaved country roads in Devens, Massachusetts, reliably detecting the road more than 100 feet ahead. (As part of a collaboration with the Toyota Research Institute, researchers used a Toyota Prius that they outfitted with a range of LIDAR and IMU sensors.)

“The reason this kind of ‘map-less’ approach hasn’t really been done before is because it is generally much harder to reach the same accuracy and reliability as with detailed maps,” says CSAIL graduate student Teddy Ort, who was a lead author on a related paper about the system. “A system like this that can navigate just with on-board sensors shows the potential of self-driving cars being able to actually handle roads beyond the small number that tech companies have mapped.”

The paper, which will be presented in May at the International Conference on Robotics and Automation (ICRA) in Brisbane, Australia, was co-written by Ort, Rus, and PhD graduate Liam Paull, who is now an assistant professor at the University of Montreal.

For all the progress that has been made with self-driving cars, their navigation skills still pale in comparison to humans’. Consider how you yourself get around: If you’re trying to get to a specific location, you probably plug an address into your phone and then consult it occasionally along the way, like when you approach intersections or highway exits.

However, if you were to move through the world the way most self-driving cars do, you’d essentially be staring at your phone the entire time you walked. Existing systems still rely heavily on maps, using sensors and vision algorithms mainly to avoid dynamic objects like pedestrians and other cars.

In contrast, MapLite uses sensors for all aspects of navigation, relying on GPS data only to obtain a rough estimate of the car’s location. The system first sets both a final destination and what the researchers call a “local navigation goal,” which has to be within view of the car. Its perception sensors then generate a path to that point, using LIDAR to estimate the location of the road’s edges. MapLite can do this without physical road markings by making the basic assumption that the road surface will be flatter than the surrounding terrain.
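To make that idea concrete, here is a minimal Python sketch of the flatness assumption and the choice of a local navigation goal. The function names, grid resolution, and thresholds are illustrative assumptions rather than values from the MapLite paper, and the real system fits a smooth, drivable path instead of simply picking the farthest detected road point.

import numpy as np

def estimate_road_cells(points, cell=0.5, flat_thresh=0.05):
    # points: (N, 3) LIDAR returns in the vehicle frame (x forward, y left, z up).
    # Bin the returns into a ground-plane grid and keep cells whose height spread
    # is small -- a stand-in for the assumption that the road surface is flatter
    # than the surrounding terrain (cell size and threshold are made up).
    ij = np.floor(points[:, :2] / cell).astype(int)
    road_centers = []
    for key in set(map(tuple, ij)):
        zs = points[(ij == key).all(axis=1), 2]
        if len(zs) >= 5 and zs.std() < flat_thresh:
            road_centers.append((np.array(key) + 0.5) * cell)
    return np.array(road_centers)

def local_navigation_goal(road_centers, route_heading, horizon=30.0):
    # Pick a detected road point roughly `horizon` meters ahead, in the rough
    # direction the coarse GPS route suggests; a planner would then steer toward it.
    if len(road_centers) == 0:
        return None
    direction = np.array([np.cos(route_heading), np.sin(route_heading)])
    along = road_centers @ direction
    candidates = road_centers[(along > 0) & (along < horizon)]
    return None if len(candidates) == 0 else candidates[np.argmax(candidates @ direction)]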

“Our minimalist approach to mapping enables autonomous driving on country roads using local appearance and semantic features such as the presence of a parking spot or a side road,” says Rus.

The team developed a system of models that are “parameterized,” which means that they describe multiple situations that are somewhat similar. For example, one model might be broad enough to determine what to do at intersections, or what to do on a specific type of road.
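As a rough illustration of what such a parameterized model might look like in code, the sketch below defines a single intersection model whose structure is fixed but whose parameter values (how many roads meet, at what headings, how wide) would be estimated online from sensor data. The class and field names are hypothetical and not taken from the paper.

from dataclasses import dataclass
import numpy as np

@dataclass
class IntersectionModel:
    # One parameterized model covering many concrete intersections: the structure
    # is fixed, while the parameter values are fitted to what the sensors see.
    n_branches: int            # how many roads meet here
    branch_angles: np.ndarray  # heading of each branch, in radians
    width: float               # nominal road width, in meters

    def branch_centerline(self, i, length=20.0, step=1.0):
        # Points along the centerline of branch i, for matching against the
        # road region detected by the LIDAR flatness test.
        s = np.arange(0.0, length, step)
        return np.stack([s * np.cos(self.branch_angles[i]),
                         s * np.sin(self.branch_angles[i])], axis=1)

# A three-way (T) intersection is just one setting of the parameters:
t_junction = IntersectionModel(
    n_branches=3,
    branch_angles=np.array([0.0, np.pi / 2, np.pi]),
    width=6.0,
)

Because the number of branches is an explicit parameter in a model like this, a question such as “how many roads merge here?” reduces to reading off a fitted value rather than interpreting the output of a learned network.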

This sets MapLite apart from other map-less driving approaches, which rely more heavily on machine learning: those systems train on data from one set of roads and are then tested on others.

“At the end of the day we want to be able to ask the car questions like ‘how many roads are merging at this intersection?’” says Ort. “By using modeling techniques, if the system doesn’t work or is involved in an accident, we can better understand why.”

MapLite still has some limitations. For example, it isn’t yet reliable enough for mountain roads, since it doesn’t account for dramatic changes in elevation. As a next step, the team hopes to expand the variety of roads the vehicle can handle, with the ultimate goal of matching the performance and reliability of map-based systems across a much wider range of roads.

“I imagine that the self-driving cars of the future will always make some use of 3-D maps in urban areas,” says Ort. “But when called upon to take a trip off the beaten path, these vehicles will need to be as good as humans at driving on unfamiliar roads they have never seen before. We hope our work is a step in that direction.”

This project was supported, in part, by the National Science Foundation and the Toyota Research Institute.




MIT News




