Robohub.org
 

Is Apple building a robocar? Maybe. Or maybe not.


by Brad Templeton
05 February 2015




There is great buzz about some sensor-laden vehicles being driven around the USA, which have been discovered to be owned by Apple. The vehicles have cameras, LIDARs, and GPS antennas, and many are wondering: is this an Apple self-driving car? See also the speculation from Cult of Mac.

Here’s a video of the vehicle driving around the East Bay (50 miles from Cupertino), but they have also been seen in New York.

We don’t see the front of the vehicle, but it sure has plenty of sensors. On the front and back you can see two Velodyne 32E LIDARs. These are 32-plane LIDARs that cost about $30K each. You can also see two GPS antennas and what appear to be cameras pointing in all directions. Unfortunately, these pictures don’t show the front of the vehicle, which is where the most interesting sensors would be.

So is this a robocar, or is this just a fancy mapping car? Rumours about Apple working on a car have been swirling for a while, but the absence of sightings of cars like this calls them into question. You can’t have an active program without also testing cars on roads. There are ways to hide LIDARs and even cameras to a degree (and Apple is super secretive, so they might), but this vehicle hides little.

Most curious are the Velodynes. They are tilted down significantly. The 32E unit sees from about 10 degrees up to 30 degrees down relative to its own axis. Tilting the unit this much means you don’t see out horizontally, which is not at all what you want for a self-driving car. These LIDARs are densely scanning the road area close to the car, and higher objects in the opposite direction. The rear LIDAR, because of its tilt, does see out horizontally toward the front, but it’s placed exactly where you wouldn’t put a sensor meant to watch the road ahead: a GPS antenna blocks its direct forward view. So if the goal of the rear LIDAR is to see ahead, it makes no sense.
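To see why the tilt matters so much, here is a rough back-of-the-envelope sketch. The mount height (2 m) and tilt (20 degrees) are my assumptions for illustration, not measurements; only the +10/-30 degree beam fan is a real 32E spec.

```python
import math

def ground_range(mount_height_m, beam_angle_deg):
    """Horizontal distance at which a beam hits flat ground.
    beam_angle_deg is measured from horizontal; negative = downward.
    A level or upward beam never reaches the ground."""
    if beam_angle_deg >= 0:
        return math.inf
    return mount_height_m / math.tan(math.radians(-beam_angle_deg))

def coverage(mount_height_m, tilt_deg):
    """Near/far ground coverage of a Velodyne 32E (beams span +10 to -30
    degrees from its own horizontal) pitched forward by tilt_deg."""
    top_beam = 10 - tilt_deg       # highest beam after tilting
    bottom_beam = -30 - tilt_deg   # lowest beam after tilting
    return (ground_range(mount_height_m, bottom_beam),
            ground_range(mount_height_m, top_beam))

# Untilted: the top beams look up and out, so the unit sees to the horizon.
near, far = coverage(2.0, 0)          # far == infinity
# Tilted ~20 degrees: every beam hits the ground within a dozen metres.
near_t, far_t = coverage(2.0, 20)     # far_t ~= 11.3 m
```

With the assumed tilt, the whole fan lands on the road within about 11 metres of the car, which is great for dense road-surface mapping but useless for spotting a car 60 metres ahead.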

We don’t see the front, so there might be another LIDAR up there, along with radars (often hidden in the grille), and those would be pretty important for any research car.

For mapping, these strange angles and blind spots are not an issue. You are trying to build a 3D and visible light scan of the world. What you don’t see from one point, you might see from another. For street mapping, what’s directly in front and behind is generally the road, and not especially interesting. But what’s to the side could be really interesting.

The car also has an accurate encoder on its wheel to give improved odometry. Both robocars and mapping cars are interested in precise position information.
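The point of a wheel encoder is simple: counting ticks of a known wheel turns directly into travelled distance, which can be fused with GPS and LIDAR. A minimal sketch, where the 4096-tick resolution and 0.65 m wheel diameter are purely hypothetical numbers:

```python
import math

def odometry_distance(ticks, ticks_per_rev, wheel_diameter_m):
    """Distance travelled implied by a wheel-encoder reading:
    revolutions times wheel circumference."""
    revolutions = ticks / ticks_per_rev
    return revolutions * math.pi * wheel_diameter_m

# Hypothetical 4096-tick encoder on a 0.65 m wheel, ten full revolutions.
d = odometry_distance(4096 * 10, 4096, 0.65)  # ~20.4 m
```

The appeal for both robocars and mapping cars is that, over short distances, this dead-reckoning signal is far smoother than raw GPS, whose fixes can jump by metres.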

Evidence that this is a robocar:

  • The Velodynes are expensive, high-end and more than you need for mapping (though if cost is no object, they are a decent choice).
  • Apple knows it’s being watched, and might try to make their robocar look like a mapping car.
  • There are other sensors that we can’t see.

Evidence that this is a mapping car:

  • As noted, the Velodynes are tilted in a way that really suggests mapping. (Ford uses tilted ones, but paired with horizontal ones.)
  • The cameras are aimed at the corners, not forward.
  • They are driving in remote locations. A robocar program eventually wants to test widely, but early testing is far more likely to stay close to home. Google has not done serious testing outside the Bay Area in spite of the size of its project.
  • Street View is a major advantage Google has over Apple, so it is not surprising that Apple might build its own.

I can’t make a firm conclusion, but the evidence so far leans toward it being a mapping car. Seeing the front (which I am sure will happen soon) will tell us more. Another option is that it could be a mapping car building advanced maps for a different, secret, self-driving car.





Brad Templeton, Robocars.com is an EFF board member, Singularity U faculty, a self-driving car consultant, and entrepreneur.







