Radar vs. LIDAR: Elon Musk commits to radar with Tesla software update v8.0
Posted on September 12, 2016 by Matt Pressman
Yesterday, CEO Elon Musk announced some major changes coming in Tesla Motors' [NASDAQ: TSLA] forthcoming software update v8.0. Contrary to rumors about LIDAR, Tesla is putting a far greater emphasis on radar moving forward with its Autopilot. Slash Gear* reports, "The radar system was added as a standard-fit component to all Tesla cars manufactured from October 2014 onward, though its role until now has been fairly minor. Indeed, Tesla’s initial expectation was that it would serve as a supplementary sensor, with the primary camera and corresponding image processing system still the central source of Autopilot data." This all changes with software update v8.0.
Image: Inside EVs
In fact, v8.0 will put radar front-and-center. And this was previously foreshadowed: "Musk teased that new implementations of the radar could, in fact, see the sensor take a far more prominent role. That refinement would help avoid the significant expense of LIDAR laser scanners, which have been widely used on autonomous vehicle prototypes but are currently prohibitively costly for production cars." To that end, there were rumors that Tesla might move toward LIDAR, as spy shots recently revealed Tesla Model S vehicles with roof-mounted LIDAR sensors near the company's Palo Alto headquarters.
Above: Misleading Tesla LIDAR spy shots — Top: Two puck-style LIDAR sensors (possibly Velodyne) mounted on the front and rear of a Model S near HQ last week; Bottom: the clunky roof-mounted LIDAR device more commonly seen with Google's self-driving test vehicles on a Model S also seen near Tesla's HQ in July (Source: Teslarati - top, bottom)
However, Musk is committing to radar, not LIDAR: "Autopilot version 8, though, will do something similar – and, indeed, Musk argues better – to LIDAR only with far more affordable radar." Regardless of cost, Musk elaborates on LIDAR's limitations: "We do not envisage using LIDAR... LIDAR does not penetrate occlusions, it does not penetrate rain, dust, or snow, whereas radar does. LIDAR doesn’t bounce, you can’t look at things in front of the car in front of you." Radar, on the other hand, offers these inherent (and significant) advantages.
Image: Inside EVs
That said, the real question for software update v8.0 should've been: radar vs. camera? Currently, radar takes a backseat to Tesla's front-facing camera and image processing capabilities. With the forthcoming v8.0, radar will now become the primary player for Autopilot. Why wasn't this the case before? "Using radar in this new way is tricky, Tesla highlights, because it can be easy to trigger a false-positive for an obstacle ahead. People are seen, though only as translucent objects; wood or painted plastic objects aren’t seen at all. In contrast, metal objects are visible, but how they’re seen by radar sensors is heavily impacted by their shape. Most notably, if they’re concave that will amplify the returned signal from the radar, making any object look much bigger than it actually is."
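The false-positive problem quoted above can be made concrete with a toy sketch (invented names and thresholds, not Tesla's code): if braking were decided frame-by-frame from return amplitude alone, a small concave object whose shape amplifies the echo would be indistinguishable from a genuinely large obstacle.

```python
# Hypothetical illustration (not Tesla's code): why a naive, single-frame
# threshold on radar return strength triggers false positives. A concave
# metal object (like the base of a soda can) reflects far more energy than
# its size suggests, so amplitude alone cannot separate it from a real obstacle.

BRAKE_THRESHOLD = 5.0  # arbitrary units of returned signal strength

def naive_should_brake(return_strength: float) -> bool:
    """Single-frame decision using only signal amplitude."""
    return return_strength >= BRAKE_THRESHOLD

# A stalled car and a concave soda-can base can both exceed the threshold:
stalled_car = 8.0      # genuinely large reflector
soda_can_base = 9.5    # small object, but concave shape amplifies the echo

print(naive_should_brake(stalled_car))    # True - correct
print(naive_should_brake(soda_can_base))  # True - false positive
```

This is exactly the failure mode v8.0's temporal processing is meant to address: a single amplitude reading carries no information about shape or motion.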
Image: Business Insider via Tesla
Musk pointed out an example of this: "even the base of a soda can on the roadway could look so huge to traditional radar that, if the car was relying on that sensor alone, emergency braking could be triggered." So how will Tesla transition Autopilot to emphasize radar in its more critical role? "Tesla’s implementation of radar will roll out progressively. First up is an eight-fold increase in information sourced from the radar sensor, building up a far more accurate point cloud of the environment around the car. By combining those snapshots, taken once every tenth of a second, a 3D model can be created: Tesla’s software then looks for movement within that model, to help ascertain whether something really does present a possible accident risk."
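The snapshot-combining idea described above can be sketched as follows; this is a simplified illustration under assumed data shapes (a single tracked distance per frame), not Tesla's implementation. The key point is that a decision is made from motion across several 0.1 s frames, not from one frame's return.

```python
# Hypothetical sketch of temporal radar processing: accumulate snapshots taken
# every 0.1 s, then flag a return as a collision risk only if its distance
# shrinks consistently across frames (i.e., the object is actually closing).

from collections import deque

class RadarTracker:
    def __init__(self, window: int = 5):
        # Keep the last `window` snapshots (0.5 s of history at 10 Hz).
        self.snapshots = deque(maxlen=window)

    def add_snapshot(self, distance_m: float) -> None:
        """Record one frame: distance (meters) to the tracked return."""
        self.snapshots.append(distance_m)

    def is_closing(self, min_speed_mps: float = 2.0) -> bool:
        """True if the gap shrinks by at least min_speed_mps in every frame."""
        if len(self.snapshots) < 2:
            return False
        dt = 0.1  # seconds between snapshots
        frames = list(self.snapshots)
        deltas = [a - b for a, b in zip(frames, frames[1:])]
        # A positive delta means the distance decreased between frames.
        return all(d / dt >= min_speed_mps for d in deltas)

# Five frames of a return closing at ~5 m/s:
tracker = RadarTracker()
for d in (30.0, 29.5, 29.0, 28.5, 28.0):
    tracker.add_snapshot(d)
print(tracker.is_closing())  # True
```

A stationary overhead sign produces zero frame-to-frame movement and would not be flagged, which is the behavior the quoted passage attributes to the 3D-model approach.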
Image: Slash Gear* via Tesla
But how does this work in the real world? It's complicated, "where there are dips or rises that bring road surface or overhead highway signs into the radar’s path. Since GPS data isn’t sufficiently detailed to include those changes in orientation, Tesla has to supplement it with other information. As every Model S and Model X is connected to Tesla’s cloud, fleet learning can fill in that gap. The first stage of Autopilot version 8’s implementation will be for cars to spot and record any road signs, bridges, and other stationary objects to build up a real-world map that’s shared by all vehicles."
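The fleet-learning mechanism described above can be sketched in miniature (all names and coordinates here are invented for illustration; this is not Tesla's code): cars report the GPS locations of stationary returns such as bridges and overhead signs, and the shared map lets every vehicle suppress braking for known-harmless fixtures at those spots.

```python
# Hypothetical sketch of a fleet-learned whitelist: stationary radar returns
# are geocoded and shared, so a strong return at a known fixed-object location
# does not trigger braking, while the same return elsewhere still does.

# Whitelist shared across the fleet, keyed by coarse (lat, lon) bins.
fleet_whitelist: set = set()

def _bin(lat: float, lon: float) -> tuple:
    """Quantize GPS coordinates so nearby reports share one map cell (~11 m)."""
    return (round(lat, 4), round(lon, 4))

def report_stationary_object(lat: float, lon: float) -> None:
    """A car uploads the location of a stationary overhead/roadside return."""
    fleet_whitelist.add(_bin(lat, lon))

def should_brake(strong_return: bool, lat: float, lon: float) -> bool:
    """Brake on a strong return unless the fleet map marks a known fixture."""
    return strong_return and _bin(lat, lon) not in fleet_whitelist

# One car reports an overpass; the whole fleet then ignores it:
report_stationary_object(37.3947, -122.1503)  # invented example coordinates
print(should_brake(True, 37.3947, -122.1503))  # False - known fixture
print(should_brake(True, 37.5000, -122.2000))  # True - unknown obstacle
```

This mirrors the quoted rollout plan: the first stage only builds the map of stationary objects, and the suppression behavior benefits from every connected car's reports.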
Big picture — how will this impact Tesla Autopilot safety? Electrek reports, "Elon Musk sees a 3x potential increase in safety. Musk said that the data already suggests a 50% reduction in the probability of having an accident when using Autopilot versus manual driving, but the new radar processing technology could do much better over time with Tesla’s fleet learning capability... [and] Tesla now adds ~1.5 million Autopilot miles per day. The new data added to Tesla’s fleet learning capacity every day is what will enable a 3x improvement in safety, according to Musk."
*Source: Slash Gear