Tesla Autopilot is artificial intelligence and evidence of a HUD [Video]

Guest Blog Post: Trevor Page is the founder of the Model 3 Owners Club online forum, a site dedicated to Tesla Model 3 reservation holders and eventual owners. He also has a popular YouTube channel discussing all things Tesla and EVs. You can also follow him on Twitter.

In case you missed it, a few days ago Tesla Motors [NASDAQ: TSLA] released a new video (featured below) of a Model X equipped with Autopilot 2.0 hardware navigating an urban environment and finally ending up at Tesla headquarters. Most importantly, the video shows what the new NVIDIA DRIVE PX 2 supercomputer is actually seeing: obstacles, pedestrians, street signs and lights, and drivable surfaces are all detected in real time.

Image: Teslarati
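
To get a feel for what that overlay represents, here is a minimal sketch of frame-by-frame object detection with an off-the-shelf pretrained network. This is not Tesla's pipeline; the model choice, the input file name, and the confidence threshold are all illustrative assumptions:

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
import cv2  # pip install opencv-python

# Generic pretrained detector standing in for a real perception network
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

cap = cv2.VideoCapture("dashcam.mp4")  # hypothetical input clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV frames are BGR
    with torch.no_grad():
        detections = model([to_tensor(rgb)])[0]
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < 0.6:  # arbitrary confidence threshold
            continue
        x1, y1, x2, y2 = box.int().tolist()
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("what the network sees", frame)
    if cv2.waitKey(1) == 27:  # press Esc to stop
        break
cap.release()
cv2.destroyAllWindows()
```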

This is accomplished with two extremely high-powered graphics processing units (GPUs) running deep learning neural networks, a form of narrow artificial intelligence. This branch of computer science, while not really new, is benefitting from external advances in computing power and is particularly well suited to pattern recognition. Real-world examples of this technology exist in your phone: Siri and Google Now already use deep learning for speech recognition, and Google Photos uses image pattern recognition to instantly find (and recognize) anything from a cat to a car that you photographed. This last example is essentially the same application Tesla is using for Autopilot.
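
For a concrete sense of that Google Photos-style recognition, here is a minimal sketch that classifies a single photo with a pretrained convolutional network. The model and file name are assumptions for illustration only:

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet preprocessing expected by the pretrained classifier
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(pretrained=True).eval()
img = preprocess(Image.open("cat.jpg")).unsqueeze(0)  # hypothetical photo

with torch.no_grad():
    probs = torch.softmax(model(img)[0], dim=0)
top = int(probs.argmax())
print(f"most likely ImageNet class index: {top} ({probs[top]:.1%})")
```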


Image: NVIDIA

NVIDIA has developed a complete stack (hardware, AI software, development kits) that allows for fully autonomous driving, and Tesla is employing it, surely along with some Tesla “secret sauce,” to take their cars to full Level 5 autonomy. This is a critical piece that could allow Tesla to bring the car-sharing plan from Master Plan, Part Deux to market sooner than expected.


Image: NVIDIA

By feeding enormous data sets into NVIDIA DGX-1 supercomputers, Tesla can train an AI model that is then downloaded over-the-air to any Tesla equipped with the DRIVE PX 2 supercomputer. The cars can then do their own analysis and upload that data over-the-air back to Tesla headquarters to further improve the system in a closed loop. The system actually gets better and smarter over time.
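
Here is a toy sketch of that closed loop. Every class and method name below is a hypothetical placeholder (there is no public Tesla or NVIDIA API for this); it only illustrates the shape of the train, deploy, collect, retrain cycle:

```python
import random

class TrainingCluster:
    """Stands in for the central DGX-1 training cluster."""
    def __init__(self):
        self.dataset = []
    def add_data(self, samples):
        self.dataset.extend(samples)
    def train(self, model_version):
        # Real training would happen here; we just bump the version number.
        return model_version + 1

class Car:
    """Stands in for a DRIVE PX 2-equipped car in the fleet."""
    def __init__(self, vin):
        self.vin = vin
        self.model_version = 0
    def install_model(self, version):  # over-the-air update
        self.model_version = version
    def collect_logs(self):  # data gathered while driving
        return [f"{self.vin}-frame-{i}" for i in range(random.randint(1, 5))]

cluster = TrainingCluster()
fleet = [Car(vin) for vin in ("CAR-001", "CAR-002")]
model = 0
for round_num in range(3):  # in reality the cycle repeats indefinitely
    model = cluster.train(model)              # 1. train centrally
    for car in fleet:
        car.install_model(model)              # 2. push the model OTA
    for car in fleet:
        cluster.add_data(car.collect_logs())  # 3. upload fleet data back
    print(f"round {round_num}: model v{model}, "
          f"{len(cluster.dataset)} samples collected")
```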


YouTube: Tesla

I think this is precisely why Tesla opted to deliver the hardware now in every Model S and Model X built since the beginning of November: it allows the AI model to run in “shadow mode,” learning from real-world driving conditions and helping to improve the system. This is a very different approach from that of others such as Google and Volvo, who have stated they only want to deliver something close to perfect. Elon has made it very clear that waiting for the “perfect” approach runs contrary to trying to make cars safer.
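
As a rough illustration, assuming “shadow mode” amounts to computing what the model would do, never actuating anything, and logging where it disagrees with the human driver, a sketch might look like this (all names are hypothetical, not Tesla's):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float
    human_steering: float  # what the driver actually did (degrees)

def model_steering(frame: Frame) -> float:
    """Placeholder for the neural network's proposed steering angle."""
    return 0.0  # a real model would infer this from camera/radar input

def shadow_mode(frames, tolerance=5.0):
    """Run the model passively and collect disagreements for upload."""
    disagreements = []
    for f in frames:
        proposed = model_steering(f)  # computed, but never sent to actuators
        if abs(proposed - f.human_steering) > tolerance:
            disagreements.append((f.timestamp, f.human_steering, proposed))
    return disagreements  # the interesting cases the fleet uploads back

log = shadow_mode([Frame(0.0, 1.2), Frame(0.1, 12.0)])
print(f"{len(log)} disagreement(s) flagged for upload")
```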


Image: NVIDIA

During my research into the DRIVE PX 2 computer, which is actually the second generation, I also came across a complementary subsystem called the DRIVE CX, a self-contained digital cockpit computer. It can drive up to 16.8 million pixels across up to three displays. NVIDIA again provides a full SDK that lets automakers create realistic renderings of what the DRIVE PX system “sees,” as well as all kinds of interfaces for a digital cockpit. We all know Tesla cars already have a digital cockpit, but those displays were developed in-house by Tesla using much older Tegra 2 and 3 chips. Going forward, it would make sense for Tesla to also adopt the new DRIVE CX system to deliver an enhanced Autopilot information display, especially given that the Model 3 will only come with a single horizontal display. Where will this information be displayed?
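
To put the 16.8-million-pixel figure in perspective, here is a quick back-of-the-envelope calculation; the display resolutions are my own illustrative assumptions, not a known Tesla configuration:

```python
BUDGET = 16_800_000  # pixels the DRIVE CX can drive, per NVIDIA

# Hypothetical three-display cockpit, purely for illustration
displays = {
    "center touchscreen (4K UHD)": (3840, 2160),
    "instrument cluster":          (1920, 1200),
    "HUD projector":               (1280,  720),
}

total = sum(w * h for w, h in displays.values())
for name, (w, h) in displays.items():
    print(f"{name}: {w * h / 1e6:.2f} MP")
print(f"total: {total / 1e6:.2f} MP of {BUDGET / 1e6:.1f} MP "
      f"({total / BUDGET:.0%} of the budget)")
```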

A case for an augmented-reality HUD

Since the Model 3 was first revealed, it’s been speculated that a HUD (heads-up display) is the only thing that makes sense given the lack of a center instrument cluster. However, recent hires by Tesla, like Milan Kovac of SKULLY, a maker of motorcycle helmets with an integrated HUD, and Félix Godard, an interior designer of the Porsche Mission E concept car (which has holographic displays), point to Tesla going the extra mile to offer a completely digital cockpit of the future. Elon’s tweet about the control systems of the Model 3 that “look like a spaceship” is starting to become clearer.

Twitter: Elon Musk

Which type of HUD Tesla might offer on the Model 3 (and other cars) is still unknown. Two types of HUDs are available on the market: a combiner HUD, which uses mirrors to project information onto a clear glass or plastic sheet, and a windshield HUD, which projects directly onto the windshield. The latter takes up more space in the dash because it needs optics to correct for the curvature of the windshield, and it also requires special coatings and anti-reflective measures to prevent interior reflections and glare from obscuring the projection. Maybe the new Tesla Glass that Elon mentioned for the Model 3 offers some sort of solution for this?
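
As a rough illustration of why those projection optics matter, here is a simplified single-mirror model using the standard thin-mirror equation; the numbers are illustrative, not taken from any production HUD:

```python
# Thin-mirror equation: 1/f = 1/d_o + 1/d_i. Placing the source display
# just inside the focal length of a concave mirror yields a magnified
# virtual image that appears to float out ahead of the driver.
def virtual_image_distance(focal_length_m, object_distance_m):
    """Solve the mirror equation for d_i (negative means a virtual image)."""
    return 1.0 / (1.0 / focal_length_m - 1.0 / object_distance_m)

f = 0.30    # focal length of the HUD's concave mirror (m), assumed
d_o = 0.25  # display placed just inside the focal length (m), assumed
d_i = virtual_image_distance(f, d_o)
print(f"virtual image appears {abs(d_i):.2f} m behind the optics "
      f"(magnification {abs(d_i / d_o):.1f}x)")
```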


Image: Continental

Given that Tesla's Model S and X are the technology leaders with higher profit margins, we could very well see this technology make it to those cars first, but one cannot know for sure until the final reveal of the Model 3. Maybe it won’t even make the first cut for the Model 3. Maybe it will be an optional upgrade? Hard to say. But we should know more in the spring, once the final Model 3 reveal happens.

Video Overview


YouTube: Model 3 Owners Club