16 Jan NVIDIA Resets The Self-Driving Bar Again With AutoPilot Supporting ‘Level 2+’
Last week I attended CES 2019 in Las Vegas and took in as much news as I possibly could from the leading companies in the consumer electronics, software, and semiconductor sectors. I am in what some would call the “recovery phase” of CES. Days started at 6AM and ended around midnight in the city that should have been called the “city that does not sleep.” GPU and deep learning powerhouse NVIDIA is one company that has been a big part of the CES conversation for years, given its leadership in gaming and autonomous driving, so let’s take a look at what the company announced at CES 2019. Moor Insights & Strategy analyst Anshel Sag will be doing a deeper dive on NVIDIA’s gaming-related announcements.
NVIDIA’s first big announcement from the conference was the unveiling of DRIVE AutoPilot, which the company calls “the world’s first commercially available Level 2+ automated driving system,” a description I believe is accurate. The impact on consumers could be significant, given that DRIVE AutoPilot’s cutting-edge AI technologies will allow supervised self-driving vehicles to go into production in 2020.
Level 2 today encompasses many of the ADAS systems out there, like Volvo’s Pilot Assist and Tesla’s Autopilot. L2 ADAS systems can control steering, acceleration, and braking at the same time, but unlike L3, L2 drivers still need to keep their hands on the wheel. NVIDIA’s DRIVE AutoPilot is an L2 “plus” system. The “plus” enables features like Volvo’s upcoming 360-degree surround perception and driver monitoring systems, both of which require deep learning.
DRIVE AutoPilot utilizes NVIDIA’s Xavier SoC processor and DRIVE software to process numerous deep neural networks for a variety of self-driving needs, such as perception and autopilot capabilities like highway merging, lane changes, and navigation. NVIDIA says Xavier delivers 30 trillion operations per second of processing capability while using only 30 watts of power. This performance level opens up a much broader range of automated driving features, with smart cockpit assistance and visualization capabilities that blow the current ADAS offerings out of the water.
The new system is enhanced both inside and outside the vehicle. For challenges outside the vehicle—recognizing pedestrians, traffic signs, lane markings, traffic lights, etc.—DRIVE AutoPilot utilizes the DRIVE AV software stack. For tasks inside the car, such as monitoring drivers for distraction or drowsiness, or providing visualization with augmented reality, DRIVE AutoPilot utilizes the DRIVE IX software stack.
“You will not get cockpit AI functionality with a run of the mill L2 system.”
NVIDIA says this is the first time that DRIVE IX and DRIVE AV have been combined into one system enabling the combined inside and outside functionality. Another key part of this is that DRIVE AutoPilot is an “open” platform, allowing developers to use all or just part of the DRIVE software stacks. Don’t confuse “open” with “open source” however.
Also significant is that DRIVE AutoPilot should make it easier for manufacturers to graduate to higher levels of autonomy beyond 2+: since the DRIVE platform has a unified architecture, the hardware can be upgraded (through the manufacturer) to the much more powerful AGX Pegasus, capable of a whopping 320 trillion operations per second. NVIDIA announced at the event that several of its partners are already developing solutions utilizing DRIVE Xavier processors and DRIVE software. ZF unveiled its ProAI scalable autonomous driving solution, set for production in 2020, which it says provides “a unique modular hardware concept and open software architecture.” Additionally, Continental announced that its own production-level automated driving architecture would begin production in 2020, powered by NVIDIA DRIVE.
DRIVE AutoPilot will enable NVIDIA to leverage all the work it has been doing on the higher levels of autonomy and bring it to market sooner. I count this as an expansion of its market, as it now has a very viable offering in this entry-level segment, and the industry will benefit from the added safety and convenience features that Level 2+ offers over currently available ADAS offerings.
Another big piece of NVIDIA auto-related news from CES was that Mercedes-Benz said it has chosen NVIDIA as its AI partner for developing next-generation vehicles. Together, the companies say, they will work to create a single computer architecture that provides smart-cockpit functions and self-driving capabilities, one that will eventually supplant the many smaller, separate processors inside current vehicles. NVIDIA and Mercedes-Benz say they have a shared vision that the future of autonomous driving will be software-defined. There was a lot of “motherhood and apple pie” in this announcement, but let me look under the proverbial hood and unpack it.
This is interesting for a few reasons. First off, it appears Mercedes is headed down an “ECU consolidation” path, opting for more centralized compute and control versus having disparate ECUs for different functions like power windows, cockpit, braking, infotainment, etc. The plan is to virtualize these functions the way a datacenter virtualizes applications, with built-in safety redundancy. As we saw with datacenter virtualization, I am expecting much higher performance and efficiency, and if Mercedes over-provisions resources, it will have room to upgrade and add functions down the road. Just like in a datacenter. Also interesting to me was the cockpit mention, which, as you may remember, was NVIDIA’s first automotive foray before it moved most of those resources to self-driving. So the way I see it, NVIDIA started in the cockpit, moved to safety and self-driving, and is now circling back to include the cockpit again, but this time based on AI. What’s old is new again.
RTX 2060 and new laptop models
While Anshel will dive deeper into gaming, I wanted to touch on the RTX 2060. NVIDIA spent pretty much all of its two-hour keynote announcing the new GeForce RTX 2060 GPU, based on the same Turing architecture as the 2080 and 2080 Ti. At a $349 price point, reviews show the new card is roughly 60% faster than the GTX 1060 on current titles, and it brings features previously relegated to higher-end RTX GPUs, like AI acceleration and ray tracing.
The 2060 features 240 Tensor Cores and 6GB of GDDR6 memory, and, with 52 teraflops of deep learning horsepower, is capable of DLSS (Deep Learning Super Sampling) and ray tracing acceleration. NVIDIA also announced that the new 2060 is available today and will arrive later in new laptop models from the likes of Acer, Alienware, Dell, HP, Lenovo, and more. Additionally, NVIDIA announced that a record number of new gaming laptops with GeForce RTX GPUs (2080 through 2060)—over 40, in over 100 different configurations—will be hitting the market this month.
The RTX 2060 is the volume ray tracing card, and as NVIDIA CEO Jensen Huang said in our media Q&A, “this (the 2060) is where everything comes together.” Jensen conceded the 2080 arrived early, but the 2060 is the right product at the right price with the right DLSS and ray tracing content. It was a good day for gamers.
NVIDIA always goes big at CES, and CES 2019 was no different. While automotive was front and center in the last three CES keynotes, Jensen led with gaming this year and moved automotive to days two and three. I am not putting a lot of significance into that move, other than that the company felt it needed to flex its gaming muscles more at CES, a move I think made investors and enthusiasts happier.
With the announcement of DRIVE AutoPilot, NVIDIA continues its push into the maturing automotive market, and with it, a degree of L2 differentiation and upgradability. NVIDIA’s partnerships and commitments from the likes of ZF, Continental, and Mercedes-Benz show it is gaining, and keeping, significant traction in this space. Beyond automotive, NVIDIA continues to raise the bar with new gaming GPUs like the GeForce RTX 2060. NVIDIA’s degree of success with the 2060 will be directly related to the value gamers place on ray tracing and DLSS acceleration, a content library that is getting better every week.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.