NVIDIA DRIVE DEVELOPER FAQ
NVIDIA DRIVE Hyperion is an AV development platform and reference architecture for developing Level 2+ and Level 3 highway autonomous solutions. It consists of a complete sensor suite that's tuned, optimized, and safety-certified, as well as a high-performance AI computing platform, NVIDIA DRIVE AGX.
The NVIDIA DRIVE AGX Developer Kit provides the hardware, software, and sample applications needed for the development of production-level AVs. The DRIVE AGX system is built on production, auto-grade silicon and engineered with security in mind, featuring an open software framework.
The NVIDIA DRIVE Software Development Kit (SDK) is a collection of software packages used for developing autonomous vehicles. It comprises the foundational NVIDIA DRIVE OS and DriveWorks SDK, as well as advanced applications such as highly automated supervised driving (DRIVE AV) and AI cockpit (DRIVE IX).
NVIDIA DRIVE OS is a foundational software stack consisting of an embedded real-time OS (RTOS), a hypervisor, NVIDIA DriveWorks, NVIDIA CUDA libraries, NVIDIA TensorRT, and other modules that give you access to the hardware engines. DRIVE OS provides a safe and secure execution environment, with capabilities such as secure boot, security services, a firewall, and over-the-air updates. It also provides a real-time environment, with the RTOS and hypervisor delivering Quality of Service (QoS). The RTOS, AUTOSAR, and hypervisor are ASIL D components.
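Applications built on DRIVE OS typically call these libraries directly. As a rough illustration, the sketch below loads a pre-built TensorRT engine with the standard TensorRT C++ runtime API (assuming TensorRT 8.x); the engine file name is a placeholder, and this is the general TensorRT workflow rather than a DRIVE-specific API.

```cpp
// Minimal sketch: deserialize a pre-built TensorRT engine (TensorRT 8.x API).
// "model.engine" is a hypothetical engine file built offline (e.g., with trtexec).
// Error handling is trimmed for brevity.
#include <NvInfer.h>

#include <fstream>
#include <iostream>
#include <memory>
#include <vector>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cerr << msg << std::endl;
    }
};

int main() {
    // Read the serialized engine from disk.
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    Logger logger;
    std::unique_ptr<nvinfer1::IRuntime> runtime{nvinfer1::createInferRuntime(logger)};
    std::unique_ptr<nvinfer1::ICudaEngine> engine{
        runtime->deserializeCudaEngine(blob.data(), blob.size())};
    std::unique_ptr<nvinfer1::IExecutionContext> context{engine->createExecutionContext()};

    std::cout << "Engine and execution context created" << std::endl;
    return 0;
}
```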
NVIDIA DriveWorks is a middleware framework that consists of a library of software modules, sample applications, and tools to enable software development for autonomous vehicles. It leverages the computing power of the DRIVE AGX platforms and is designed to be open, modular, and compliant with automotive industry software standards such as ISO 26262 and MISRA C. DriveWorks is available with DRIVE OS releases.
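For orientation, here is a minimal sketch of how an application typically brings up a DriveWorks context, the handle against which DriveWorks modules (sensors, image processing, DNN, etc.) are created. Header paths and exact signatures vary between DriveWorks releases, so treat this as illustrative rather than canonical.

```cpp
// Minimal sketch: initialize and release a DriveWorks context.
// Assumes a recent DriveWorks release; error handling is trimmed for brevity.
#include <dw/core/Context.h>

#include <cstdio>

int main() {
    dwContextParameters params{};              // default context parameters
    dwContextHandle_t context = DW_NULL_HANDLE;

    // DW_VERSION ties the binary to the headers it was compiled against.
    dwStatus status = dwInitialize(&context, DW_VERSION, &params);
    if (status != DW_SUCCESS) {
        std::printf("dwInitialize failed: %s\n", dwGetStatusName(status));
        return 1;
    }

    // ... create sensors, image processing, or DNN modules against `context` here ...

    dwRelease(context);
    return 0;
}
```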
NVIDIA DRIVE IX is an SDK for developing in-vehicle applications for safety, comfort, and an enhanced user experience. For more information on how to get started, refer to our DRIVE IX developer page.
The DRIVE AGX Xavier and Pegasus XT Developer Kits are available directly from NVIDIA and through authorized sublease partners. Software releases and materials are published in the NVIDIA DRIVE AGX SDK Developer Program. Your company or university must have current legal agreements on file. Please contact your Account Representative (or contact us) to ensure the necessary agreements have been signed before requesting to join the NVIDIA DRIVE AGX SDK Developer Program. Users may only join with an approved corporate or university email address.
The NVIDIA DRIVE AGX Orin Developer Kit will be available directly from NVIDIA or through authorized distributors. Users must register their DRIVE AGX Orin Developer Kit to get access to the NVIDIA DRIVE AGX SDK Developer Program, where software releases and materials will be published. Users may only join with an approved corporate or university email address.
If you've received your NVIDIA DRIVE AGX DevKit, congratulations! The Onboarding Guide provides an overview of all the resources you need to start development.
The DRIVE AGX platforms are compatible with a number of sensors and peripherals from different vendors. A full list is available on the DRIVE Ecosystem Hardware and Software Components page.
While both the Jetson AGX and DRIVE AGX kits use the same Orin SoC, DRIVE is built for automotive applications. The following table highlights the differences between the two platforms.
| | DRIVE AGX Orin Developer Kit | Jetson AGX Orin Developer Kit |
| --- | --- | --- |
| I/O and networking | 10GbE RJ45; 100BASE-T1, 1000BASE-T1, and 10GBASE-T1 automotive Ethernet; CAN, LIN, FlexRay, USS | 10GbE RJ45; M.2 Key E (WLAN/BT, PCIe, USB 2.0, UART, I2S & I2C) |
| Storage | | MicroSD; M.2 Key M (NVMe) |
| Safety and security | Comprehensive safety and security services: functional safety island integration, streamlined safety framework, platform security controller integration, secure boot, and PKCS #11 support for crypto acceleration | Secure boot, trusted execution environment, disk and DRAM encryption, crypto acceleration, hardware-based security fuses |
| Software and tools | NvMedia, NvStreams, CUDA, cuDNN, TensorRT, Triton Inference Server, NVIDIA Container Runtime; tools: Nsight Systems (performance analysis), Nsight Graphics (debug, profiler, and export with Direct3D), Nsight Eclipse Edition (IDE), Nsight Compute (interactive kernel profiler for CUDA applications) | Tools: Nsight Systems (performance analysis), Nsight Graphics (debug, profiler, and export with Direct3D), Nsight Eclipse Edition (IDE), Nsight Compute (interactive kernel profiler for CUDA applications) |
* Intended for production platform only
** Required compute engines must be selected so that the application runs within the power budget
*** GPU and DLA performance is increased on the non-auto-grade Orin SoC
The table below compares the DRIVE AGX Orin and DRIVE AGX Xavier Developer Kits.

| | DRIVE AGX Orin Developer Kit | DRIVE AGX Xavier Developer Kit |
| --- | --- | --- |
| Ethernet | 1GbE (x10), 10GbE (x2) | 1GbE (x7), 10GbE (x2) |
| AI libraries | TensorRT 8.x, cuDNN 8.x | TensorRT 6.4, cuDNN 7.6 |
| USB | USB 3.2 | 2x Type-C |
If you can’t find what you're looking for in the downloads, public documentation, or forum, please contact your NVIDIA representative or reach out directly.
Resources
Peek under the hood to experience NVIDIA’s latest autonomous driving innovations through DRIVE Labs and DRIVE Dispatch.