Extended Reality (XR)

Extended reality (XR) is the merging of physical and digital worlds, enabling users to interact with virtual environments, integrate digital elements into reality, and create transformative experiences in entertainment, work, and beyond.

How XR Works

Extended reality, a cornerstone of spatial computing, works by blending the physical and digital worlds using immersive technologies like augmented reality (AR), virtual reality (VR), and mixed reality (MR). AR overlays digital content onto the real world, VR fully immerses users into a virtual environment, and MR seamlessly combines real and virtual elements, enabling interaction between the two. These experiences are powered by advanced hardware like NVIDIA GPUs, optimized drivers, and versatile SDKs, which process complex data in real time to deliver lifelike visuals, synchronized interactions, and responsive environments.
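As a minimal, SDK-agnostic illustration of what "overlaying digital content onto the real world" involves computationally, the sketch below projects a virtual 3D anchor point into 2D pixel coordinates with a standard pinhole camera model. The intrinsics, pose, and point values are made-up placeholders, not taken from any particular device or API.

```python
# A minimal, SDK-agnostic sketch of the registration math behind an AR overlay:
# a virtual 3D point (in world coordinates) is projected into 2D pixel coordinates
# using the camera's pose and intrinsics, so digital content lands on the right
# spot in the live camera image. All numbers are illustrative placeholders.
import numpy as np

K = np.array([[800.0,   0.0, 640.0],   # camera intrinsics: focal lengths and
              [  0.0, 800.0, 360.0],   # principal point for a 1280x720 image
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # camera orientation (world -> camera)
t = np.array([0.0, 0.0, 2.0])          # camera translation: scene sits 2 m ahead

p_world = np.array([0.1, -0.2, 0.0])   # virtual object's anchor point in the world
p_cam = R @ p_world + t                # transform into the camera frame
u, v, w = K @ p_cam                    # apply intrinsics
pixel = (u / w, v / w)                 # perspective divide -> pixel coordinates
print(pixel)                           # where to draw the overlay in the frame
```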

Explore XR Technologies

NVIDIA CloudXR

NVIDIA CloudXR™ is a solution for streaming virtual, augmented, and mixed reality content from any OpenVR XR application running on a remote server, whether in the cloud, a data center, or at the edge.

Get Started With CloudXR

NVIDIA Omniverse

NVIDIA Omniverse™ is a development platform for building 3D applications and services that enable seamless collaboration and creation in immersive 3D environments.

Get Started With Omniverse
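
Omniverse is built on OpenUSD, so its applications and services exchange scenes as USD stages. As a small illustration, using the open-source usd-core Python package rather than any Omniverse-specific API, the sketch below authors a simple stage that an Omniverse application could then open, extend, or stream; the file name and prim paths are assumptions made for the example.

```python
# A minimal OpenUSD authoring sketch using the open-source usd-core package
# (pip install usd-core); the file name and prim paths are illustrative only.
from pxr import Gf, Usd, UsdGeom

stage = Usd.Stage.CreateNew("xr_scene.usda")
world = UsdGeom.Xform.Define(stage, "/World")      # root transform prim
cube = UsdGeom.Cube.Define(stage, "/World/Cube")   # placeholder geometry
cube.GetSizeAttr().Set(2.0)                        # 2-unit cube
UsdGeom.XformCommonAPI(cube.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 1.0, 0.0))
stage.SetDefaultPrim(world.GetPrim())
stage.GetRootLayer().Save()                        # writes xr_scene.usda to disk
```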

NVIDIA Professional VR Ready Solutions

NVIDIA VR Ready systems deliver the graphics performance needed for smooth, immersive virtual reality experiences.

Get Started With VR Ready

NVIDIA VR Capture and Replay

NVIDIA Virtual Reality Capture and Replay (VCR) enables developers to accurately capture and replay VR content for performance testing, scene quality control, and more.

Get Started With VCR

NVIDIA VRWorks

NVIDIA VRWorks™ is a suite of APIs and libraries that helps VR application and headset developers improve VR performance and visual quality.

Get Started With VRWorks

NVIDIA Warp

Warp enables Python developers to create GPU-accelerated 3D simulation workflows that drive ML pipelines in PyTorch, JAX, Modulus, and NVIDIA Omniverse™.

Get Started With Warp
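
Warp kernels are ordinary Python functions decorated with @wp.kernel and launched over arrays that live on the GPU or CPU. The sketch below shows that basic pattern, assuming the warp-lang package is installed; the kernel and data are illustrative, not part of any NVIDIA sample.

```python
# A minimal Warp sketch: scale a point cloud on the default device (a CUDA GPU
# when available, otherwise the CPU). The kernel and sizes are illustrative only.
import numpy as np
import warp as wp

wp.init()

@wp.kernel
def scale_points(points: wp.array(dtype=wp.vec3), factor: float):
    tid = wp.tid()                        # one thread per point
    points[tid] = points[tid] * factor

n = 1024
points = wp.array(np.random.rand(n, 3).astype(np.float32), dtype=wp.vec3)
wp.launch(scale_points, dim=n, inputs=[points, 2.0])
print(points.numpy()[:3])                 # copy back to NumPy for inspection
```

Because Warp arrays interoperate with framework tensors (for example, via utilities such as wp.to_torch), the same simulation data can feed the PyTorch and JAX pipelines mentioned above without leaving the GPU.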

Developer Starter Kits

Explore templates and resources to develop solutions that utilize XR.

Spatial Streaming for Omniverse Digital Twins

Learn more about streaming immersive OpenUSD-based Omniverse digital twins to the Apple Vision Pro with this reference workflow.

Robotic Data Capture and Generation

Explore how Jetson AGX Thor, Cosmos, Isaac GR00T, and Apple Vision Pro’s spatial computing power advance humanoid robotics in this blueprint.

The GR00T-Teleop stack is in invite-only early access. Join our Humanoid Developer Program to be notified when it becomes available in beta.

XR Learning Resources