Over the last few years, if you're a gamer or PC hardware enthusiast, you've probably heard the term "ray-tracing" bandied about quite a bit. The term is much older than that, but with the advent of Microsoft's DirectX Raytracing (DXR) and NVIDIA's RTX branding, ray-tracing has hit the mainstream.
The original concept of "ray-tracing" for 3D images actually dates all the way back to the 16th century, but the first description of the idea in computer graphics came in 1968, with Arthur Appel's paper on ray casting. That earliest algorithm was very basic and only an approximation. Ray-tracing went through various evolutions over the years until, in 1986, Jim Kajiya presented a paper called "The Rendering Equation." In a handful of pages, Kajiya described a way to mathematically compute the physical properties of light, and it was this revelation that transformed 3D rendering.
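For the curious, Kajiya's equation states that the light leaving a surface point is that point's own emission plus all of the incoming light reflected toward the viewer:

```latex
L_o(x,\omega_o) = L_e(x,\omega_o) + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\, d\omega_i
```

Here L_o is outgoing radiance, L_e is emitted radiance, f_r is the surface's reflectance function (BRDF), and the integral gathers incoming light L_i over the hemisphere above the surface normal n. The recursion hiding inside L_i (every incoming ray is some other surface's outgoing light) is exactly what path tracing samples.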
Kajiya described his method as "path tracing," and it can be considered an evolution of simple ray tracing. Getting good quality out of a path tracer requires casting billions of rays, so until relatively recently it was considered a completely impractical approach for real-time rendering, though it has been used in films and other offline-rendered computer graphics for over 15 years.
The advantage of path tracing is that it is a "unified" rendering algorithm. This means that you don't have to compute separate lighting effects like ambient occlusion, reflections, soft shadows, and so on. All lighting is done through the path tracing algorithm. This has its disadvantages, too, particularly when trying to compute path tracing on modern GPUs that are largely optimized for traditional rasterization workloads, but it is overall a huge net gain for image quality and for artists.
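As a rough illustration of the idea (a toy model, not NVIDIA's renderer), a path tracer boils down to one loop: follow a ray, and at each bounce either collect emitted light or attenuate the path's "throughput" and keep bouncing. The scene probabilities below are made up purely for demonstration, but they show why billions of rays are needed: each path is a single noisy sample that must be averaged with many others.

```python
import random

MAX_BOUNCES = 30   # same per-ray cap NVIDIA's demo uses
P_LIGHT = 0.1      # toy scene: chance any bounce hits an emitter
ALBEDO = 0.5       # toy scene: diffuse reflectance of every surface

def trace_path(rng):
    """Trace one path, returning the radiance it carries back to the camera."""
    throughput = 1.0
    for _ in range(MAX_BOUNCES):
        if rng.random() < P_LIGHT:
            return throughput * 1.0   # hit a light (radiance 1.0); path ends
        throughput *= ALBEDO          # diffuse bounce: attenuate and continue
    return 0.0                        # ran out of bounces before finding light

def render(samples, seed=0):
    """Average many noisy path samples into a single pixel estimate."""
    rng = random.Random(seed)
    return sum(trace_path(rng) for _ in range(samples)) / samples
```

For this toy scene the exact answer works out to 0.1 / (1 - 0.9 * 0.5), about 0.18, and the Monte Carlo estimate converges toward it as the sample count grows. Notice that shadows, indirect light, and emitters all fall out of the same loop; that is the "unified" property described above.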
Doing full path-tracing in real-time isn't completely novel; that's
how Quake II RTX works, in fact. A decade ago, the Otoy Brigade team was doing software-based path-tracing. Both of these approaches make a lot of compromises, though. Brigade has limitations on the light interactions it can model, while
Quake II RTX only includes very simple scenes with basic geometry and performs a maximum of two bounces per ray with just a couple of lights in each scene.
Well, NVIDIA wants full-quality cinematic path-tracing in real-time, and at GTC 2022 the company showed off something very close to that. By mixing its RTXGI and RTXDI SDKs into a customized "research renderer," NVIDIA's graphics scientists have been able to achieve results that can fairly be described as "incredible."
These are scenes with up to three billion triangles that are fully path-traced (no rasterization) being rendered at "interactive" frame rates despite the renderer using up to thirty bounces per ray. Some of these example scenes have as many as five hundred thousand "mesh lights": essentially, individual polygons that are light sources. While these scenes don't necessarily look photographic, they're an entire grade above anything you'll see in a video game in 2022.
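Sampling half a million lights per bounce is only feasible with aggressive importance sampling; RTXDI is built on NVIDIA's ReSTIR family of resampling techniques. At the heart of that approach is weighted reservoir sampling, which picks one light from a stream of candidates in a single pass, proportional to each light's estimated contribution. A minimal sketch (the weights here are invented for illustration):

```python
import random

def reservoir_sample(weights, rng):
    """Pick one index from `weights` in a single streaming pass, with
    probability proportional to its weight (classic weighted reservoir)."""
    chosen, total = -1, 0.0
    for i, w in enumerate(weights):
        total += w
        # Replace the current pick with probability w / total_so_far.
        # After the loop, P(chosen == i) == weights[i] / sum(weights).
        if total > 0 and rng.random() < w / total:
            chosen = i
    return chosen
```

The key property is that memory use is constant no matter how many lights stream past, which is what makes "500,000 mesh lights" tractable on a GPU.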
Real-Time Path Tracing In Action - Credit: NVIDIA
There aren't a lot of caveats with this technique, either. It's not like Euclideon's "unlimited detail" renderer from
nearly a decade ago that was voxel-based and completely unsuited for games. This renderer uses standard 3D models, and fully supports animated meshes, physics, and post-processing effects, just like any other standard 3D renderer.
One way to do something faster is to work harder, but arguably a much smarter way is to reduce the amount of work you need to do. Much like with
DLSS, NVIDIA's approach to accelerating path tracing has been one of approximation. Of course, it relies heavily on denoising; the image below illustrates the effect of NVIDIA's advanced denoising filter with an evocative before-and-after comparison.
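NVIDIA's production denoisers are sophisticated spatiotemporal filters, but the simplest building block of the idea is temporal accumulation: blending each new noisy frame into a running average so that Monte Carlo noise cancels out over time. A bare-bones sketch with simulated per-frame noise (the values are made up for demonstration):

```python
import random

def accumulate(frames, alpha=0.1):
    """Exponentially blend noisy frames: history = lerp(history, frame, alpha).
    Trades a little temporal lag for a large reduction in noise."""
    history = frames[0]
    for frame in frames[1:]:
        history = (1.0 - alpha) * history + alpha * frame
    return history

def noisy_frames(truth, count, sigma, seed=0):
    """Simulate per-frame Monte Carlo noise around a ground-truth pixel value."""
    rng = random.Random(seed)
    return [truth + rng.gauss(0.0, sigma) for _ in range(count)]
```

The accumulated value hugs the ground truth far more tightly than any single raw frame does, which is why real-time ray-traced renderers can get away with so few samples per pixel.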
You aren't likely to see scenes of this complexity in games any time soon—most of these scenes are barely scraping 30 FPS on a
GeForce RTX 3090—but these rendering techniques are already available to developers with the full-source RTX Direct Illumination (RTXDI) and RTX Global Illumination (RTXGI) SDKs. Folks working in Unreal Engine can also grab binary plug-ins for that toolkit.
NVIDIA's researchers are also quick to admit that there's more to be done before this path-tracer can completely replace rasterized renderers. Because much of the optimization is based on discarding the most difficult work, very difficult scenes are still slow and ugly. Likewise, certain effects, like volumetrics, play hell with the algorithms. You can see in the image below how the fog looks stippled and noisy.
If you have PowerPoint—and you'll need the real thing, Google Docs barfs on the file—you can head to NVIDIA's site and download the full 660 megabyte
presentation to see the videos for yourself, but if you don't, you can check out smaller, compressed versions on Green Team researcher
Aaron Lefohn's Twitter. You can also check this NVIDIA blog post
to learn more about path tracing in general.