Microsoft today at GDC announced a raytracing feature for its DirectX 12 API, dubbed DirectX Raytracing.
DirectX 12 has been around for a few years now; it was revealed by Microsoft at GDC 2014, to be precise. However, DirectX 12 adoption has still not progressed very far. The foundations have been laid, though, and the rest of the job falls to game developers and hardware manufacturers, namely NVIDIA and AMD.
Just as NVIDIA is currently prepping its next architecture (Volta, or perhaps Ampere/Turing) for launch later this year, Microsoft's DirectX teams are hard at work on the next generation of graphics technology.
This morning at GDC 2018, as part of a coordinated release with NVIDIA, Microsoft announced a major new feature addition to the DirectX 12 graphics API: DirectX Raytracing (DXR). What exactly is ray tracing? We'll get to that in a moment.
DirectX Raytracing is meant to provide a standardized way for developers to implement ray tracing in a GPU-friendly manner. As an extension of DirectX 12, DXR is designed to integrate with traditional rasterization, allowing developers to mix the two rendering techniques to get the best out of DirectX 12. In performance-constrained scenarios ray tracing may be dropped in favor of the older approach, while if quality is the priority it can be leveraged.
Ray Tracing: A Brief History
Historically, ray tracing and path tracing have by far been the superior rendering techniques. By rendering a scene the way the human eye perceives it, they can produce a far more accurate image, especially when it comes to lighting and shadows. Ray tracing works much like our eyesight does, only in reverse.
Rays are cast out from the viewer to objects and then bounced from those objects to the rest of the world, ultimately determining the interactions between light sources and objects in a realistic manner. As a result, ray tracing has been the go-to method for high-quality rendering, particularly static images, movies, and even cinematics and cut-scenes in video games.
However, the performance overhead of ray tracing is more than just taxing, not only due to the various interactions with the environment but also the sheer number of rays involved. Just look at NVIDIA's HFTS (Hybrid Frustum Traced Shadows), which leverages ray tracing, and even that is not a full-fledged implementation, more of a raw, partial concept. Enabling it in a game can cut the frame rate by more than half. That is the performance impact of traditional ray tracing even in its crudest form.
There is a ray for every screen pixel, cast, reflected, refracted, and ultimately recursively generated many times over, bouncing from object to object and diffusing along others in order to determine all of the light and color values that ultimately contribute to a single pixel.
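The per-pixel, recursive process described above can be sketched in a few dozen lines. This is a deliberately minimal toy, assuming a single hard-coded sphere and point light; all names are hypothetical, and real engines (and DXR) operate on triangle meshes with acceleration structures rather than analytic spheres.

```python
import math

# Toy scene: one sphere and one point light (purely illustrative values).
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
LIGHT_POS = (2.0, 2.0, 0.0)

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale(v, s):
    return tuple(c * s for c in v)

def intersect_sphere(origin, direction):
    """Return distance t along the ray to the sphere, or None on a miss."""
    oc = sub(origin, SPHERE_CENTER)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # small epsilon avoids self-intersection

def trace(origin, direction, depth=0):
    """Cast one ray; on a hit, shade from the light and recurse for reflection."""
    t = intersect_sphere(origin, direction)
    if t is None or depth > 2:       # miss, or recursion limit reached
        return 0.1                   # flat background brightness
    hit = add(origin, scale(direction, t))
    normal = normalize(sub(hit, SPHERE_CENTER))
    diffuse = max(0.0, dot(normal, normalize(sub(LIGHT_POS, hit))))
    # Reflect the ray about the surface normal and recurse: this recursive
    # fan-out is exactly where the cost the article describes comes from.
    refl = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
    return 0.8 * diffuse + 0.2 * trace(hit, normalize(refl), depth + 1)

def render(width=8, height=8):
    """One primary ray per screen pixel, through a simple pinhole camera."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            u = (x + 0.5) / width * 2.0 - 1.0
            v = 1.0 - (y + 0.5) / height * 2.0
            row.append(trace((0.0, 0.0, 0.0), normalize((u, v, -1.0))))
        image.append(row)
    return image
```

Even this single-sphere toy does O(width × height × bounces) intersection tests per frame, which hints at why full scenes at 60 fps were out of reach without dedicated hardware.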
As you may have already concluded, this performance penalty is why ray tracing is not used in present-day games. Instead, real-time rendering is done using rasterization, a process that renders the screen using approximations. Most lighting, shadows, and post-processing is done in 2D using pre-defined maps of sorts.
This is what has made real-time rendering practical, but as expected, you trade quality for performance. Pixel and compute shaders do a decent enough job, yet they can't quite compete with ray tracing. This is evident if you compare NVIDIA's HFTS to NVIDIA's PCSS: the coverage, detail, and accuracy are much higher with HFTS.
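The "pre-defined maps" mentioned above are things like shadow maps: instead of tracing a ray to the light, the rasterizer compares each fragment's depth against a depth texture rendered from the light's point of view. A minimal sketch of that test, using a made-up 1-D "map" and hypothetical names purely for illustration:

```python
# Toy shadow-map lookup: the classic rasterization-era approximation.
# A real shadow map is a 2-D depth texture rendered from the light;
# here a tiny 1-D list stands in for it (illustrative values only).
SHADOW_MAP = [1.0, 1.0, 0.5, 1.0]  # depth of nearest occluder per light-space texel

def in_shadow(light_space_x, fragment_depth, bias=0.01):
    """Compare the fragment's light-space depth to the pre-baked occluder depth.

    No rays are cast at all: one texture fetch and one comparison, which is
    why it is fast, and why its accuracy is limited by map resolution and bias.
    """
    texel = min(len(SHADOW_MAP) - 1, max(0, int(light_space_x * len(SHADOW_MAP))))
    return fragment_depth > SHADOW_MAP[texel] + bias
```

Techniques like PCSS soften the result of this comparison with extra filtered samples, but they are still working from the same approximate map, which is why a partially ray-traced method like HFTS can beat them on accuracy.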
While the dominant rendering technique will remain rasterization, ray tracing is the uncontested winner as far as quality is concerned. For a long time there has been a lot of work going into merging ray tracing with rasterization in order to get the best of both rendering techniques. Combining rasterization's efficiency and existing development pipeline with the accuracy of ray tracing sounds like a promising venture, and it is.
While the implementation of this hybrid technique at the API level is up to developers, the most cost-effective approach would be to use ray tracing to render the shadows and lighting, followed by a final round of pixel shaders to integrate the two and add any remaining effects. This way we can make the most of ray tracing by exploiting its unmatched lighting and shadows without sacrificing the performance benefits of rasterization.
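The frame structure just described can be sketched as three ordered passes. Everything here is a hypothetical stub, not DXR API calls; the point is only the sequencing: a cheap rasterized geometry pass, an expensive ray-traced lighting pass restricted to shadows/lighting, and a final shader-style composite.

```python
# Hypothetical hybrid frame: rasterize first, ray trace only lighting,
# then merge in a final pass. All names and values are illustrative stubs.

def rasterize_gbuffer(scene):
    """Fast pass: per-pixel surface data (albedo, depth) via rasterization."""
    return [{"albedo": 0.5, "depth": 1.0} for _ in range(scene["pixels"])]

def raytrace_lighting(gbuffer):
    """Expensive pass: stand-in for one shadow ray per pixel.

    Returns 1.0 where the (stubbed) visibility test passes, 0.0 in shadow.
    """
    return [1.0 if px["depth"] < 2.0 else 0.0 for px in gbuffer]

def composite(gbuffer, lighting):
    """Final shader-style pass: merge both results, plus a small ambient floor."""
    return [min(1.0, px["albedo"] * lit + 0.05)
            for px, lit in zip(gbuffer, lighting)]

def render_frame(scene):
    g = rasterize_gbuffer(scene)          # rasterization does the bulk cheaply
    lit = raytrace_lighting(g)            # ray tracing only where it pays off
    return composite(g, lit)              # pixel shaders stitch it together
```

The design choice this sketch illustrates is budget allocation: the ray-traced pass touches only the effects where accuracy matters most, so its cost stays bounded while everything else keeps rasterization's speed.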
DirectX Raytracing is Microsoft's attempt to do just that: establish a universal rendering method for leveraging ray tracing alongside rasterization in the best possible way. As mentioned above, present GPUs can simulate a raw form of ray tracing, but since the hardware is not specialized for it, it is neither very efficient nor widespread.
So, to enable hardware-accelerated ray tracing and standardize it, Microsoft has made the commands available in the DirectX 12 API. However, just as with the adoption of DirectX 12 itself, it will be quite a while before ray tracing starts popping up in games.
At the base level, DXR will have a full fallback layer for working on existing DirectX 12 hardware. Microsoft is pitching the fallback layer to developers as a way to get started with ray tracing today: it lets them immediately try out the API and get familiar with it, giving them a basic idea of its pros and cons, so that when the necessary hardware hits the market they'll be ready and won't run into roadblocks.
Apart from inviting devs in, the fallback layer also allows any DirectX 12-capable hardware to leverage DirectX Raytracing, and it probably won't be long before we start seeing tech demos based on this new API feature.
Microsoft is merely doing the very basics; the legwork will have to be done by the GPU vendors. NVIDIA and AMD will have a lot to do, since the responsibility of optimizing and making sure DirectX Raytracing runs without hiccups on their respective architectures falls to them.
Microsoft is, however, giving GPU vendors the means to accelerate DirectX Raytracing on their respective hardware in order to further close the performance gap between ray tracing and rasterization.
So basically, while DirectX Raytracing may not be a new revision of the API, it is most certainly as important as one. More significantly, the adoption of DirectX 12 is still far from complete, and ray tracing will almost certainly take as long, if not longer, to become a prominent feature in games.
Developers like Epic Games, Futuremark, DICE, Unity, and Electronic Arts’ SEED group are already announcing plans to integrate DirectX Raytracing into their engines.
Another noteworthy aspect is that Microsoft hasn't said a word about the use of DirectX Raytracing on the Xbox One, probably because of the console's hardware limitations. DirectX 9 was effectively phased out by the release of the 8th-gen consoles, where DX11/12 became the baseline.
Microsoft could do something similar with DirectX Raytracing, although that’ll probably take a couple of generations to mature.
NVIDIA, on the other hand, has quite a lot to talk about today. They are simultaneously announcing that they will support hardware acceleration of DirectX Raytracing through their new RTX technology. RTX combines the Volta architecture's ray tracing capabilities with optimized software support to provide DXR hardware acceleration, while older cards will have to use the fallback layer.
Meanwhile, AMD has also announced that they are collaborating with Microsoft and will be releasing a driver in the near future that supports DirectX Raytracing. Still, AMD's absence at the event makes DXR look a bit like a GameWorks feature in which AMD has no involvement whatsoever.