The world’s first fully ray-traced game is here. The project is called Q2VKPT, it is completely open source, and it is freely available for download on GitHub. This research project is built on the Quake II engine, which it uses to create fully dynamic real-time lighting.
Q2VKPT is powered by the Vulkan API. Thanks to the hardware ray-tracing support of the new RTX cards, the game performs ray tracing at respectable speeds, reaching 60 FPS at 1440p on an RTX 2080 Ti. The project consists of roughly 12,000 lines of code that completely replace the graphics code of Quake II. Ray tracing of this kind, previously feasible only in offline film rendering, is now a realistic future for video games.
Here are some excerpts from the project’s Q&A:
Quake II is ancient! If these techniques have any future, it should run at 6000 FPS by now!
While it is true that Quake II is a relatively old game with rather low geometric complexity, the limiting factor of path tracing is not primarily raytracing or geometric complexity. In fact, the current prototype could trace many more rays without a notable change in frame rate. The computational cost of the techniques used in the Q2VKPT prototype mainly depends on the number of (indirect) light scattering computations and the number of light sources. Quake II was already designed with many light sources when it was first released; in that sense it is still quite a modern game. Also, the number of light scattering events does not depend on scene complexity. It is therefore conceivable that the techniques we use could scale up well to more recent games.
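As a rough illustration of this point (this is not Q2VKPT’s actual cost model; the function and its parameters are invented for this sketch), the per-pixel ray budget of a path tracer that samples light sources at each bounce depends on the number of scattering events and sampled lights, while the scene’s triangle count never enters the formula:

```python
def rays_per_pixel(bounces: int, shadow_rays_per_bounce: int) -> int:
    """Per-pixel ray count: one scatter ray per bounce plus shadow rays
    toward sampled light sources. Scene triangle count never appears."""
    return bounces * (1 + shadow_rays_per_bounce)

# Doubling geometric complexity changes nothing here; adding bounces or
# sampling more lights per bounce does.
print(rays_per_pixel(bounces=2, shadow_rays_per_bounce=1))  # prints 4
```

Geometry still affects the cost of each individual ray query, but with hardware acceleration that cost grows slowly, which is why the scattering and light-sampling terms dominate.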
Why Quake II?
Since Quake II is open source and has a long-standing modding tradition, it is a great sandbox for putting academic research to the test in the real world. In particular, the game has fast-paced action and is played competitively, setting high standards for the performance and robustness of any implemented rendering technique. Finally, Quake II is in some sense still quite a modern game, since it already shipped with complex and artistic light design when it was first released.
How is path tracing different from raytracing?
Path tracing is an elegant algorithm that can simulate many of the complex ways light travels and scatters in virtual scenes. Its physically based simulation of light allows highly realistic rendering. Path tracing uses raytracing to determine visibility between scattering events. Raytracing, however, is merely a primitive operation that can be used for many things, so raytracing alone does not automatically produce realistic images; full light transport algorithms like path tracing are needed for that. Yet while elegant and very powerful, naive path tracing is very costly and takes a long time to produce stable images. This project therefore uses a smart adaptive filter that re-uses as much information as possible across many frames and pixels in order to produce robust and stable images.
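To make the distinction concrete, here is a minimal toy sketch in Python (not the actual C/Vulkan code of Q2VKPT): path tracing repeatedly calls a raytracing primitive to resolve visibility at each scattering event, and a simple temporal filter reuses earlier frames to stabilize the noisy per-frame estimates. The one-dimensional “furnace”-style scene, its constants, and the exponential moving average are stand-ins for the real renderer’s geometry and adaptive filter.

```python
import random

# Toy scene: every ray either escapes to a bright sky or hits a gray
# surface and scatters again. All constants are assumptions for this
# sketch, not values taken from Q2VKPT.
SKY_RADIANCE = 1.0
ALBEDO = 0.5      # fraction of light a surface reflects per bounce
P_ESCAPE = 0.3    # probability that a ray reaches the sky

def trace_ray(rng: random.Random) -> str:
    """Raytracing as a primitive operation: report what the ray hits."""
    return "sky" if rng.random() < P_ESCAPE else "surface"

def path_trace(rng: random.Random, max_bounces: int = 16) -> float:
    """Naive path tracing: follow one random scatter path, calling
    trace_ray() only to resolve visibility at each scattering event."""
    throughput, radiance = 1.0, 0.0
    for _ in range(max_bounces):
        if trace_ray(rng) == "sky":
            radiance += throughput * SKY_RADIANCE  # light reaches the camera
            break
        throughput *= ALBEDO  # surface absorbs some light; scatter again
    return radiance

def temporal_filter(frames, alpha: float = 0.02) -> float:
    """Stand-in for the adaptive filter: an exponential moving average
    that reuses information from earlier frames to suppress noise."""
    acc = frames[0]
    for f in frames[1:]:
        acc = (1.0 - alpha) * acc + alpha * f
    return acc

rng = random.Random(1)
frames = [path_trace(rng) for _ in range(2000)]  # one noisy sample per frame
filtered = temporal_filter(frames)

# Closed-form expected radiance for this toy scene:
# E = sum_k P_ESCAPE * (1 - P_ESCAPE)^k * ALBEDO^k = P_ESCAPE / (1 - (1 - P_ESCAPE) * ALBEDO)
expected = P_ESCAPE / (1.0 - (1.0 - P_ESCAPE) * ALBEDO)
```

Individual frames in this sketch are extremely noisy (a single path returns 1.0, 0.5, 0.25, … or 0), yet the filtered value settles near the analytic expectation, which is the basic idea behind reusing information across frames.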
Are path tracing and raytracing the future of game graphics?
The recent release of GPUs with raytracing capabilities has opened up entirely new possibilities for the future of game graphics, yet making good use of raytracing is non-trivial. The purpose of this project is to find out exactly what is still missing for a clear pathway toward a raytraced future of game graphics. While some problems have already been addressed by academic research, many real-world problems go unnoticed until one actually tries to implement a full game renderer. In the future we plan to look into some of these issues, such as better light sampling, better filtering, and a more consistent renderer software architecture. Good solutions to the new and different issues of raytracing- and path-tracing-based renderers will be necessary for this change to happen on a broad scale.