Okay, so this is a bit of a surprise really. Folks over at 3dcenter have discovered that they can run Battlefield V with raytracing enabled on the NVIDIA Titan V. Now, I know what you're thinking: that card is an abomination and shouldn't even exist because of its $3,000 price point. But here's the thing: the Titan V lacks the RT Cores which, as per NVIDIA, make the RTX cards raytracing-worthy, yet it's still able to produce much the same raytracing performance as the RTX cards. How?
That's a complicated question. We already know that the RT Cores are raytracing accelerators and, aided by the Tensor Cores, they make the revered technology possible. But if you compare benchmarks of the Titan V and the RTX 2080 Ti, the difference is barely noticeable. One user reported 69 fps at 4K resolution with Ultra settings, and another claimed 60-100 fps at 1440p Ultra, with 80 fps as the average. That's a mere couple of frames off what the RTX 2080 Ti manages.
This really calls into question the actual contribution of the RT Cores and how much of an effect they have on raytracing performance. It's true that raytracing can be done without dedicated hardware, using a software fallback, but you normally need an insane amount of processing power for that. The Titan V is based on Volta, the datacenter-focused architecture that preceded Turing, so it's possible some of the groundwork for RTX is already present there, and perhaps the Tensor Cores smooth things over.
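To see why raytracing doesn't strictly require special hardware, here's a minimal sketch of the core math: a ray-sphere intersection test, the kind of calculation a raytracer runs millions of times per frame. It's plain arithmetic any processor can do; RT Cores just do this class of work (intersection tests plus acceleration-structure traversal) vastly faster. The function names here are illustrative, not from any real renderer.

```python
def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss.

    origin/direction/center are 3-tuples. Solves the quadratic
    |origin + t*direction - center|^2 = radius^2 for t.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - disc ** 0.5) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None

# A ray fired straight down -z hits a unit sphere centered at (0, 0, -5):
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Multiply this by every pixel, every bounce, and every triangle in a scene, and you can see why doing it all on general-purpose cores eats an enormous amount of compute.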
But once again, do we really need RT Cores then? And how much will raytracing performance vary without them across the various GeForce RTX cards, especially the 2070 and the upcoming RTX 2060? Lastly, it's entirely possible that higher-end cards like the 2080 Ti and Titan RTX don't really need the RT Cores to run raytracing at reasonable frame rates. That would mean NVIDIA is charging a premium for something that isn't even properly utilized. And yes, this hasn't been confirmed yet, so don't call your lawyer just yet. We'll let you know as we find out more about this. Cheers!