NVIDIA may be the dominant force in the graphics card market, but that doesn’t mean the company hasn’t faced its share of criticism for anti-competitive and anti-consumer practices. One of the most prominent complaints concerns NVIDIA’s support, or lack thereof, for older architectures.

Many enthusiasts have claimed that as NVIDIA releases newer hardware, it degrades the performance of older GPUs to make the latest offerings look more competitive and “persuade” consumers to buy them. In this post, we look at the performance of the Kepler-based GeForce GTX 780 in major titles across driver releases spanning the past several years.

These tests were conducted using a GTX 780 running at a constant 1202 MHz, paired with a Haswell Core i7-4770K overclocked to 4.5 GHz.
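
To give a concrete sense of how such figures are typically produced, here is a minimal sketch (not the article's actual tooling, which isn't described) of turning per-run frame-time logs into an average-FPS number for each driver version. The directory layout, file names, and log format (one frame time in milliseconds per line) are assumptions made purely for illustration.

```python
# Hypothetical helper: convert frame-time logs (one ms value per line) into
# average FPS, one log file per driver version. Paths and naming are assumed.
from pathlib import Path
from statistics import mean


def average_fps(log_path: Path) -> float:
    """Average FPS for a log of per-frame times in milliseconds."""
    frame_times_ms = [float(value) for value in log_path.read_text().split()]
    return 1000.0 / mean(frame_times_ms)


if __name__ == "__main__":
    # e.g. logs/crysis3_driver_322.21.txt, logs/crysis3_driver_411.70.txt ...
    for log in sorted(Path("logs").glob("crysis3_driver_*.txt")):
        print(f"{log.stem}: {average_fps(log):.1f} FPS")
```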

Benchmarks

Crysis 3

[Benchmark chart: GTX 780 in Crysis 3 across driver versions]

Crysis 3 is no longer the hardware-killer it was at launch, but for stress-testing and benchmarking it’s still a solid title. Across the driver versions released over the past several years, the GTX 780’s performance doesn’t decrease; in fact, it sees a marginal boost. So we can safely say that performance hasn’t deteriorated, and no gimping can be confirmed in the case of Crysis 3.

The Witcher 3

[Benchmark chart: GTX 780 in The Witcher 3 across driver versions]

When The Witcher 3 released, it was hailed as the Crysis killer, and it’s safe to say that running the game at higher resolutions on the Ultra preset was no mean feat. Although the game is less of a menace on the performance front with Maxwell and Pascal, it still looks really pretty and is a fantastic game to boot. Looking at the results across driver versions, performance has once again improved with each successive update.

Mass Effect Andromeda

[Benchmark chart: GTX 780 in Mass Effect Andromeda across driver versions]

Say what you will about the rest of the game, but Mass Effect Andromeda looked gorgeous on PC, especially at 4K. For the third time, we see that newer drivers improve rather than reduce performance on the five-year-old GTX 780.

Doom

[Benchmark chart: GTX 780 in Doom across driver versions]

Bethesda’s demon-shooter Doom belongs to one of the classic FPS franchises. The latest iteration supports both OpenGL and Vulkan and packs a decent amount of detail. Drivers released before the game launched don’t properly support it, but the three newer ones run it at identical frame rates.

Deus Ex: Mankind Divided

[Benchmark chart: GTX 780 in Deus Ex: Mankind Divided across driver versions]

Deus Ex: Mankind Divided is nearly three years old but remains one of the most detailed games around. The 322.21 driver shows dismal minimum frame rates, but the problem appears to have been fixed in newer releases. Once again, frame rates improve with newer driver releases. No gimping.

Watch Dogs 2

[Benchmark chart: GTX 780 in Watch Dogs 2 across driver versions]

Watch Dogs 2 shows identical performance across the board using the Very High preset at 1080p. Considering that the debut title in the franchise ran quite terribly on the GTX 780 at launch, this is a pleasant surprise.

Rise of the Tomb Raider

[Benchmark chart: GTX 780 in Rise of the Tomb Raider across driver versions]

Rise of the Tomb Raider is one of NVIDIA’s partner titles and features a fair number of GameWorks effects. Considering that, you’d expect performance to improve with driver updates, but unfortunately that’s not the case. The GTX 780 maintains constant performance from driver 322.21 to 411.70; there are minor fluctuations, but they are within the margin of error. Regardless, no performance deterioration.
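
For readers curious what “within the margin of error” means in practice, here is a rough sketch of the kind of sanity check involved: compare repeated benchmark runs on two driver versions and see whether the difference in average FPS is smaller than the run-to-run spread. The function and the sample values below are hypothetical placeholders, not the article's measurements.

```python
# Hypothetical margin-of-error check between two sets of repeated runs.
from statistics import mean, stdev


def within_margin_of_error(runs_a: list[float], runs_b: list[float]) -> bool:
    """True if the mean FPS difference is no larger than the combined run-to-run spread."""
    spread = stdev(runs_a) + stdev(runs_b)
    return abs(mean(runs_a) - mean(runs_b)) <= spread


# Placeholder example: three runs each on an old and a new driver.
old_driver_runs = [52.1, 51.8, 52.4]  # made-up FPS values for illustration
new_driver_runs = [52.6, 51.9, 52.2]  # made-up FPS values for illustration
print(within_margin_of_error(old_driver_runs, new_driver_runs))  # True for these values
```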

Conclusion

So, as you can see, we can safely conclude that NVIDIA doesn’t abandon its older hardware as newer models are released. In fact, in some cases we even saw improvements with each successive driver release. You may notice that newer cards see gains the older architectures lack, but you can’t really blame NVIDIA for focusing on the flagship generation and giving a five-year-old architecture reduced priority. Hence the conclusion: NVIDIA does not gimp or degrade the performance of older, non-flagship generations.



2 COMMENTS

  1. Thanks for this unbiased and highly useful article. I’ve always wondered about this “myth” and you’ve given us a very insightful first step to debunking it. Perhaps you can now test an AMD card to compare the “fine wine” myth to this?

  2. One thing is sure:
    These high-end gaming cards could probably deliver much more performance than they actually do.

    Just look at the pro cards used by graphics professionals: they cost several thousand dollars, they have much more memory… they have 2 or 3 times more components than standard gaming cards…

    But with a SIMPLE DRIVER, NVIDIA manages to turn those graphics monsters, which could easily run games even at 8K resolutions… into dumb cards that can barely run games at 720p, 30 fps, low settings. ‘Because they weren’t built for gaming.’ Seriously?

    Tell me, if there were many people wanting to spend $5,000-10,000 on a pro graphics card in order to have a card 3-4 times more powerful than, e.g., a Titan X, why would NVIDIA not want them to do so?

    Everything is made so gamers always need the latest card in order to run the latest released titles.

    When was the last time a card was released so powerful, with so much extra processing power, that it was able to run all the games released 12 or 18 months later? Never, right…

    The released cards are always ‘just powerful enough’ to run the most recent games at ‘almost’ the best possible quality… without ever being able to run games at, say, max settings and 240 fps. No. Strange, isn’t it?

    You buy the latest card today? You can only run the 6-18-month-old games at max settings, and the most recent games you can run at max settings… but only at 15 or 25 fps. NEVER MORE.

    Take the 2080. If NVIDIA wanted, they could easily add 10 or 20% more components, more ‘graphics units’ and all the parts used to create and handle graphics… and voilà, they could sell a card, even at 100 or 200 extra bucks, that could run ALL RECENT games at max settings, at 90 or 120 fps, and a year later that card would still be able to run ALL released games at max settings and at least 60 fps.

    But no.

    The hardware used, the cards released, are always just powerful enough for the games released 6 to xx months before, and barely powerful enough to run all the new games at max settings. Again, strange, huh?

    No matter which card a gamer might buy today, by the time a new model is released, the card bought today will no longer be able to run a new game at a minimum of 60 fps and very high settings. NO. A new card will be required to run that NEW title at max settings and 45-60 fps.
    Do you really think this isn’t on purpose? Think twice…

    OK, NVIDIA makes a new card and sells it at $1,200. Where is the problem with making a card 20% more powerful (please, don’t talk to me about heat, bla bla) that gamers could buy for $1,400 or $1,500? Apple has no problem, or shame, selling a top-end iPhone at almost $2,000! Millions of gamers spend 600 bucks on a motherboard, 500 bucks on a case, or 1,500 bucks on overclockable RAM just to gain 1% extra processing power. Why wouldn’t they pay 300 or 400 extra bucks, like $1,500 instead of $1,100, for a much better graphics card?

    The market is there, ready to spend 1,500, 2,000, 3,000 bucks on a gaming graphics card…

    It is obvious NVIDIA doesn’t want gamers to keep their graphics cards for more than… a year, or… the exact time it takes until a new model is released.
    NO.
    They want gamers to always NEED some extra power to run that new game they love… at higher settings and framerates.
    No matter which card a gamer may buy, it will ALWAYS be just powerful enough to play recent games at less-than-ideal settings and conditions… so that the QUEST for the BEST graphics and best framerate NEVER ENDS.

    Now, if, with a single ‘rotten’ driver, NVIDIA can turn a monster PRO graphics card, with dozens and dozens of TFLOPS, into a huge turd barely capable of running Quake 3 at 4K,
    WHY,
    tell me,
    WHY,
    wouldn’t they purposely limit and lock their hardware and drivers to force people to upgrade at least every 12 months…?
    Why wouldn’t they do that?
    What if they released a card so powerful that it could still run games at ultra, 60 fps… by 2021?
    Yeah… gamers wouldn’t have to upgrade every year… and that’s a problem.

    That’s why I believe a graphics card could do so much more…

    And all this becomes even more logical when we see how weak consoles are compared to a high-end PC… 8 cores at barely 2 GHz… 5 GB of memory… a super cheap and underclocked graphics chip… and devs manage to run games like God of War, Far Cry 5, etc., with more compressed textures, fewer particle effects, poorer lighting and shadows… but still looking amazing at 30 or 60 fps. Ohhh, the famous ‘code to the metal’…?

    To run a God of War / Shadow of the Tomb Raider PS4 game ON a computer, we’d probably need a 16-core CPU at 5 GHz and 2 or 3 2080 chips…

    Now, if PC devs have been ‘coding to the metal’ for many years, and they are doing their job correctly, why do games require such high specs?

    Because… the power of the graphics chips NVIDIA is selling on PC is obviously being locked and limited. By everything I’ve said above.

    With specs 20-30 times higher than a console’s, games don’t look 20 times better, nor do they run at 240 fps.

    These NVIDIA cards could be delivering at least 30 to 50% more processing power. But no. That’s not what NVIDIA wants. They want just enough power to run older and recent games CLOSE to max settings and 60 fps. CLOSE.
