With NVIDIA’s latest generation of graphics cards less than a month away, benchmarks and other leaks have started showing up online. One such leak is an image of GTX 1170 benchmark scores posted on this forum.

The image shows a 3DMark result for the GTX 1170 with an astounding 16GB of VRAM. The GPU clock speed is 2,512MHz, while the memory clock sits at 2,552MHz. The score of 22,989 is even higher than that of a 1080 Ti, which is very suspicious, since the move from TSMC’s 16nm node to its 12nm node does not bring a significant bump in performance.

Another suspicious aspect of these images is that the benchmark windows conceal some details. We also need to consider that the 1170 is a mid-range card, and given how expensive VRAM is, 16GB doesn’t quite add up.

Maybe it’s a bit too soon for plausible leaks, but if there’s even the slightest chance that these are genuine, it would be a massive game changer for the entire industry.


2 COMMENTS

  1. I stopped buying these high-end cards many, many years ago. Fortunately, I was clever enough to see what this all means.

    It’s a never-ending race, a quest for that extra bit of power… and a month after the purchase, newly released games can’t be run at ultra settings with all sliders at 100%, because the drivers aren’t mature enough, the engine is badly optimized, etc. etc.

    Last decade, gamers already had $2,000, $3,000, even $6,000 rigs, with expensive CPUs with many, many cores… the latest DirectX, the latest everything… yet most games, if not ALL, were only running on a single giant thread, and 80% of the CPU wasn’t used!

    I’m sure even the most recent games still aren’t optimized to use 100% of the cores and memory.

    ‘Oh, you know, dumbass, games barely use the CPU, everything is running on the GPU’…
    … yes, why?… maybe because developers don’t want to spend time writing code to run on the CPU…
    If they were doing their job, any game would be able to use all the GPU power, as well as the processing power of that big 16-core/4GHz CPU.

    And why is that?
    Because all the studios work hand in hand with NVIDIA and AMD, and code their games so they ONLY use the GPU’s power.

    HEY, if a gamer could get 60fps and very high graphics running ONLY on the CPU, they would probably just buy a $50-$100 card to connect the system to the monitor.

    But by making a game 100% GPU-dependent, gamers are forced to buy a CPU AND a powerful graphics card. Business, as always.

    It’s all about business.
    Just like NVIDIA and AMD could sell a $20k card, 30 times more powerful than a Titan X… to BE USED by 3D pros… but that card can’t run a game of Pong at 800×600! Why? Because they lock all the gaming features through the drivers.

    A gaming card can’t run 3ds Max and the like well… and a pro 3D card can’t do 3D gaming!
    Yet probably 99.9% of the components are the same.
    Fuk these companies, seriously.
    A guy will spend 5,000 bucks on a triple Titan X setup… but if he tries to do 3D imaging, he won’t even get 5% of the performance of a $1,500 pro card…

    Long comment, but really, there’s no way I will give them my money. They don’t deserve it, and I hate it when someone tries to fool and rob me.

    I’d bet all these expensive graphics cards aren’t even outputting 20% of what they can really do! NVIDIA, AMD, DX12… they make unoptimized drivers, so these cards are never ‘powerful enough’.

    Because hey, if a guy could spend 2,000 bucks on a card and run any game until 2022 at 4K, ultra settings, well, he wouldn’t need to buy a new card… and another one… and another one…

    Devs, manufacturers, driver makers, game engines like Unreal Engine, DX12… NOBODY is REALLY fully optimizing their stuff! If the same level of optimization used on consoles were applied to PCs, a $300 card with enough memory could probably run a modern game today at triple-4K/60fps/ultra settings!
