
GeForce GTX 1180/2080 Speculation thread

I think this is unlikely; otherwise we would have seen benchmarks already.
I think it is basic maths and I am not a conspiracy theorist, but whatever :) more CUDA cores (27%) + faster memory (23%) = faster card.

There was a lower increase in CUDA cores (22%) between the 980 Ti and 1080 Ti and no increase in memory speed, yet we saw significantly more than a 20% improvement in many games - so there is clearly zero evidence to support your theory, and in fact significant evidence to the contrary.
 
You have a link to a tiny sample of games, many of which do not achieve 60fps. The 2080 Ti is expected to be a 40%+ improvement on the 1080 Ti just from the CUDA core increase and GDDR6 alone - probably much more with DLSS support - and it will eat 4K alive if it is.

It's a tiny sample and yet you say "of which many do not achieve 60fps". More in that list do than don't, and of the 4 that don't, one only misses by 1fps. The 2080 Ti is extremely unlikely to be 40% faster than a 1080 Ti: its floating point performance is almost exactly 20% greater, and this is likely to be the biggest performance factor between the two cards. Anyway, my point still stands: unless you're trying to get 120fps+ in 4K, 2080 Ti SLI is overkill, and a single card should be perfect for 60fps 4K gaming in most games.
 

40% of the sample (4 out of 10) did not reach 60fps - that is not a good percentage to support your argument, and as I said above:

"There was a lower increase in CUDA cores (22%) between the 980 Ti and 1080 Ti and no increase in memory speed, yet we saw significantly more than a 20% improvement in many games - so there is clearly zero evidence to support your theory, and in fact significant evidence to the contrary."
 

mic drop
 

It's pointless arguing with someone who is ignoring what I am saying: "unless you're trying to get 120fps+ in 4K, 2080Ti SLI is overkill, and a single card should be perfect for 60fps 4K gaming in most games."

Anyway, the 980 Ti's single precision (FP32) performance was 5.6 TFLOPS, the 1080 Ti's is 11.34 TFLOPS, and the 2080 Ti's is around 14 TFLOPS. The number of CUDA cores on its own isn't relevant; what matters is their overall performance.
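The FP32 figures being thrown around can be sanity-checked from the published core counts: peak FP32 throughput is roughly CUDA cores × 2 FLOPs per clock (fused multiply-add) × clock speed. A minimal sketch, assuming reference boost clocks (partner cards and the Founders Edition clock higher, so these are ballpark numbers — the 5.6 TFLOPS figure quoted above corresponds to the 980 Ti's base clock rather than boost):

```python
# Sanity-check of the FP32 TFLOPS figures quoted in the thread.
# Peak FP32 ~= CUDA cores x 2 ops/clock (FMA) x clock (GHz).
# Clocks are approximate reference boost clocks - an assumption.

def peak_fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput in TFLOPS (2 FLOPs/core/clock via FMA)."""
    return cuda_cores * 2 * clock_ghz / 1000.0

cards = {
    "GTX 980 Ti":  (2816, 1.075),
    "GTX 1080 Ti": (3584, 1.582),
    "RTX 2080 Ti": (4352, 1.545),  # Founders Edition boosts higher (~1.635 GHz)
}

tflops = {name: peak_fp32_tflops(cores, ghz) for name, (cores, ghz) in cards.items()}
for name, t in tflops.items():
    print(f"{name}: {t:.2f} TFLOPS")

# Generational uplift on paper, 1080 Ti -> 2080 Ti: just under 20%,
# which is the "almost exactly 20% greater" claim above.
uplift = tflops["RTX 2080 Ti"] / tflops["GTX 1080 Ti"] - 1
print(f"1080 Ti -> 2080 Ti: {uplift:+.1%}")
```

The 1080 Ti works out to ~11.34 TFLOPS, matching the figure in the post; the 2080 Ti lands at ~13.4 TFLOPS at reference clocks (~14.2 at Founders Edition clocks), so whether you see "14 TFLOPS" depends on which clock you assume.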
 
https://youtu.be/auogXMCH4q8

Then why all the shady business?
It isn't shady - you only have to read the amount of (quite frankly) hateful and unfounded BS in this thread to understand their position. There are a lot of people who simply want Nvidia to fail, and the GPU review market has a looooooong history of false information and cherry picking (on both sides) - they are just protecting their business.
 

But Pascal's GPU clock is significantly higher than Maxwell's. We are not seeing that here with Turing.
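That clock difference can be put in numbers. A quick sketch, assuming reference boost clocks (an assumption; partner cards differ): Maxwell to Pascal brought a large clock jump, while Pascal to Turing actually regressed slightly at reference clocks.

```python
# Reference boost clocks in GHz (approximate - partner cards vary).
clocks = {"980 Ti": 1.075, "1080 Ti": 1.582, "2080 Ti": 1.545}

maxwell_to_pascal = clocks["1080 Ti"] / clocks["980 Ti"] - 1   # large jump
pascal_to_turing  = clocks["2080 Ti"] / clocks["1080 Ti"] - 1  # slight regression

print(f"980 Ti -> 1080 Ti clock: {maxwell_to_pascal:+.1%}")
print(f"1080 Ti -> 2080 Ti clock: {pascal_to_turing:+.1%}")
```

At these clocks the Maxwell-to-Pascal jump is around +47%, while Pascal-to-Turing is around -2%, which is why the 980 Ti to 1080 Ti core-count comparison doesn't transfer cleanly to Turing.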
 

You seem to put a lot of faith in this DLSS technology. As far as we can tell from the information out there, it's basically "fake 4K", more akin to checkerboard rendering on consoles: it renders at a lower-than-4K resolution, then gets upscaled and anti-aliased to 4K. So of course it will allow better frame rates at 4K output, because it isn't actually rendering at 4K but at a lower resolution.
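The frame-rate headroom from that approach is mostly down to pixel count, since shading cost scales roughly with the number of pixels rendered. A minimal sketch — the 1440p internal resolution here is purely illustrative, since DLSS's actual internal resolution wasn't public:

```python
# Shading cost scales roughly with pixel count, so rendering internally
# below 4K and upscaling frees a large fraction of the frame budget.
# The 1440p internal resolution is an illustrative assumption.

def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)  # full UHD pixel count
internal  = pixels(2560, 1440)  # hypothetical internal render resolution

ratio = internal / native_4k
print(f"Internal render shades {ratio:.0%} of native 4K's pixels")
```

At 1440p the GPU shades well under half the pixels of native 4K, which is where the "better frame rates at 4K output" come from before any reconstruction quality is considered.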
 
Why is it shady? NDAs are common the world over for product reviewers, and I would also want my product to be reviewed fairly, with the latest drivers, and not by some random whom I haven't approved using older drivers. If reviewers are worried about signing it, they don't have to, and they can buy the product like the rest of us!

People have to remember that RT/DLSS is new and most likely not running as efficiently as it should - or at all - on older drivers, so Nvidia want the reviewers doing it correctly. I don't see it as shady but as a sensible move.
 
I'm wondering if they're holding back benchmarks with these NDAs because Nvidia themselves know the cards are a disappointment.

They're just giving themselves a buffer period to let the stupid peeps drop insane amounts of money on a GPU with absolutely no idea how good/bad they are before letting the negative benchmarks come out/sales plummet :D

Probably wrong, but if a company has truly made something they're proud of, especially in an industry where numbers are everything (fps, resolution etc.), why would they not share a single thing about them? It's all getting a bit PR-stunt-ish!
 
If it's better, it's better, simple as that. Give it to the same reviewers as normal and watch it win against the old generation. It's a simple process of making the new card perform better than the last, so that no matter who tests and reviews it, they can clearly see it's better and worth the money it is being sold for. When did this simple process stop being a thing?! It's bullcr*p and you know it; all this "control" stinks of making the card appear in a certain light. Any GPU worth 1500 quid in this day and age shouldn't need a particular light to be exposed in - it should just be good enough in itself to say "buy me, I'm worth an upgrade from your last GPU".
 
Solid points, but I do feel that Nvidia have not really had time to finalise drivers, so that is probably the reason for the delay. Like I said, I would want my product to be shown as it should be, not performing worse because drivers are not ready. Of course I could be wrong, but looking at the big picture this makes sense, at least to me.
 

Really? Weren't these cards ready about 8 months ago, just sitting on a shelf waiting for the cryptocurrency frenzy to die down before releasing?

Let's face it, why would they release a new version when they're still raking in profits from their older gen? Makes no sense. Personally, I believe they've had ****** ages to sort drivers for these.
 