NVIDIA ‘Ampere’ 8nm Graphics Cards

Current reports are 50% performance per watt improvement for RDNA2 over RDNA in gaming workloads.
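To put that figure in context, here's a quick back-of-the-envelope sketch. The 50% figure is from the reported slides; the 300 W example wattage is a made-up illustrative number, not an actual RDNA spec:

```python
# What a "50% performance-per-watt" gain means in practice.
# The 300 W baseline is a hypothetical figure for illustration only.
perf_per_watt_gain = 1.5
old_power = 300  # watts, hypothetical RDNA card

# Option 1: same performance at lower power
new_power_same_perf = old_power / perf_per_watt_gain
print(f"Same performance at {new_power_same_perf:.0f} W")  # 200 W

# Option 2: same power budget, more performance
print(f"Or {perf_per_watt_gain:.1f}x the performance at {old_power} W")
```

In practice vendors usually spend the headroom on a mix of both, which is why "50% perf/watt" rarely translates into 50% more FPS at the same board power.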

That's from AMD's own slides, and vendor slides are hardly realistic in the real world.

From memory, 7nm wasn't a huge improvement in performance per watt over 12nm; rather, it targeted more tightly packed transistors. I can't find a reference for this, though...

You are confusing 7/12 with 12/16, as TSMC's 12nm is a sub-node of 16nm.

No doubt Ampere is going to be a huge leap as well! Excited to see what the results are!

I hope so! New node and new architecture should be a big jump
 
I've been out of the loop for a while. How many RT cores would we need on current architecture to fully ray trace a modern game like BF5 at 4k? I'm guessing it would be in the tens of thousands?

To do traditional-style full-scene, full-screen ray tracing at 60 FPS 4K in a game like BF5, you are talking an order of magnitude more performance than the 2080 Ti. With a decent path-tracing and de-noising implementation, especially paired with a good DLSS implementation, it's quite a bit less than that. People keep ignoring that the 2080 Ti can do almost 100 FPS in Quake 2 RTX at 1080p and nearly 30 FPS at 4K, and that isn't going to change hugely with polygon scene complexity, as is the nature of path tracing - you can bump the level of geometric detail up ~64x and only take a ~3% framerate hit (though some other elements of scene complexity, such as transparent materials, will have more of an impact).
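The "geometry is nearly free" point can be sketched with a toy model (my own assumption, not something from the post): if BVH traversal cost per ray grows roughly with log2 of the triangle count, then 64x more triangles only adds log2(64) = 6 levels of tree depth:

```python
import math

# Toy model (an assumption, not a measurement): per-ray BVH traversal
# cost grows roughly with log2 of the triangle count.
base_tris = 1_000_000
depth_before = math.log2(base_tris)        # ~19.9 levels of tree depth
depth_after = math.log2(base_tris * 64)    # ~25.9 levels

extra_work = depth_after / depth_before - 1
print(f"64x geometry adds {depth_after - depth_before:.0f} BVH levels, "
      f"~{extra_work:.0%} more traversal work per ray")
```

In this toy model traversal gets ~30% more expensive per ray; if traversal is only a small slice of the frame time (with shading and de-noising dominating), the overall framerate hit ends up in the low single digits, which is broadly consistent with the ~3% figure quoted above.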
 
To do traditional-style full-scene, full-screen ray tracing at 60 FPS 4K in a game like BF5, you are talking an order of magnitude more performance than the 2080 Ti

If Quake is anything to go by, you only need a little over double the performance. I got 28 FPS in Quake II RTX at 4K.
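The arithmetic behind that estimate, using the 28 FPS figure from the post and a 60 FPS target:

```python
# Assuming path-tracing cost scales roughly linearly with frame rate
# at a fixed resolution, how much faster a GPU would be needed for 60 FPS.
measured_fps = 28   # Quake II RTX at 4K on a 2080 Ti (figure from the post)
target_fps = 60

required_speedup = target_fps / measured_fps
print(f"Required speedup: {required_speedup:.2f}x")  # 2.14x
```

So "a little over double" checks out for this title, versus the order-of-magnitude estimate for brute-force full-scene ray tracing of a far more complex game.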
 
Seems fake to me. That is a huge difference between the 3070 and 3080Ti. Unless they intend to price the 3070 at $399 or something to compete with what AMD are bringing and know there will be no competition for their 3080Ti which they will charge $1499 lol :p

Seems they have a lot more RT cores which I have always said the 3000 series would have :D

I tend to agree. GA100 for a 3080 Ti doesn't seem right either, but that could mean anything, who knows; either way it's not helpful information, IMHO.
 
I did. I wasn't willing to pay the 2080 Ti price to upgrade my now three-year-old 1080 Ti, and there weren't any other options, so I bought nothing.

Same here. Even if the 2080 Ti released at the same price as the 1080 Ti, 30% improvement would have been "nice" but not "OMG I must have it!".

I doubt Nvidia monitors this forum, but I'm sure they have bean counters that have told them what many of us are doing. We should know if they got the message by the end of the year.
 
I meant GloFo 12nm to TSMC 7nm, meaning to suggest that the performance gains moving from GCN to RDNA were largely due to architectural improvements and not process improvements. But yeah, that wasn't clear...

The 'GloFo' 12nm you mentioned is just a tweaked version of GloFo's 14nm, which was itself licensed from Samsung.

There were definitely some gains from the node change - just look at the Radeon VII, which was a die shrink of Vega 64 from GloFo 14nm to TSMC 7nm - but in that case they spent most of the direct power savings on higher clocks.
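That trade-off can be sketched with the standard dynamic-power relation P ∝ C·V²·f. The capacitance and clock numbers below are illustrative guesses to show the shape of the trade, not AMD's actual figures:

```python
# Toy dynamic-power model: P is proportional to C * V^2 * f.
# All numbers are illustrative assumptions, not measured values.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts**2 * freq_ghz

vega64 = dynamic_power(cap=1.00, volts=1.20, freq_ghz=1.55)
# Node shrink: say ~30% less switched capacitance at the same voltage,
# with the headroom spent on higher clocks instead of lower power:
radeon7 = dynamic_power(cap=0.70, volts=1.20, freq_ghz=1.80)

print(f"Relative power: {radeon7 / vega64:.2f}x at "
      f"{1.80 / 1.55:.0%} of the original clock")
```

Spend the shrink entirely on power and you'd get a much cooler card at the same performance; spend it on clocks, as with the Radeon VII, and power barely moves while performance goes up.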
 
Hearing of an upcoming fire sale of RTX 2xxx products to clear the channel? Hearing of more than a £150 price cut on Asus, Gigabyte and MSI products.
So it will go back to the same prices from Black Friday then? Lol.

Would be mad to upgrade now unless one has to, when the new tech is likely just a few months away now.

Hopefully the RTX 3070 does have 16gb. That would be cool.
 
Well, someone might be - and a £500 RTX 2080 could tempt someone. I do hope AMD have high-end competition to stop Nvidia taking the **** all over again.
I would not even buy that. The RTX 3070 will likely have better RT performance than a 2080 Ti and will also benefit from continuing driver performance gains, whereas Turing will take a back seat just like Pascal did.
 
If they really want RTX to succeed, they're really going to have to ramp up the performance big time instead of the drip-fed tech of the past; otherwise no one will use it when these games tank frames.
My new pc is just waiting on a 3080ti.
 