
NVIDIA ‘Ampere’ 8nm Graphics Cards

In a mini-ITX case with limited case cooling and 350W board power??

Part of the FE cooler vents back into the CPU compartment, and it's hard to say if the RTX 3090 will fit in the case.
He's not using a mini-ITX case, he's using the Phanteks Enthoo Luxe Glass midi tower case, unless it's changed and I missed that post?
 
I've come to the realisation that, with how taxing RT actually is, the consoles will be doing some sort of cut-down version of it, considering the chips inside them will not be the full-fat chips we will have running in PCs.

A 2080 Ti, for example, struggles with Watch Dogs: Legion at 1080p. We think a console will run it at 4K with RT in full glory?

That also makes me think that perhaps a 3070 isn't as good as it looks. Yes... wow, 2080 Ti performance for under £500. But if it has the same performance as a 2080 Ti, and a 2080 Ti struggles with Watch Dogs: Legion with RT at 1080p, then what of the 3070?

I based that opinion on this:

 
I've come to the realisation that, with how taxing RT actually is, the consoles will be doing some sort of cut-down version of it, considering the chips inside them will not be the full-fat chips we will have running in PCs.

A 2080 Ti, for example, struggles with Watch Dogs: Legion at 1080p. We think a console will run it at 4K with RT in full glory?

That also makes me think that perhaps a 3070 isn't as good as it looks. Yes... wow, 2080 Ti performance for under £500. But if it has the same performance as a 2080 Ti, and a 2080 Ti struggles with Watch Dogs: Legion with RT at 1080p, then what of the 3070?

I based that opinion on this:

But you're also basing this opinion on a preview build five months before release (and that build could have been another 1-2 months older than that).
 
On Tensor cores: https://www.youtube.com/watch?v=yyR0ZoCeBO8

You need to have them because real-time ray tracing is so taxing on hardware that general-purpose hardware cannot do ray tracing in real time; it's way too slow.

Tensor cores are not used by Nvidia for ray tracing. They are used for machine-learning tasks, like RTX Voice and DLSS.

In Nvidia's original documentation, tensor cores were going to be used for denoising, but that was never implemented in Turing. We will have to wait and see whether they got it working on the new generation of cards.
 
I've just played my Doom Eternal and I get 135 to 140 fps on my system at 4K max, so the 2080 vs 3080 comparison was gimped!
 
I've just played my Doom Eternal and I get 135 to 140 fps on my system at 4K max, so the 2080 vs 3080 comparison was gimped!
Man, are you on a 2080 Ti?? Because on my Pascal Titan I'm hovering around 65-70 with everything maxed and no dynamic resolution scaling, of course. Would a 2080 Ti be 200% faster than a water-cooled Pascal Titan in Doom??
 
I've come to the realisation that, with how taxing RT actually is, the consoles will be doing some sort of cut-down version of it, considering the chips inside them will not be the full-fat chips we will have running in PCs.

A 2080 Ti, for example, struggles with Watch Dogs: Legion at 1080p. We think a console will run it at 4K with RT in full glory?

That also makes me think that perhaps a 3070 isn't as good as it looks. Yes... wow, 2080 Ti performance for under £500. But if it has the same performance as a 2080 Ti, and a 2080 Ti struggles with Watch Dogs: Legion with RT at 1080p, then what of the 3070?

I based that opinion on this:

I thought the new-gen consoles' GPU was roughly 2080 Super performance? Well, it's what I have read many times, and it keeps popping up.
 
Tensor cores are not used by Nvidia for ray tracing. They are used for machine-learning tasks, like RTX Voice and DLSS.

In Nvidia's original documentation, tensor cores were going to be used for denoising, but that was never implemented in Turing. We will have to wait and see whether they got it working on the new generation of cards.

I know that. But you need DLSS, or some kind of intelligent upscaling equivalent, to use ray tracing, because you cannot do ray tracing at high resolutions; it's too taxing. This is fundamentally why Nvidia invested in DLSS in the first place: they knew they'd need high-quality upscaling for the next 4-5 generations of video cards before we have the raw power to do ray tracing at full screen resolution in real time.
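As a rough sketch of why upscaling matters so much here: ray-tracing cost scales roughly with the number of pixels actually rendered, and DLSS Performance mode on a 4K target renders internally at 1080p (the 2x-per-axis scale factor is the commonly quoted figure, assumed here):

```python
# Rough sketch: ray-tracing cost scales roughly with rendered pixels.
# DLSS Performance mode on a 4K output renders internally at 1080p
# (assumed 2x-per-axis upscale factor, as commonly quoted).
native_4k = 3840 * 2160        # 8,294,400 pixels
internal_1080p = 1920 * 1080   # 2,073,600 pixels
print(native_4k // internal_1080p)  # 4 -> roughly a quarter of the rays per frame
```

So the GPU only has to trace about a quarter of the rays it would at native 4K, which is what makes RT playable at all.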
 
Read somewhere earlier that the 3080 is 60% better than the 2080S.

Assuming the 2080 Ti is 20% faster than the 2080 Super, that works out to the 3080 being about 33% faster than the 2080 Ti (1.60 / 1.20 ≈ 1.33).

Real world, I wouldn't be surprised if the actual figure is circa 30%.
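The percentage maths here is easy to get wrong because the two figures use different baselines; a quick sketch (both gaps are the figures assumed above, not measurements):

```python
# Relative performance with the 2080 Super as the 1.0 baseline.
# Both gaps are assumed figures from the discussion, not measurements.
perf_3080 = 1.60     # "3080 is 60% better than the 2080S"
perf_2080_ti = 1.20  # "2080 Ti is 20% faster than the 2080 Super"

gain_over_ti = perf_3080 / perf_2080_ti - 1
print(f"{gain_over_ti:.0%}")  # 33% - divide the ratios, don't subtract the percentages
```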

Nvidia are clearly leaving a gap in the market for the 3070 Ti and 3080 Ti. Think I'll hold onto my 2080 Ti until then.
 
People keep forgetting pixel density, apparently, so those with large 4K TVs have the same or lower PPI than those with smaller 1440p monitors.

I've seen 1440p and 4K screens side by side. The pixel density of 1440p is not razor sharp like 4K; e.g., my ultrawide AW3418DW and my old CRG9 were not super sharp.
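For anyone who wants to check the PPI point, the standard formula is diagonal pixel count divided by diagonal size in inches (the screen sizes below are illustrative, not anyone's actual setup):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative sizes: a 27" 1440p monitor vs a 55" 4K TV.
print(round(ppi(2560, 1440, 27)))   # 109 PPI
print(round(ppi(3840, 2160, 55)))   # 80 PPI
```

So a big 4K TV can genuinely be less dense than a mid-size 1440p monitor, which is the point being made above.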
 