NVIDIA ‘Ampere’ 8nm Graphics Cards

As for performance, on average it's 5% slower than the 3090, attributable to slightly less memory bandwidth and a slightly lower effective clock speed due to its lower power limit.
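A crude back-of-envelope sketch of that reasoning. The two ratios below are illustrative assumptions on my part, not confirmed specs, and stacking them multiplicatively is only a pessimistic first approximation:

```python
# Back-of-envelope only: the ratios are illustrative assumptions, NOT confirmed specs.
# Stacking the two deficits multiplicatively gives a pessimistic estimate of the gap
# for workloads sensitive to both memory bandwidth and clock speed.
bandwidth_ratio = 0.97   # assume ~3% less memory bandwidth than the 3090
clock_ratio = 0.98       # assume ~2% lower sustained clock from the tighter power limit

estimated_perf_vs_3090 = bandwidth_ratio * clock_ratio
print(f"Estimated performance vs 3090: {estimated_perf_vs_3090:.1%}")
# -> 95.1%, i.e. roughly the ~5% deficit mentioned above
```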

Will it be any cheaper than the 3090, though? If supply is low and demand is high, I could see it approaching 3090 prices.
 
Demand is at such a fever pitch that if a 3080 Ti is released, it will surely go for hugely inflated prices.

Then I, for one, will leave it. My 2080 Ti is perfectly good enough for now and will keep me going until the 40xx or 7xxx series arrive. I suspect many people will do the same.

Actually, there's a modest chance my next discrete GPU will be from Intel. If the leaks are true then Intel could have a very competitive product. It just comes down to price.
 
In my opinion, though, a 3080 is obviously capable of 4K, but really it's a 1440p GPU.

Why? Well, in the land of next-gen titles, a 3080 in Watch Dogs Legion, Cyberpunk 2077 or any other RT title will only get 60-80 fps with Ultra settings.

And that's the default, game-defined Ultra; I don't mean the extra little bits you can turn on in the options menu.

For me, when I play a game I am happy with the game's definition of Ultra. I generally won't push it beyond that.

And at 1440p at Ultra in RT titles, a 3080 will get around 60-80 fps depending on the DLSS quality setting.
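For anyone wondering why the DLSS quality setting swings the numbers so much: each mode renders internally at a fraction of the output resolution before upscaling. A quick sketch using the commonly quoted DLSS 2.x per-axis scale factors (individual games can deviate):

```python
# Commonly quoted DLSS 2.x per-axis scale factors; actual behaviour can vary per title.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(f"1440p output, {mode}: renders at {internal_resolution(2560, 1440, mode)}")
# Quality ~1707x960, Balanced ~1485x835, Performance 1280x720, Ultra Performance ~853x480
```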

So I don't regard it as a 4K GPU at all.

Also, from benchmarks I've seen, a 3080 in Valhalla at 1440p, which is a non-RT title, only gets around 80 fps.

---

I see that a lot of the 4K RT specs for the new titles are for 4K 30 fps.

But who on PC wants to play at 30fps!
 
Hmm, most graphics options in games are there to make potato resolutions look better. You can turn off many post-processing options at 4K. If you go into a game and just set the Ultra default, you are doing it wrong. You tweak your CPU, memory and GPU, but then use the default Ultra preset in a game? You'll get way more performance by removing the hits from rubbish like motion blur and depth of field. Maybe you like the end result, but much of the post-processing needs to be off at 4K. :)
 
Also, ideally at high FPS on a high-refresh-rate LCD, because, well... it is LCD at the end of the day, so poop.

60 Hz on an LCD does not match the fluidity of a CRT/plasma, and to be fair neither does 240 Hz.
 
Depends on what you class as acceptable. 30 fps is playable in games that are not fast-paced, like some RPGs. To state that 4K 30 fps is not a PC gaming experience is strange, as you can (precisely because you're on the PC platform) tinker with the settings, say upscale from 2K with some settings toned down, and get really good quality while bumping the fps up.
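As a rough sketch of why that works (reading "2K" as a 1440p internal render here, which is an assumption on my part): if you're GPU-bound, frame rate scales very roughly with the number of pixels rendered, and native 4K pushes a lot more of them.

```python
# Pixel counts only; real scaling varies per game and per bottleneck.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} MP  (native 4K renders {pixels['4K'] / count:.2f}x as many pixels)")
# 4K is 2.25x the pixels of 1440p and 4x those of 1080p, which is the headroom
# an upscaled, toned-down render is cashing in on.
```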
 

Yeah, I know what you mean. But you only start making compromises when your hardware starts to age.

That's just me though.
 
Also, let's take the latest RT title, The Medium.

Taken from here:

"With DLSS 2.0, our RTX3080 was able to offer constant 60fps at both 1080p and 1440p."

So again, at 4k it will probably be around the 30 fps mark.
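Filling in the arithmetic behind that guess: if the game stays GPU-bound and fps scales roughly inversely with rendered pixels (a crude first approximation, not a benchmark), 60 fps at 1440p extrapolates to the high 20s at 4K.

```python
# Crude extrapolation, assuming a GPU-bound game and fps inversely proportional to pixel count.
fps_1440p = 60
pixels_1440p = 2560 * 1440
pixels_4k = 3840 * 2160

fps_4k_estimate = fps_1440p * pixels_1440p / pixels_4k
print(f"Estimated native 4K fps: {fps_4k_estimate:.0f}")  # ~27 fps before DLSS or settings tweaks
```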

The NEXT GEN titles render the 3080 a 1440p card.

Unless one is happy with 30-40 fps performance that is.

Which, don't get me wrong, is completely playable, especially with a G-Sync/FreeSync display.
 
The 3080 was always on a thin line of reasonable due to the lack of VRAM; what I mean is that at 4K you're likely wanting as much memory as possible, especially since you mention 'hardware starts to age'. OK, you will likely get away with it for the next year, but I am waiting for this new garlic bread of SAM/Resizable BAR to see if it can be a worthy 'free' feature, yet on a 3080 it's pretty lacklustre as it doesn't have much of a pool to share.
 
But I've just told you the reasons why.

How is it wrong?

4k 30fps on the PC is not a PC gaming experience.

That's for consoles.

That's your choice to enable RT, a technology still in its infancy.

4K 60+ is all that's required to be considered a real "4K card", and that's possible on the RX 6800 in most games, let alone a 3080. Change the settings to "High" and we get even better FPS.
 

I'm referring to RT specifically as a next gen feature.

The old rasterised-only games will fade out soon enough, leaving only games with RT features as standard.
 