10GB VRAM enough for the 3080? Discuss..

Caporegime
Joined
4 Jun 2009
Posts
31,045

:cry:

Reviews and release fps charts are for apples-to-apples comparisons, so that a percentage performance increase over earlier cards can be measured.

Anyone who actually games at 4K knows that you don't turn some settings to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, freeing up GPU resources. Anyone using review fps to claim whether a card is a 4K card or not doesn't know what they are doing. Probably the same people that use gfx presets, whereas an experienced PC gamer will spend time going through all the available settings to get the best fps/IQ balance for a particular game, especially at higher resolutions. Presets are just four or five shortcuts for folk who don't understand all the graphical settings and what they are for. The added bonus of PC gaming is the plethora of settings available for fettling to get the best gaming experience for your monitor's refresh rate and IQ. You just cut your cloth according to your setup.

It amazes me how many people go on about 4K but have never gamed at it.

Exactly this. Even people with all the VRAM in the world turn down settings because they want to hit a certain fps. As tna said, things like motion blur, DOF, film grain, lens effects, CA and so on all go straight off.

Funnily enough, I actually have more settings at max for 4K than I do when gaming on my 3440x1440 144Hz display, and this is largely because:

- it is easy to get a locked 60 at 4K when using DLSS on the 3080
- whereas on my 3440x1440 IPS 144Hz panel, I want at least a constant 80/90 or even 100+ fps depending on the game (in order to get some of the benefit of a 144Hz panel; FreeSync is great, but low fps is still low fps when it comes to motion clarity and input lag), and the only way to do this is to reduce settings (see the frame-time sketch below)
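
To put rough numbers on the motion clarity/input lag point, here's the basic frame-time arithmetic (nothing card-specific, just the fps targets mentioned above):

```python
# Frame time shrinks non-linearly with fps, which is why adaptive sync at
# low fps still feels worse than a genuinely high frame rate: each frame
# stays on screen (and your input waits) for longer.
for fps in (60, 80, 100, 144):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:4.1f} ms per frame")

# Output:
#  60 fps -> 16.7 ms per frame
#  80 fps -> 12.5 ms per frame
# 100 fps -> 10.0 ms per frame
# 144 fps ->  6.9 ms per frame
```

Going from 60 to 100 fps saves about 6.7 ms per frame, while going from 100 to 144 saves only about 3.1 ms, which is why the mid-range fps gains are the ones worth trading settings for.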
 
Soldato
Joined
21 Jul 2005
Posts
20,044
Location
Officially least sunny location -Ronskistats
Reviews and release fps charts are for apples-to-apples comparisons, so that a percentage performance increase over earlier cards can be measured.

Anyone who actually games at 4K knows that you don't turn some settings to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, freeing up GPU resources. Anyone using review fps to claim whether a card is a 4K card or not doesn't know what they are doing. Probably the same people that use gfx presets, whereas an experienced PC gamer will spend time going through all the available settings to get the best fps/IQ balance for a particular game, especially at higher resolutions. Presets are just four or five shortcuts for folk who don't understand all the graphical settings and what they are for. The added bonus of PC gaming is the plethora of settings available for fettling to get the best gaming experience for your monitor's refresh rate and IQ. You just cut your cloth according to your setup.

It amazes me how many people go on about 4K but have never gamed at it.

Good points. Nice constructive post without being personal! :) How it should be...
 
Soldato
Joined
21 Jul 2005
Posts
20,044
Location
Officially least sunny location -Ronskistats
Define how it's not a 4K card then? Come on, you have made this claim, so the onus is on you to prove it...

I guess the 6800 XT, 6900 XT and even the 3090 aren't 4K cards either, since in certain games they too need settings turned down?

Better keep mining on the 3090 to make sure you get your value for money :cry:

Lost me. I have maintained that the card is marketed as a 4K card for ages (just like the AMD 6800 upwards). Who said it is not one? Check the image below, like what AMD puts on its boxes (see the bottom left corner):
[attached image: retail box shot showing the 4K marketing claim in the bottom left corner]

It's only you that seems to have an issue understanding. Get some new material. Again, a personal dig at the end; what has this got to do with the topic...
 
Associate
Joined
31 Dec 2008
Posts
2,284
Anyone who actually games at 4K knows that you don't turn some settings to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, freeing up GPU resources.
I disagree with that; if that were the case, some of those settings would be disabled on next-gen consoles, which are 4K(ish), but they are not.
Those settings are personal preference: some might like them whilst others might not.
 
Caporegime
Joined
4 Jun 2009
Posts
31,045
Good points. Nice constructive post without being personal! :) How it should be...

Personal... :cry:

Lost me. I have maintained that the card is marketed as a 4K card for ages (just like the AMD 6800 upwards). Who said it is not one? Check the image below, like what AMD puts on its boxes (see the bottom left corner):
[attached image: retail box shot showing the 4K marketing claim in the bottom left corner]

It's only you that seems to have an issue understanding. Get some new material. Again, a personal dig at the end; what has this got to do with the topic...

Oh really... am I going to have to dig up some old posts now? ;) You have been insinuating that the 3080 is not a 4K card because of "its issues at this res with 10GB VRAM"... Seems like, because there has been zero proper evidence of this, you are now backpedalling and saying that your take was always about the "marketing" side only?

PS. Still waiting on this:

Has there been any proof to show that the extra 2GB of VRAM is actually benefiting the 3080 12GB? (Outside of the "overall" specs actually being better than the 10GB version...)

Since you made this statement:

Fingers-in-ears time again! :p Can't really argue now they have specifically made a 3080 12GB edition. Think that cements the real way to remove the issue (2GB of headroom is needed), especially when you get to the cohort of 4K gaming and high-res textures.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
I disagree with that; if that were the case, some of those settings would be disabled on next-gen consoles, which are 4K(ish), but they are not.
Those settings are personal preference: some might like them whilst others might not.


On modern consoles you still select the resolution, so these options are still available for those with 1080p TVs or other monitor resolutions? I don't know what settings are available on a modern console. The PS5 outputs 4K @ 120Hz, but can it play older games at 4K 120fps?
 
Soldato
Joined
21 Jul 2005
Posts
20,044
Location
Officially least sunny location -Ronskistats
All it quotes relates to my question (which is a valid one) about why Nvidia are even bothering to release the 12GB variant. I have never stated the 3080 is "not able to play 4K"; that's one of your latching-on things again, twisting to suit your interpretation. The OP is about discussing whether 10GB is enough, which is what I am discussing. As said above by the others, you can tweak down the settings to mitigate any issues. What I'm asking is: how are you going to know if there are any issues with VRAM if people are playing at 1440p or dropping settings? It needs people with a 3080 to test that, and to push the VRAM limits you likely need to be playing at 4K to begin with and probing the high settings, otherwise you will not have any VRAM issues!
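
For anyone who wants to actually test this rather than argue about it, here's a minimal sketch of how you could log VRAM usage while playing, assuming an NVIDIA card with the standard nvidia-smi tool on the PATH (and bear in mind it reports allocated memory, not what a game strictly needs):

```python
import subprocess
import time

# Poll nvidia-smi once a second and print used/total VRAM (Ctrl+C to stop).
# Run this in the background, play at 4K with your real settings, and see
# how close you actually get to the 10GB ceiling.
QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.check_output(QUERY, text=True).strip()
    used_mib, total_mib = (int(v) for v in out.split(","))
    print(f"{used_mib} / {total_mib} MiB ({100 * used_mib / total_mib:.0f}% used)")
    time.sleep(1)
```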
 
Soldato
Joined
30 Dec 2013
Posts
6,289
Location
GPS signal not found. (11)
All it quotes relates to my question (which is a valid one) about why Nvidia are even bothering to release the 12GB variant. I have never stated the 3080 is "not able to play 4K"; that's one of your latching-on things again, twisting to suit your interpretation. The OP is about discussing whether 10GB is enough, which is what I am discussing. As said above by the others, you can tweak down the settings to mitigate any issues. What I'm asking is: how are you going to know if there are any issues with VRAM if people are playing at 1440p or dropping settings? It needs people with a 3080 to test that, and to push the VRAM limits you likely need to be playing at 4K to begin with and probing the high settings, otherwise you will not have any VRAM issues!
Nvidia can probably make more money from the 12GB card than the 10GB version. Whether that's because they can charge a higher MSRP or because it's easier to make, I dunno.

For the OP question, my answer is yes. I have not once had to do anything due to only having 10GB of VRAM.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
Essentially this thread is 246 pages of arguing over an issue that doesn't present itself except in poorly optimised new releases, where you pay full RRP to be a beta tester and need the top cards to get the best experience.

Aside from this, the 12GB version isn't just more VRAM: it has more GPU clusters, CUDA cores, RT cores and Tensor cores, plus a wider bus, more bandwidth and a higher TDP.

You can make any game struggle with extra texture packs, mods and so on to make it fall over. The 10GB card is enough and I haven't run into any issues, but nor have I tried to create any. I could have purchased FC6, but the release was in such a poor, beta-like state that, for once, they weren't getting me to pay MSRP for the game. Their loss. I'll get it cheap like I do all my games now, when it's fixed and worth purchasing. Yes, you could play FC6 straight away, but to mitigate the poor optimisation you needed to pay double for a card with more VRAM; the actual ~15% increase in performance wasn't the issue. Sensible people will just wait until the game is fixed and save some money, like they did if they got a 3080 at a decent price.

On release there were no issues and the 3080 was the card to have, with double the money for the 3090 only giving ~15% extra performance. I waited for the 3090 release, and for the 6900 XT, which some on the hype train said would SMASH a 3090, before deciding. Anyone can make a card have issues if you try to create them. Hell, one game (New World) was even frying cards!

These aren't GPU issues; these are game issues. The quality of games on release over the past year or so just confirms this. Hell, CP2077 still hasn't made it to the PS5, and that's fixed, confirmed hardware.

You make the game for the gear, not the gear for the game. Hence why CP2077 missed out on all those PS5 sales. Even BF2042 is being refunded due to its poor state, even after the 2-hour maximum playtime for refunds has been exceeded.
 
Associate
Joined
8 Oct 2020
Posts
2,329
All it quotes relates to my question (which is a valid one) about why Nvidia are even bothering to release the 12GB variant. I have never stated the 3080 is "not able to play 4K"; that's one of your latching-on things again, twisting to suit your interpretation. The OP is about discussing whether 10GB is enough, which is what I am discussing. As said above by the others, you can tweak down the settings to mitigate any issues. What I'm asking is: how are you going to know if there are any issues with VRAM if people are playing at 1440p or dropping settings? It needs people with a 3080 to test that, and to push the VRAM limits you likely need to be playing at 4K to begin with and probing the high settings, otherwise you will not have any VRAM issues!

It probably has more to do with production than actual performance. The 3080 12GB, 3080 Ti and 3090 all share the same 384-bit memory interface, which allows a single board layout to be used across three cards.
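
To make that concrete: GDDR6X chips each sit on a 32-bit slice of the bus, so capacity falls directly out of bus width and chip density. A quick sketch (the helper function is illustrative; the bus widths and 1GB chip sizes are the real Ampere figures):

```python
# VRAM capacity = (bus width / 32 bits per chip) * chip size,
# doubled if chips are mounted clamshell on both sides of the board.
def vram_gb(bus_width_bits, chip_gb=1, clamshell=False):
    chips = bus_width_bits // 32
    return chips * chip_gb * (2 if clamshell else 1)

print(vram_gb(320))                  # 3080 10GB: 320-bit bus -> 10 chips -> 10GB
print(vram_gb(384))                  # 3080 12GB / 3080 Ti: 384-bit -> 12GB
print(vram_gb(384, clamshell=True))  # 3090: 384-bit, both board sides -> 24GB
```

So once a board is laid out for the full 384-bit bus, the 12GB, Ti and 3090 variants all fit on it; the 10GB card is the odd one out, with two memory channels disabled.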
 
Caporegime
Joined
4 Jun 2009
Posts
31,045
All it quotes relates to my question (which is a valid one) about why Nvidia are even bothering to release the 12GB variant. I have never stated the 3080 is "not able to play 4K"; that's one of your latching-on things again, twisting to suit your interpretation. The OP is about discussing whether 10GB is enough, which is what I am discussing. As said above by the others, you can tweak down the settings to mitigate any issues. What I'm asking is: how are you going to know if there are any issues with VRAM if people are playing at 1440p or dropping settings? It needs people with a 3080 to test that, and to push the VRAM limits you likely need to be playing at 4K to begin with and probing the high settings, otherwise you will not have any VRAM issues!

OK, let's keep to the topic then.

For the hundredth time now: post these issues you talk of, with proof.

You do realise there are plenty of comparisons/reviews out there showing the 3080 playing games at 4K with max settings? But I guess they don't count, for reasons...

And again, there are plenty of people on these forums who are playing at 4K (me included), and we have all said there aren't issues except in extreme, limited cases like modding already very demanding games with 4-8K textures.

Essentially this thread is 246 pages of arguing over an issue that doesn't present itself except in poorly optimised new releases, where you pay full RRP to be a beta tester and need the top cards to get the best experience.

Aside from this, the 12GB version isn't just more VRAM: it has more GPU clusters, CUDA cores, RT cores and Tensor cores, plus a wider bus, more bandwidth and a higher TDP.

You can make any game struggle with extra texture packs, mods and so on to make it fall over. The 10GB card is enough and I haven't run into any issues, but nor have I tried to create any. I could have purchased FC6, but the release was in such a poor, beta-like state that, for once, they weren't getting me to pay MSRP for the game. Their loss. I'll get it cheap like I do all my games now, when it's fixed and worth purchasing. Yes, you could play FC6 straight away, but to mitigate the poor optimisation you needed to pay double for a card with more VRAM; the actual ~15% increase in performance wasn't the issue. Sensible people will just wait until the game is fixed and save some money, like they did if they got a 3080 at a decent price.

On release there were no issues and the 3080 was the card to have, with double the money for the 3090 only giving ~15% extra performance. I waited for the 3090 release, and for the 6900 XT, which some on the hype train said would SMASH a 3090, before deciding. Anyone can make a card have issues if you try to create them. Hell, one game (New World) was even frying cards!

These aren't GPU issues; these are game issues. The quality of games on release over the past year or so just confirms this. Hell, CP2077 still hasn't made it to the PS5, and that's fixed, confirmed hardware.

You make the game for the gear, not the gear for the game. Hence why CP2077 missed out on all those PS5 sales. Even BF2042 is being refunded due to its poor state, even after the 2-hour maximum playtime for refunds has been exceeded.

ding ding, winner winner chicken dinner
 
Associate
Joined
1 Oct 2020
Posts
1,146
I'm honestly really confused by all this. Is there a central point being debated/debunked/smacked down?

The argument has moved so many times I'm just lost. As far as I can see, 10GB is enough, and will be until it isn't.
 
Caporegime
Joined
4 Jun 2009
Posts
31,045
I'm honestly really confused by all this. Is there a central point being debated/debunked/smacked down?

The argument has moved so many times I'm just lost. As far as I can see, 10gb is enough, and will be until it isn't.

Not missing much tbf, just that we have a couple of people who keep insinuating that the 3080 has issues at 4K because of its VRAM, yet they can't provide a single piece of evidence to back this up, and everything they did provide was "debunked" easily (see the allocation-vs-need sketch below):

- Resident Evil Village: allocates more VRAM if it's available, but it makes no difference to performance.
- Godfall: same again (allocates more VRAM if possible, no performance difference), and in fact with RT turned on, AMD cards suffered some serious fps drops compared to Ampere.
- FC6: a texture rendering/loading issue that was confirmed by Ubisoft developers and fixed a few months back.
- HZD: some randomer from Reddit with very different setups testing an old unpatched version of the game; it has had three patches since then to address texture rendering/loading issues.
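
The "allocates more VRAM if it's available" point is the key one: many engines size their streaming pools from whatever the card reports, so a bigger card showing higher usage in an overlay says nothing about a smaller card being starved. A hypothetical sketch of that kind of budgeting (the function, reserve and fraction are illustrative, not any particular engine's):

```python
def texture_pool_mb(total_vram_mb, reserved_mb=2048, fraction=0.6):
    """Hypothetical engine heuristic: keep a fixed reserve for render
    targets and the OS, then hand a fraction of the rest to texture
    streaming. Pool size scales with the card, not with what the game
    actually needs to hold its working set."""
    return int((total_vram_mb - reserved_mb) * fraction)

# Same game, same settings, different "VRAM usage" purely because more
# memory is available:
print(texture_pool_mb(10_240))  # 3080 10GB -> 4915 MB pool
print(texture_pool_mb(24_576))  # 3090      -> 13516 MB pool
```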

Awesome debunking! Better call Woodsta next time. :cry:

Still not able to address those points, I see :cry:
 
Soldato
Joined
8 Jun 2018
Posts
2,827
It has become quite obvious that 10GB is not enough for that tier of GPU. Some may disagree, but when the manufacturer decides to refresh their GPUs to offer more than 10GB, months before they release their next generation, that is a very telling sign that 10GB isn't enough. And that's on top of what was already discussed with titles at 4K, texture packs, etc.
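
For a sense of scale on the texture-pack point, the usual back-of-the-envelope for what one texture costs in VRAM (block-compression rates and the 4/3 mip factor are standard approximations; the helper itself is just illustrative):

```python
# Approximate VRAM cost of one texture including its mip chain (~4/3 of
# the base level). BC7/BC3 store ~1 byte per texel; uncompressed RGBA8 is 4.
def texture_mib(width, height, bytes_per_texel):
    return width * height * bytes_per_texel * (4 / 3) / (1024 ** 2)

print(f"4K BC7 texture:        {texture_mib(4096, 4096, 1):.0f} MiB")  # ~21
print(f"4K uncompressed RGBA8: {texture_mib(4096, 4096, 4):.0f} MiB")  # ~85

# Swapping a few hundred 2K assets (~5 MiB each in BC7) for 4K versions
# (~21 MiB each) adds several GB on its own, which is how HD texture
# packs and mods blow past a 10GB budget.
```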

 

Raz

Soldato
Joined
18 Sep 2003
Posts
5,184
Location
Nowhere
It has become quite obvious that 10GB is not enough for that tier of GPU. Some may disagree, but when the manufacturer decides to refresh their GPUs to offer more than 10GB, months before they release their next generation, that is a very telling sign that 10GB isn't enough. And that's on top of what was already discussed with titles at 4K, texture packs, etc.

And it has nothing to do with selling 3080 GPUs at a much higher price; they're just concerned that gamers aren't currently able to play games at 4K with all the bells and whistles.

Nvidia, always looking after the gamers.
 
Caporegime
Joined
4 Jun 2009
Posts
31,045
It has become quite obvious that 10GB is not enough for that tier of GPU. Some may disagree, but when the manufacturer decides to refresh their GPUs to offer more than 10GB, months before they release their next generation, that is a very telling sign that 10GB isn't enough. And that's on top of what was already discussed with titles at 4K, texture packs, etc.

Again, let's use that same logic here then:

We could use that same logic for anything: why do Intel release so many CPUs? Why did AMD bother releasing the 5800X3D? Why do monitor manufacturers release so many different models that use the exact same panel? Etc.

Guess AMD felt that the 5800X wasn't up to the task in the CPU space either?

I guess we could also say the same when the 3090 Ti arrives: they felt the 3090 wasn't cutting it for gamers.

We're still waiting on all these issues at 4K with the 3080 10GB to be listed too... Surely if it is such a problem, someone must be able to post some kind of evidence, right?
 