
10GB VRAM enough for the 3080? Discuss..

And Hardware Unboxed has the 1080ti spanking the 2070 at 4k ultra nightmare (98fps vs 74fps). By comparison, it is 98fps vs 84fps at ultra quality.

https://youtu.be/csSmiaR3RVE?t=794

No, you've misinterpreted that. The 98/84 results were achieved using ultra nightmare settings on everything bar the texture pool size, which was set to ultra. This is an option that makes very little difference to visual quality. Next to none.
 
No, you've misinterpreted that. The 98/84 results were achieved using ultra nightmare settings on everything bar the texture pool size, which was set to ultra. This is an option that makes very little difference to visual quality. Next to none.
I didn't write the word "texture" because I assumed that it would be obvious, since that was what was being discussed, but clearly it wasn't.

The reason Doom is brought up is that it puts to bed the argument that GPUs run out of horsepower before they run out of VRAM. Clearly that isn't true; it depends on the game.

The visual issue has been discussed, and the only information we have is that it reduces texture pop-in by loading high-detail textures earlier; however, nobody is certain. The developers of this highly optimised game engine decided to take the time to implement and test this quality setting. So the developers clearly thought that it made enough of a visual difference to put the time into implementing and testing it. Alas, unless we plan on asking the developers, we will never truly know what it does.
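For anyone wondering what a texture pool actually governs, here is a rough toy sketch (plain Python, purely illustrative; the budget figures and the LRU eviction policy are my assumptions, not id Tech code). The pool is just a VRAM budget for streamed textures: a larger budget lets more high-detail mips stay resident between frames, so fewer have to be streamed in late.

```python
# Toy sketch of a texture streaming pool. Illustrative only: the sizes and
# the LRU eviction policy are assumptions, not how id Tech actually works.
from collections import OrderedDict

class TexturePool:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb      # VRAM reserved for streamed textures
        self.resident = OrderedDict()   # texture_id -> size_mb, oldest first
        self.used_mb = 0

    def request(self, texture_id, size_mb):
        """Ask for a texture this frame; returns True if it was already resident."""
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)  # mark as recently used
            return True                            # no late load, no pop-in
        # Evict least-recently-used textures until the new one fits the budget.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used_mb -= evicted_mb
        self.resident[texture_id] = size_mb
        self.used_mb += size_mb
        return False                               # had to stream it in late: pop-in risk

small_pool = TexturePool(budget_mb=2048)   # "ultra"-ish budget (made-up number)
large_pool = TexturePool(budget_mb=4096)   # "ultra nightmare"-ish budget (made-up number)

# First request misses (streamed in), the repeat hits the pool:
print(large_pool.request("rock_albedo_4k", size_mb=64))  # False: streamed in
print(large_pool.request("rock_albedo_4k", size_mb=64))  # True:  already resident
```

Under that (assumed) model the textures themselves are identical at either setting; only the budget changes, which would explain why any visual difference shows up as pop-in rather than texture detail.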
 
I didn't write the word "texture" because I assumed that it would be obvious, since that was what was being discussed, but clearly it wasn't.

The reason Doom is brought up is that it puts to bed the argument that GPUs run out of horsepower before they run out of VRAM. Clearly that isn't true; it depends on the game.

The visual issue has been discussed, and the only information we have is that it reduces texture pop-in by loading high-detail textures earlier; however, nobody is certain. The developers of this highly optimised game engine decided to take the time to implement and test this quality setting. So the developers clearly thought that it made enough of a visual difference to put the time into implementing and testing it. Alas, unless we plan on asking the developers, we will never truly know what it does.
Yeah, but some developers like to put tessellation under the water in games, etc. Should we not be using our own eyes and common sense rather than thinking there must be a reason even though we can't see the difference?
 
I didn't write the word "texture" because I assumed that it would be obvious, since that was what was being discussed, but clearly it wasn't.

We weren't just discussing textures, and you clearly said ultra settings. The ultra setting is a preset. Those benchmarks were not done at ultra settings. So no, it wasn't obvious what your intention was, especially as you gave the numbers for "ultra quality" when in reality it was a gnat's hair away from being full ultra nightmare quality.

The reason Doom is brought up is that it puts to bed the argument that GPUs run out of horsepower before they run out of VRAM. Clearly that isn't true; it depends on the game.

An unrealistic texture pool size puts to bed no argument, especially when it makes almost no difference to visual quality at the cost of an additional, what, 2GB of VRAM usage at 4K?

The developers of this highly optmised game engine decided to take the time to implement and test this quality setting. So the developers clearly thought that it made enough of a visual difference to put the time into implementing and testing it. Alas unless we plan on asking the developers we will never truly know what it does.

Are you sure about that? You know Doom 2016 had the same silly options, right? Do you remember Doom 3 having an option to run uncompressed textures? I do, and I remember it making no difference at all. id Software have a history of adding options that don't always make sense, and they aren't always doing it for the IQ gains.

You got one thing right: DE is a highly optimised game, to the point where some of the options make almost no difference to IQ.
 
Most of this type will be playing 1080p or 1440p (they've not realised 3080s shine at 4K only)

I just wanted to pick up on this point because I feel it's slightly misleading (unintended, I'm sure).

The 3080 appears to shine at 4K because that's the resolution with the lowest frame rate, for obvious reasons. In fact, performance scaling doesn't improve with an increase in resolution; it improves with a decrease in frame rate.

Example: Death Stranding is so well optimised at 4K that you can see the GPU usage taper off as the frame rate approaches triple figures. I haven't seen this repeated with other titles at 4K yet, but only because they are not able to be driven so fast; I'll try to load up a couple of examples today and test that.

I suspect this is down to CPU capability and driver optimisation, although that's just an educated guess.
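To put some entirely made-up numbers on that, here is a toy frame-time model (Python, illustrative only): treat a frame as costing roughly max(GPU time, fixed CPU/driver time), and the faster card's advantage shrinks once frame rates get high enough for the fixed CPU cost to dominate. The 30% GPU gap, the 6ms CPU cost and the frame times below are all assumptions.

```python
# Toy frame-time model with made-up numbers (not measured data):
# treat a frame as roughly max(GPU time, fixed CPU/driver time).
def fps(gpu_ms, cpu_ms):
    return 1000.0 / max(gpu_ms, cpu_ms)

CPU_MS = 6.0  # assumed fixed CPU/driver cost per frame

# Hypothetical GPU frame times for card A; card B is assumed 30% faster.
scenarios = [("4K (GPU-bound)", 16.0), ("1440p", 7.5), ("1080p (CPU-bound)", 5.5)]
for label, gpu_a_ms in scenarios:
    gpu_b_ms = gpu_a_ms / 1.3
    a, b = fps(gpu_a_ms, CPU_MS), fps(gpu_b_ms, CPU_MS)
    print(f"{label}: {a:.0f} fps vs {b:.0f} fps -> {b / a:.2f}x scaling")
```

With these assumed numbers the scaling comes out at 1.30x at 4K, roughly 1.25x at 1440p and 1.00x at 1080p, which is the same shape as the Death Stranding behaviour described above: the advantage tapers off as the frame rate climbs towards the fixed CPU ceiling.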
 
Yeah, but some developers like to put tessellation under the water in games, etc. Should we not be using our own eyes and common sense rather than thinking there must be a reason even though we can't see the difference?
Where did I say that we shouldn't evaluate things ourselves? Also, wasn't the tessellation issue a lack of optimisation?
Since you decided to bring it up: if you want to evaluate it, then do it properly. Buy the game. Play through the entire game (not just a small section) and flick between the two texture details. Then come back and give us your detailed breakdown. Don't watch some compressed YouTube video and try to make a decision from that.


you said ultra settings. The ultra setting is a preset.

Did I?

And Hardware Unboxed has the 1080ti spanking the 2070 at 4k ultra nightmare (98fps vs 74fps). By comparison, it is 98fps vs 84fps at ultra quality.

https://youtu.be/csSmiaR3RVE?t=794


An unrealistic texture pool size puts to bed no argument,

The difference between Ultra nightmare textures and ultra textures is about 1-2GB, for a maximum usage of about 9GB (from memory). So I disagree with the notion that it is an "unrealistic texture pool size". It is a texture pool size that we will be reaching very soon, so it is clearly realistic. Or do you believe that games will never use more than 8GB of VRAM?

especially when it makes almost no difference to visual quality at the cost of an additional, what, 2GB of VRAM usage at 4K?

Let's assume that there is no visual difference, since it seems that none of us here own the game, so we can't really check. Just because something is not discernible to our eyes does not mean that it has no effect on the work that the GPU is doing or the code that is being executed. See TNA's tessellation example above. It made no visual difference but affected performance, right? Therefore, in reference to the original simplified argument (below), it is valid.

The reason Doom is brought up is that it puts to bed the argument that GPUs run out of horsepower before they run out of VRAM. Clearly that isn't true; it depends on the game.
 
Are we nitpicking now? OK, ultra quality = ultra preset. You either meant to say ultra preset, in which case you were wrong, or you meant what you then later claimed you meant - that you were only talking about the texture pool size - in which case you were misrepresenting the results by calling them 'ultra quality', because every other setting used in that particular benchmark was set to ultra nightmare. So which is it? Did you misinterpret, or did you misrepresent?

So I disagree with the notion that it is an "unrealistic texture pool size". It is a texture pool size that we will be reaching very soon, so it is clearly realistic. Or do you believe that games will never use more than 8GB of VRAM?

It's unrealistic because it makes next to no difference. You can disagree with that if you like. You won't change my opinion.

Let's assume that there is no visual difference, since it seems that none of us here own the game, so we can't really check. Just because something is not discernible to our eyes does not mean that it has no effect on the work that the GPU is doing or the code that is being executed. See TNA's tessellation example above. It made no visual difference but affected performance, right? Therefore, in reference to the original simplified argument (below), it is valid.
Well that's a strawman, isn't it? Because obviously asking a GPU to do more of something will require more work. That isn't in question and it's not anything I've argued against. But further to that, it reinforces my point: stop doing stuff that makes no difference to IQ, and stop using such games as examples of current games which can cripple 8GB cards. Shock horror: throw enough caching at a card and you can cripple any GPU.
 
Are we nitpicking now? OK,
No, I was correcting you. Don't say that someone said something when they clearly didn't. Ironically doing that can be a strawman in certain situations.

Are we nitpicking now? OK, ultra quality = ultra preset. You either meant to say ultra preset, in which case you were wrong, or you meant what you then later claimed you meant - that you were only talking about the texture pool size - in which case you were misrepresenting the results by calling them 'ultra quality', because every other setting used in that particular benchmark was set to ultra nightmare. So which is it? Did you misinterpret, or did you misrepresent?

Let's see what I said:
I didn't write the word "texture" because I assumed that it would be obvious, since that was what was being discussed, but clearly it wasn't.

Now let's apply what I said so we can see what I meant. In hindsight, a comma is missing.

And Hardware Unboxed has the 1080ti spanking the 2070 at 4k, ultra nightmare texture quality (98fps vs 74fps). By comparison, it is 98fps vs 84fps at ultra texture quality.

https://youtu.be/csSmiaR3RVE?t=794

It's unrealistic because it makes next to no difference. You can disagree with that if you like. You won't change my opinion.

I'm assuming "no difference" is in reference to image quality. My bad, I forgot humans can't see more than 8GB of VRAM /s

Don't worry, I'm not interested in changing your opinion. I also don't have to; game devs will do that. Let's see you hold on to that opinion when you're buying a 4080 with 16GB of VRAM.

Well that's a strawman, isn't it?

No it isn't, because the example directly relates to the original simplified argument and your statement about how it doesn't put the argument to bed (reworded but retains the same meaning ;)). Though I can see why you would think that, since I split the sentence up.
 
It wasn't me who got the benchmark details wrong, chuk_chuk. Give it a rest. And for the record, even if you meant 'ultra texture quality' you'd STILL be wrong, because it wasn't any texture quality related setting that was changed. It was the pool size, as I keep saying. All the texture quality settings were set to ultra nightmare, not ultra. What excuse will you use next? :D

[Image: Bnuzde6h.jpg (screenshot of the in-game settings menu)]


Look at all those settings. They were all set to ultra nightmare, bar the pool size, in the link you posted. Do you think what you said was a fair representation of that? Rhetorical question; don't answer it, as we already know you do. But you're wrong.


No it isn't, because the example directly relates to the original simplified argument and your statement about how it doesn't put the argument to bed (reworded but retains the same meaning ;)). Though I can see why you would think that, since I split the sentence up.
Yeah, no. It really is. And I'm glad you're not interested in trying to change my opinion, because I'm sensible about these matters. Settings will be important to me when they make positive visual differences, and they will be completely irrelevant when they don't. And that means I won't ever use those daft settings as a way to 'prove' how games are already taxing new cards.
 
So much e-peen hanging around in this thread. It's been done to death already, and the fact is 10GB is fine for this generation of card due to how fast it loads textures in and out of memory as required...
It doesn't matter if a game requests 12GB of VRAM; the card will give the maximum it can and load stuff in and out due to the way this architecture works, negating the need for 12GB of VRAM with no effect on performance...

The ones claiming 10GB is not enough need to go read up on how the new VRAM works with Nvidia's controller... You also need to get a grip: the 3080 would run out of horsepower before VRAM is an issue at 4K. If this was the 4080 I would agree, but the fact is it's a 3080, and it can now load data in and out on the fly as fast as required in a scene... Not that it's going to need to, outside of one or two quoted scenarios where a user is modding a game to use textures where no one would see the difference with the human eye while actually gaming, versus a zoomed-in screenshot with big circles highlighting the changes...


Honestly, this conversation is really flipping old at this point. Yes, on older generations of cards 10GB 'could be' an issue over the next 3-4 years, but those cards can't even game at 4K at the FPS we are discussing.

The 3080 works differently. AMD have come up with their super cache tech; Nvidia went brute-force raw speed. Same thing, different approach. 10GB on the 3080 is fine; you are not going to hit a problem until the card is virtually obsolete anyway.
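For what it's worth, here is a crude way to picture the "requests 12GB" point (toy Python with made-up numbers; this is not a description of Nvidia's actual memory controller). What a game allocates matters far less than the data it actually touches each frame; a penalty only appears when that per-frame working set itself no longer fits in VRAM and has to be shuffled over the bus.

```python
# Toy illustration of "requested/allocated" VRAM vs the per-frame working set.
# All numbers are made up; this is not how any real driver manages memory.
VRAM_GB = 10.0

def extra_frame_ms(allocated_gb, working_set_gb, transfer_gb_per_ms=0.02):
    """Extra time spent streaming if the data touched this frame spills over VRAM.

    allocated_gb is deliberately ignored: merely *requesting* more than the
    card has is not the problem in this simplified view.
    """
    spill_gb = max(0.0, working_set_gb - VRAM_GB)
    return spill_gb / transfer_gb_per_ms  # time to pull the spilled data across the bus

# A game may request 12 GB but only touch ~7 GB in a given frame: no penalty.
print(extra_frame_ms(allocated_gb=12.0, working_set_gb=7.0))   # 0.0 ms
# Trouble only starts when the per-frame working set itself exceeds VRAM.
print(extra_frame_ms(allocated_gb=12.0, working_set_gb=11.0))  # 50.0 ms, a visible stutter
```

Whether real games at 4K ever push the per-frame working set past 10GB is, of course, exactly what the rest of this thread is arguing about.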
 
The real problem is that, for £650+, there should be no question in the first place. The 3080 should have released with more, full stop. Nvidia would not be bringing out 20GB cards in the first place if they thought 10GB was enough. When was the last time a supposed Nvidia flagship 80 card had a 2x VRAM version, or even the 70?

Is 10GB enough though? The majority of users most likely won't run into any issues. The main issue is that, compared to the last two flagships, it has less VRAM, which does not sit well. If you compare it to the last two non-Ti 80 cards, it has more. Pretty much why Jensen calling it the flagship was a daft idea, because it's clearly not.
 
Oh woe is me, I got texture quality and pool size mixed up. Whatever shall I do.

Since it nullifies the point you tried to make... Get it right in future? Or, you know, just agree that you misread it like I said initially and move on because now it's clear you didn't just misread it. Woe is you indeed.
 
The real problem is that, for £650+, there should be no question in the first place. The 3080 should have released with more, full stop. Nvidia would not be bringing out 20GB cards in the first place if they thought 10GB was enough. When was the last time a supposed Nvidia flagship 80 card had a 2x VRAM version, or even the 70?

Is 10GB enough though? The majority of users most likely won't run into any issues. The main issue is that, compared to the last two flagships, it has less VRAM, which does not sit well. If you compare it to the last two non-Ti 80 cards, it has more. Pretty much why Jensen calling it the flagship was a daft idea, because it's clearly not.

Are you saying that Nvidia thought 10GB was enough and then got it wrong? That could be a whole new 970 mess, with cards having to be refunded.
 