
10GB VRAM enough for the 3080? Discuss...

What if you are happy to play at 40-60 fps at 4K maxed on games like Watch Dogs or GTA, but now you have to reduce texture quality purely because of a lack of VRAM?

Anyway, the 3080 Ti is warming up, so the 3080 is getting closer and closer to midrange :p
Then if that happens and it is a problem for you, you upgrade to a card with more VRAM. A 3080 Ti will likely be coming; feel free to pay £300 more for the extra VRAM and 5-10% extra performance :p

Unless the 3080 Ti is faster than a 3090, it is not possible for the 3080 to go midrange, as there is already only a tiny gap between the 3080 and the 3090 anyway.
 
Do I think that a flagship card of a new generation is going to have enough power to max out a game of this or the new generation? If not... don't you think that would be a bit... weird? At 1440p, at least, it should be possible to fully max any game at Ultra. 4K 60fps should be achievable in the majority of well-developed and optimised games.

RT is the only thing I have doubts about any card from Nvidia or AMD fully maxing, depending on how it's implemented.


Both are unoptimised Ubisoft pieces of crap. They do it every single generation.

So "no" then, right?

I don't understand why not being able to max out ALL games is excusable right up until VRAM is the cause. When the cause is a lack of GPU power, it seems to be acceptable for some reason?
 
Probably because you can potentially unlock a few more fps by overclocking, water-cooling etc., but when your VRAM is maxed out you have nothing left to do but reduce settings.
 
Project CARS 2, an old title, runs in the high 20s on my 1080 Ti/HP Reverb combo when maxed. It only uses a little over 7GB of VRAM when it's maxed out, but unless an overclocked 3090 is 3 times as powerful as my 1080 Ti, I will have to turn down settings no matter what card I buy.

GPU horsepower is currently insufficient for this task. "Flagship" or not.
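Rough arithmetic behind that "3 times" figure, assuming the Reverb's 90Hz refresh is the target and "high 20s" means roughly 28fps:

    90 fps / 28 fps ≈ 3.2x, i.e. the new card would need over three times the 1080 Ti's performance to hold native refresh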
 
I don't think I run max settings on any game I play, as FPS and the reduced input lag that comes with it are more important to me than eye candy.
Obviously there is a point where lowering settings makes the game visibly worse.
That is on a 100Hz 3440x1440 screen and a 3080. Do I think 10GB is not enough? Yes, probably, but at the £649 I paid I'm happy with that, and the looks and build quality of the FE card are miles better than any other card.
 
I do find it interesting how the narrative is subtly and slowly changing from where it started at "10GB VRAM is enough for this generation, period" to statements like: "Issues won't exist with most games. At most you won't be able to run the higher texture settings."

10GB VRAM is enough for this generation, period.

Ok, I don't know who you think said that, but that was not the narrative at all, and certainly not a general consensus. Depending on who you listen to, it's either a) 10GB isn't enough and Nvidia are a laughing stock, or b) we don't know, but it hasn't proven to be an issue YET.

Remember, this thread started by asking a question, and answers of 'it's not enough because of (x game)' were being given without proof, and it wasn't until other people started looking into these games that holes started appearing in that argument. If anybody is getting the impression that I, or princessFrosty, or anybody else who's taken the time to do at least some investigation (and in all fairness, the majority of it was frosty - he's got way more patience than I have with the abuse he's been getting), thinks that 10GB will always be enough, then you have got the wrong impression. Stop thinking that. What we have said is that, so far, there's no indication that any of the games given as examples are reasonable examples of a game that genuinely a) needs more than 10GB of VRAM and b) actually makes use of it.

Take this latest example, Watch Dogs at 1080p. People are going nuts about the 3070 tanking at 1080p and yes, it really does... but by the looks of it, so does the 11GB 2080 Ti at the same resolution and using the same settings. So is that a genuine VRAM-limited scenario or is it just an as-per-usual terribly optimised game? We need more data on that to make any kind of conclusive decision.

RichDog said:
Do I think that a flagship card of a new generation is going to have enough power to max out a game of this or the new generation? If not... don't you think that would be a bit... weird? At 1440p, at least, it should be possible to fully max any game at Ultra. 4K 60fps should be achievable in the majority of well-developed and optimised games.
You've asked a question along these lines already and I've answered it once. There are already games that the 3080 can't manage 4K60 on, and not because of VRAM. Where's all the fuss about that? Why are people bothered about turning down the texture settings one notch (especially in cases like Doom Eternal where it makes ZERO difference...) but anything else gets a pass and gets ignored? Doesn't make any sense to me. What's worse is that this isn't even unique; I can't remember a time when a new card has ploughed through everything with max settings. There's always been a game or two that drops the average FPS below 60 at launch. This is, or was, perfectly normal. But suddenly now it isn't? What's changed then? Damned if I know...
 
So based on TechPowerUp's review of Assassin's Creed Valhalla, not only does it look better than next-gen consoles on PC, it looks better than Watch Dogs Legion and only uses 6GB of VRAM. LOL!



Ok, I don't know who you think said that, but that was not the narrative at all, and certainly not a general consensus...
Give it a few days, he will be back again saying the same things. As I said, it seems like the fella is a few cards short of a full deck :p
 
You deserve this!

:D :p

Good memories. :)

Good memories indeed :D.

The good ol' days when GPU stock was better and a handshake was welcomed :)



I see this as me honouring the great LtMatt. To some people in this world, you're a legend. (least you had the balls to have a laugh, Tommy did too until the Scottish winds hit him and I think they returned inside)

:D :D

:D
Balls of steel mate, got loads of padding to keep them toastie!

Still got my pic up on the wall @LtMatt?

We had way more banter back then; we could be at each other's throats when there was a disagreement and have a laugh about it.
 
Jesus. I've got to say, you're incredibly devoted to defending the 3080's 10GB VRAM issue. So many posts from you in this thread, fearlessly defending your 10GB honour.

I wonder how long you'll keep it up, weeks, months, years? :eek:
The second I don't think it's true I will say so; I flow with the information. If the available information changes, so will I.
 
Anyway, the 3080 Ti is warming up, so the 3080 is getting closer and closer to midrange :p
How so? The 3080 is only 8% slower than the 3090 at 1440p. Custom AIB models like the Strix match the stock 3090.

They should call it the 3080 Super, not Ti, as the performance increase is not enough to warrant the Ti moniker.
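Rough numbers behind that, taken from the figures above and assuming a 3080 Ti slots between the two existing cards:

    3090 = 100, 3080 ≈ 92 at 1440p, so a 3080 Ti lands somewhere in the 92-100 band, at best ~8% ahead of the 3080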
 
Still got my pic up on the wall @LtMatt?

LOL, classic. I am gonna save that one. :D :D :D
 
What we have said is that, so far, there's no indication that any of the games given as examples are reasonable examples of a game that genuinely a) needs more than 10GB of VRAM and b) actually makes use of it.

All claims of large VRAM requirements have so far been debunked with evidence, demonstrating that games of this generation really do not need more than 10GB; in fact, the average for AAA games over the last few years is about 6GB. I think the only people left who seriously disagree with this are people who are ignorant of how real memory usage works and base memory usage on what is allocated.

But in addition to that, I think I've started to build a case for another claim: that these cards will be reasonably well future-proofed from a memory standpoint. That comes from looking at the most demanding games today at 4K Ultra, which include the likes of FS2020, Crysis Remastered, Avengers and Watch Dogs Legion, and seeing that not only is 10GB of VRAM enough for a 3080, but the GPU struggles to run these games at 4K Ultra and the settings need to be dropped down. They're an excellent example of a GPU bottleneck. I've also done quite detailed comparisons with the console GPUs, which are a good barometer of the games to come in the next generation, both in terms of them having the same amount of effective VRAM and in the direction they're going with their new technology: investing big in disk speed and asset streaming with DirectStorage, fast disks and memory controllers.

So there are speculative claims both ways about future use, but at least mine has some evidence and reasoning behind it. And I keep an open mind about future games; let's just get them, measure them and see what is what. But from all of my testing so far, a noticeable pattern has emerged:
1) Speculative measurements of VRAM are usually wrong, confusing allocated with used.
2) The amount actually used is always less, and usually way less.
3) As games push towards the 10GB limit of real usage, the GPU chokes first.

The biggest win out of all this, in my opinion, is the revelation that we've been measuring this badly for a long time and now finally have tools that allow us to do it properly, and that's a good thing. We can start adjusting our expectations to be more reasonable and sensible. Everyone benefits from that.
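For anyone who wants to see how far apart those numbers can be for themselves, here's a rough sketch (not the tooling used for the testing above) using NVIDIA's NVML Python bindings (the pynvml module). It prints the board-level counter that usually gets quoted as "VRAM usage" next to the per-process figures. Bear in mind that even the per-process numbers are still driver-side allocations rather than what a game actually touches each frame, which is why dedicated per-process monitoring tools are still needed for the real-usage question.

    import pynvml

    # Rough illustration only: NVML reports driver-side allocations, not the
    # memory a game actually touches per frame.
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    # Board-level counter: the number most overlays quote as "VRAM usage".
    # It covers every process plus driver/OS reservations, not just your game.
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Board: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")

    # Per-process breakdown (graphics clients, e.g. a running game). Still an
    # allocation figure, but at least it is attributed to a single process.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory  # may be None where the driver can't report it
        shown = "n/a" if used is None else f"{used / 2**30:.1f} GiB"
        print(f"PID {proc.pid}: {shown}")

    pynvml.nvmlShutdown()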
 
You know we are at the start of a new gen and the concern is not the current (and about to be previous) gen... right?

Sure, my last post acknowledged that. I said 2 things:
1) That current gen is fine by all available evidence.
2) An argument can be made for next gen being fine as well, and I gave a summary of my reasons for why I think that.

I didn't infer from the current gen that the next gen would be fine; I have separate reasons for believing each of those things, which I've listed.
 
2) An argument can be made for next gen being fine as well, and I gave a summary of my reasons for why I think that.
An argument can indeed be made for anything... whether it actually makes sense or not is another matter. The fact is you do not know if 10GB will be enough, and I do not know that it won't. However, the risk is there and it is more than plausible... plausible enough to make me very happy that I didn't spend 800 quid on a 10GB 3080 "flagship" card. The rumours of Nvidia now rushing out revised "Ti" models next year to compete with AMD, who did the right thing and put 16GB on their flagship cards, seem to confirm this suspicion. Much of the hardware community and a fair number of professional reviewers also seem to agree.

At this point, I think everything has been said by both sides, and let's just see how it plays out. I will be looking forward to reading news that relates to this. :)
 