
10GB vram enough for the 3080? Discuss..

If the 3080 doesn't have enough vram then the 3070 Ti definitely doesn't!!!

Yet I don't see a thread with disappointed 3070 Ti users.

I can't help wondering if people at the lower end are just happier and more content with life. High-end users definitely moan more than low-end users.
I have even less horsepower with my 3070 and have been perfectly fine. It has run everything I have thrown at it with no problems.

Oh, and @opethdisciple, I installed Resident Evil 3 recently and it runs maxed out at 4K with the highest textures just fine, as I expected, even with 8GB. Silky smooth 4K60 :D

Fancy a quick playthrough with all the guns I unlocked and unlimited ammo?
 
See, doing silly things like running 50+ mods (and some of the Skyrim Wabbajack compilations are more like 800 mods) is to me the whole reason I run a PC as opposed to a closed system like a console.

Complaining about people running mods seems very strange to me, as for me:
No mods = console = save a fortune on PC hardware.
Tbf, if you're the guy who likes to run a tonne of mods then don't buy a 3080; it's not like you're forced to buy one when you have alternatives like the 3090, 6800 XT or 6900 XT out there.
 
If the 3080 doesn't have enough vram then the 3070 Ti definitely doesn't!!!

Yet I don't see a thread with disappointed 3070 Ti users.

I can't help wondering if people at the lower end are just happier and more content with life. High-end users definitely moan more than low-end users. There's 230 pages of it now.

We actually did have that thread, but the answer was too obvious, so instead everything coalesced around this thread, because the extra 2GB of VRAM now puts you over the hump in enough of the cases where 8GB wasn't enough. Because it's not as cut and dried for 10GB as it was for 8GB, everyone can keep arguing about it, and thus we have a very active thread.

https://forums.overclockers.co.uk/threads/is-8gb-of-vram-enough-for-the-3070.18903623/

People who went for the 8GB cards simply accepted that it's a pitiful amount but were okay with the trade-off (vs AMD: DLSS/RT performance; vs the higher Nvidia cards: cheaper), and for the most part they can play around it; after all, it's still only an issue in a few games/scenarios.
 
People who went for the 8GB cards simply accepted that it's a pitiful amount but were okay with the trade-off (vs AMD: DLSS/RT performance; vs the higher Nvidia cards: cheaper), and for the most part they can play around it; after all, it's still only an issue in a few games/scenarios.
+1

I have yet to hit one such scenario myself and I doubt I will any time soon. If I do, how hard is it to drop the texture setting one notch for that once-in-a-blue-moon case? It's not like next-gen cards are far away now. A year from now I will likely be on a 4070/4080, which will have more VRAM anyway :D
 
Irrelevant to my post. I never said anything about console cost.

It's very relevant if you want to compare hardware, because it has a big impact on what you're saying. There are clear benefits and costs involved in that deal for cheap mid-tier hardware, and you can't just accept the benefits and pretend the costs associated with them don't exist or aren't relevant.

Talks about console prices and their cost, then speculates that MS and Sony don't know how to build consoles and have decided to pay extra to put too much VRAM in them. Okay.
Maybe you should apply to work there, so you can show them how it's done, Mrs Remotely Technical.

I never said they don't know how to build consoles. Sony and MS have the same constraints as Nvidia and AMD do when deciding things like memory size. You can't just arbitrarily pick any number you like; your architecture choices limit you to specific multiples, and sometimes the next multiple down is not enough but the one above it is too much. Look at cards like the 16GB AMD cards, which get nowhere near full usage and with modern games run out of GPU grunt long before coming anywhere close to being full, or older examples like the 1080 Ti with 11GB, which during most of its lifespan never used more than about 7-8GB.
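
To make the "specific multiples" point concrete, here's a rough sketch of the arithmetic (Python, with illustrative numbers; the chip densities listed are just the common GDDR6 options and the function name is my own):

# Rough sketch: each memory chip sits on a fixed-width slice of the bus
# (32-bit for GDDR5/6), so the chip count is set by the bus width, and the
# total capacity can only be chip_count * chip_density (or double that via
# clamshell). Densities below are the common GDDR6 ones, for illustration.
CHIP_DENSITIES_GB = [1, 2]          # 8Gb and 16Gb chips

def possible_vram_sizes(bus_width_bits, chip_bus_bits=32):
    chips = bus_width_bits // chip_bus_bits
    return [chips * d for d in CHIP_DENSITIES_GB]

print(possible_vram_sizes(352))     # 352-bit bus (1080 Ti style) -> [11, 22]
print(possible_vram_sizes(320))     # 320-bit bus (3080 style)    -> [10, 20]
print(possible_vram_sizes(256))     # 256-bit bus (6800 XT style) -> [8, 16]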

Again, if you look at WD:L on the consoles, they do not use the high-res texture pack the PC does, yet with that pack in use and absolutely maxed-out settings you can squeeze WD:L inside 10GB on the 3080. So memory isn't the issue on the consoles; it's other factors that mean they can't realistically use 10GB of VRAM.

Also, people are still bringing up the 6GB of "slow" memory like it is literally unusable for graphics. Yeah, there is no way GDDR6 running at 336GB/s can be used for storing graphics assets. :rolleyes:
Oh wait, what's that? The 6700 XT runs at 384GB/s. Literally unusable, what was AMD thinking?
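
For anyone who wants to check those figures, they fall straight out of bus width times per-pin data rate; here's a quick sketch of the arithmetic (the bus widths and data rates are the publicly quoted ones, so treat them as approximate):

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # GB/s = (bus width in bytes) * (per-pin data rate in Gbps)
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 14))   # Series X "slow" pool: 192-bit at 14 Gbps -> 336.0
print(bandwidth_gb_s(192, 16))   # 6700 XT: 192-bit at 16 Gbps              -> 384.0
print(bandwidth_gb_s(320, 19))   # 3080 for comparison: 320-bit at 19 Gbps  -> 760.0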

AMD deliberately made a trade-off with their GPU design and created "Infinity Cache", which trades away transistors that could be used for doing calculations on the GPU itself and instead spends them on a much larger amount of local high-speed cache. The upshot of this is that more data is kept on die and the bandwidth requirement to VRAM is lower. This allows them to target cheaper GDDR6, but at the expense of die space that could be spent on, say, RT cores or something else.

The Xbox made that speed trade-off for probably the same reason: slower memory is cheaper memory, and if a large chunk of the pool is being used for more system-RAM-like purposes then the whole pool doesn't need to be that fast. The only reason I mention this is that it re-confirms the calculations most of us had done about what share of the memory is realistically going to be used by which parts of the system, and how much you can realistically consider the equivalent of VRAM.
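
For what it's worth, here's the back-of-the-envelope version of that calculation using the publicly quoted Series X figures; the OS reserve is the number MS have given, while the CPU-side share is purely my assumption, which is exactly the uncertain bit:

# Rough estimate of how much of the Series X's unified 16GB is realistically
# usable like VRAM. The 2.5GB OS reserve is the publicly quoted figure; the
# CPU-side share (game code, audio, CPU data) is an assumption for illustration.
total_ram_gb  = 16.0
fast_pool_gb  = 10.0    # the 560 GB/s "GPU optimal" pool
os_reserve_gb = 2.5     # leaves ~13.5GB for the game
cpu_side_gb   = 3.0     # assumed CPU-side usage

available_to_game = total_ram_gb - os_reserve_gb
vram_like = available_to_game - cpu_side_gb
print(f"~{available_to_game}GB for the game, of which ~{vram_like}GB "
      f"plausibly used like VRAM (vs the {fast_pool_gb}GB fast pool)")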

It really depends on a few more things and should be evaluated on a game-by-game (arguably model-by-model) basis. You can't make any blanket statements on it. At least you didn't say 4K textures should only be used when gaming at 4K.

Sure, you can get a more detailed comparison going game by game, but all 3D rendering suffers from the same general limitation: there's no use expanding texture detail beyond what can actually be resolved at the resolution you're rendering at. At some point higher-res textures always become pointless. Because the consoles can't really run next-gen games like WD:L at high resolutions, and with AMD's FSR are less and less likely to in future, they're going to be more limited in the texture detail they can see benefit from, especially compared to something like native 4K on the PC.
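
A crude way to see the "texture detail beyond your output resolution stops paying off" point is just to compare texel counts with the pixels available to show them; this ignores tiling, mip streaming and close-up inspection, so it's only a rough intuition:

# Compare how many texels a single texture holds with how many pixels the
# screen can actually display. Purely illustrative; real engines stream mip
# levels and a texture rarely fills the whole screen.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
textures    = {"2K texture": 2048 ** 2, "4K texture": 4096 ** 2}

for tex_name, texels in textures.items():
    for res_name, (w, h) in resolutions.items():
        ratio = texels / (w * h)
        print(f"{tex_name} vs {res_name}: {ratio:.1f} texels per screen pixel, "
              f"even if it filled the entire screen")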

You sprinkled in a bit of truth but interlaced it with falsehood. The hypothesis was that VRAM requirements for games would grow because the consoles have more VRAM. It was mentioned that at some point the 3080 would not be able to run better textures than the consoles because it doesn't have the VRAM to store them.

We are a year in, with video games still catering to the previous generation of consoles, and the last few games already seem to be growing in requirements. Unless you have proof that these new consoles have peaked and won't be getting any better?

Yes, it will grow (from the previous generation), but most people here who have discussed the consoles seem to have converged on the same expectation, which is that they won't have access to much more than about 10GB of VRAM, and again the MS architecture and the memory speeds seem to fit nicely with this estimate.

Whether I'm right in my additional hypothesis, that the consoles likely can't make good use of 10GB for VRAM purposes, remains to be seen; ideally we need tools to measure memory usage on the consoles. But if we take a modern game targeted at the next-gen systems (one making use of RT features and pushing the best-in-class PC dGPUs to their limits), a game like WD:L, then so far I've been vindicated. A 3080 can run it more or less maxed out at 4K with the high-res texture pack. The consoles have massively cut-down visual effects (as outlined by DF), lower screen resolution and lower-quality textures, despite all of that memory, which is something we don't really have an explanation for if the consoles can handle it, given the assets already exist. What's the alternative theory that would explain this?
 
The *GPU itself* "is not enough" to max my driving sims in VR.

Why is there no thread for *that* "problem"?

Why is it presumably okay for the GPU to not deliver max settings but a gaming apocalypse if the vram buffer doesn't deliver max settings?

Is the goal max settings or not?
 
I've been testing your tip about changing the texture setting from High to Ultra on my GTX 780 3GB with different settings, and it seems like 3072MB is not enough for both RDR2 with Ultra textures AND Win10.

Is anyone here defending the 780 3GB? That was a classic case of the architecture limiting memory configs to multiples of 3, so it was either 3GB, which probably wasn't going to be enough long term, or 6GB, which was probably way too much long term. It was so close to the line that in the end we saw both 3GB and 6GB variants of the 780. So it shouldn't surprise anyone to see the 3GB variant suffering all this time later. The PS4 had a total of 8GB, about 6GB of which was realistically available to be used as VRAM, so it's no surprise to see it beat a 3GB card on texture quality. That in no way reflects the situation with the PS5 and the 3080.

Been following this thread for what seems like months, and just want to say that PrincessFrosty has nailed this. The RAM is fine for the GPU power. What is it enough FOR? That is the question. Is it running RDR2 at 4K at 144Hz? No. Are any of the AMD cards with 16GB doing it? No. Fair enough, is the 3090 doing it with 24GB? No. The RAM is not the limit.

Yes, you can put more in, but it doesn't seem that it's actually needed yet. Will it be enough in the future? Well no, of course it won't. How long will it last? How long has a top GPU ever lasted? Who knows, but the number of people who couldn't get a latest-gen GPU and are saying they are fine with their 980 etc. suggests that games still work for a long time yet.

Would it be better if they had put in 16GB, charged more, and you received no benefit? AMD have put more RAM in, but it is slower RAM. Is the speed of the RAM holding it back? It doesn't seem to. At what point do we say "Actually, that's the right amount and a reasonable balance between price and performance", assuming RRPs were reasonable?

Lower RAM speed means lower memory bandwidth; to balance this, AMD invented Infinity Cache to keep more data locally on the die and require fewer transfers to/from VRAM. One possible downside of this is their performance at 4K: they do very well against their Nvidia counterparts at 1080p especially, and OK at 1440p, but at 4K there's a noticeable gap, which to me suggests Infinity Cache isn't quite able to compensate for the high memory bandwidth requirements of 4K. Infinity Cache also means fewer transistors on the die to spend on other things, which is partly why they have less silicon dedicated to RT and why RT performance is generally worse. So there are trade-offs, as you suggest.
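
One way to picture why a big on-die cache compensates less well as resolution rises is a simple hit-rate model: only cache misses touch VRAM, so the bandwidth the GPU effectively "sees" scales with the miss rate. The hit rates below are made-up illustrations (a bigger working set at 4K means a lower hit rate), not AMD's figures:

# Toy model: if a fraction `hit_rate` of requests are served from the on-die
# cache, only the misses consume VRAM bandwidth, so the effective bandwidth
# the GPU experiences is roughly dram_bw / (1 - hit_rate).
def effective_bandwidth(dram_gb_s, hit_rate):
    return dram_gb_s / (1.0 - hit_rate)

dram_bw = 512.0  # e.g. a 256-bit bus of 16 Gbps GDDR6
for res, hit in [("1080p", 0.70), ("1440p", 0.60), ("4K", 0.45)]:
    print(f"{res}: assumed hit rate {hit:.0%} -> "
          f"~{effective_bandwidth(dram_bw, hit):.0f} GB/s effective")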

AMD clearly suffered the same problem as Nvidia did: they were limited to 8GB or 16GB configs, knew 8GB would be too little, and so had to run with 16GB, which is almost certainly too much. From their point of view, if they can't spend that VRAM on something useful then all they did was make their card more expensive, which is not ideal. But it comes down to the same fundamental problem: you can't just pick any old amount of VRAM. Individual memory modules just aren't fast enough to keep up with these GPUs on their own; you have to write to lots of modules in parallel to get the total memory bandwidth up, which forces you to pick multiples of modules, and those modules are only made in certain sizes. In turn this leads to configs that can be way less, or way more, than what is ideal for the GPU.
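
Same point from the bandwidth side: pick a target bandwidth for the GPU, divide by what one module can deliver, and the module count (and therefore the capacity steps) falls out. The target figure and per-pin speed below are ballpark assumptions for illustration, not vendor specs:

import math

# Sketch: the module count is driven by the bandwidth the GPU needs, and
# capacity then comes in steps of module_count * module_density.
def config_options(target_bw_gb_s, gbps_per_pin=19, pins_per_chip=32,
                   densities_gb=(1, 2)):
    per_chip_bw = pins_per_chip / 8 * gbps_per_pin       # GB/s per module
    chips = math.ceil(target_bw_gb_s / per_chip_bw)
    return chips, [chips * d for d in densities_gb]

chips, sizes = config_options(target_bw_gb_s=750)        # roughly 3080-class demand
print(f"{chips} modules -> possible capacities: {sizes} GB")   # 10 -> [10, 20]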

All parties suffer this same problem: Nvidia, AMD and the APUs used in the consoles. Some generations of tech work out better than others for how neatly the GPU and memory fit together; sometimes they don't, and you're left with an awkward problem, like Nvidia and the 780, where in the end the decision was just passed to the end user to pick between two models. If the 3080 had been a lot closer to the line of an unacceptably low amount of VRAM then you'd have seen a similar choice between, say, a 10GB and a 20GB variant, but that never materialized.
 
The *GPU itself* "is not enough" to max my driving sims in VR.

Why is there no thread for *that* "problem"?

Why is it presumably okay for the GPU to not deliver max settings but a gaming apocalypse if the vram buffer doesn't deliver max settings?

Is the goal max settings or not?
Basically it fits the narrative some want to push :D

Many years down the line they will sing and dance "see, we was right", but by then we will be on 4000 or 5000 series GPUs :cry:
 
I'm just going to chime in and say that I have a 3070 rig with a 3900X and a 2080 Ti rig with a 3600X, and both basically play everything pretty well maxed with no issue at 4K. Honestly, I think this thread is full of people who go "well, I want to turn on the pointless performance killer because I spent 4 figs on a graphics card". And I understand this; I once bought SLI 7950s for some insane sum and was gutted that I couldn't get Painkiller running at whatever was the dog's balls settings of the time. But it was pointless! I ended up downgrading to an Xbox 360 and was happy for years. It's all meaningless, guys. No one cares about a £1000 graphics card; it doesn't entitle you to anything. Might as well be sticking a spoiler on your 12-year-old Mondeo ST TDCi and complaining that you can't lap the Nurburgring in a sensible time. Face it, you overspent on FOMO hype. Enjoy the games or sell it and profit. Take it from an increasingly old man that this ragey dialogue will get you no happiness.
 
Basically it fits the narrative some want to push :D

Many years down the line they will sing and dance "see, we was right", but by then we will be on 4000 or 5000 series GPUs :cry:

I have to turn down settings *now*. I don't have to wait for the 4000 series. The GPU itself "is not enough"....now. I didn't think it would be enough, and it's still a great improvement over my 1080Ti, so I'm still happy.

Some day the vram buffer will not be enough, but *today*...right now...the GPU itself is already "not enough", so I don't get the obsession with vram.
 
I have to turn down settings *now*. I don't have to wait for the 4000 series. The GPU itself "is not enough"....now. I didn't think it would be enough, and it's still a great improvement over my 1080Ti, so I'm still happy.

Some day the vram buffer will not be enough, but *today*...right now...the GPU itself is already "not enough", so I don't get the obsession with vram.
Yeah, I get that and I completely agree. But I was referring to VRAM running out. As you can see, there are quite a few people panting, waiting for the game to come out (ideally one worth playing) so they can say "I told you so, 10GB is not enough", blah blah blah :p

We are likely only a year away now from next-gen cards. I really can't see anything changing, certainly not for me anyway; the only taxing game on my radar until then is Dying Light 2. Maybe Bloodlines 2 also, if it does not get delayed again.
 
Yeah, I get that and I completely agree. But I was referring to VRAM running out. As you can see, there are quite a few people panting, waiting for the game to come out (ideally one worth playing) so they can say "I told you so, 10GB is not enough", blah blah blah :p

We are likely only a year away now from next-gen cards. I really can't see anything changing, certainly not for me anyway; the only taxing game on my radar until then is Dying Light 2. Maybe Bloodlines 2 also, if it does not get delayed again.

Dying Light 2 is in delay hell; the RTX 5090 will probably be out by then. Never buy a graphics card for a future game you're planning to play; wait until the game is out and then get a GPU.
 
Dying Light 2 is in delay hell; the RTX 5090 will probably be out by then. Never buy a graphics card for a future game you're planning to play; wait until the game is out and then get a GPU.
I upgrade pretty much every gen, actually multiple times per gen. That is one of the reasons this "10GB is not enough" malarkey did not bother me.

If that was the goal then there would be no "Graphics Settings" section in any game; they're there to be turned down, not up.

Anyone who struggles with that concept is confusing PC gaming with console gaming. :)
There are loads of people who chase it, though; many do not even understand that max settings can mean worse image quality than actually making use of those settings and turning off crap like motion blur, depth of field, etc.
 
I upgrade pretty much every gen, actually multiple times per gen. That is one of the reasons this "10GB is not enough" malarkey did not bother me.


There are loads of people who chase it, though; many do not even understand that max settings can mean worse image quality than actually making use of those settings and turning off crap like motion blur, depth of field, etc.

The problem is most gamers just want to max out the settings and start playing.
 
The problem is most gamers just want to max out the settings and start playing.
Yeah. But at the end of the day, if one knew what's what, they would want max image quality; that is one of the biggest reasons we upgrade, after all. While setting things to max, instead of turning crap like motion blur etc. on, you leave it off. It's not like that would take any longer.
 