10GB VRAM enough for the 3080? Discuss..

UE5 seems to be doing quite well with its VRAM usage, and that's before optimisation. What do you see using more VRAM in the next 1-2 years?

What do you consider the average amount of VRAM to be today, and why would it explode in the next 1-2 years? There are a lot of people still using 1060-level hardware according to Steam's hardware survey.

I do expect to upgrade to Lovelace/RDNA3 due to a lack of GPU grunt, mostly centred around ray tracing. That also means 3090 owners will have to upgrade, which is why I settled for a £720 3080 ;)

Sadly, RDNA2 cards are already lacking today due to the budgets set by the consoles. Lumen running in software looks good, but it can't touch hardware ray tracing. And of course there is Tensor-backed DLSS, which RDNA2 has still to compete with as we approach 8 months since launch.

UE5 seems to be doing quite well with its VRAM usage, and that's before optimisation. What do you see using more VRAM in the next 1-2 years?
Maybe ray-traced lighting and shadows at 4K? You know, the thing you've been wetting yourself about?

What do you consider the average amount of VRAM to be today, and why would it explode in the next 1-2 years? There are a lot of people still using 1060-level hardware according to Steam's hardware survey.
Guess we are at around 8GB since the 3070 tanks in some games at 4K. It might "explode" if we see some more games with real ray tracing and no crippling memory optimizations that make people disappear in front of you.

I do expect to upgrade to Lovelace/RDNA3 due to a lack of GPU grunt, mostly centred around ray tracing. That also means 3090 owners will have to upgrade, which is why I settled for a £720 3080 ;)
GPUs can run out of grunt for various reasons. I hope you've taken care of that Ivy Bridge system before you waste any more money on next-next-gen GPUs. ;)

Sadly, RDNA2 cards are already lacking today due to the budgets set by the consoles. Lumen running in software looks good, but it can't touch hardware ray tracing. And of course there is Tensor-backed DLSS, which RDNA2 has still to compete with as we approach 8 months since launch.

Sadly, Nvidia also placed console-like budgets/limitations on their new GPUs by choosing to match the VRAM on their flagship with the VRAM on the new consoles, even though you've tried to spin that as a positive and a reason why games won't use more than 10GB. (If you want to be limited to budget console settings on your 3080 then it sure is a positive. :cry:)

I really would have thought someone like yourself would be delighted for devs to move on from these LEGACY games and make a true NEXT GEN experience with no LIMITATIONS :cool:

Don't worry though, DLSS will save you until Nvidia decides it's time for you to upgrade and the performance benefits start to decrease. :(
 
The same things were said about the 1060. Everyone argued that realistically it would never make use of more than 4 GB. Same for the 8 GB RX 580.

They still perform well enough for a lot of people, but most importantly their VRAM turned out to be at a sweet spot (unlike what people believed back then). A 1060's 6 GB will be maxed out by pretty much every AAA game as of 2019-2021, even with everything set to medium. With both GPUs you can push Ultra textures regardless of other settings and still have good-looking games. Instead, you could have bought the 4 GB RX 580 variant or the 3 GB 1060 variant on the reasoning that "these chips will never make use of 6-8 GB anyway", and ended up making huge sacrifices on texture quality just to keep games barely playable with inconsistent frametimes, since games require 5.5+ GB of VRAM even at 1080p as of 2020.

Funny thing is, these RTX GPUs have DLSS in their arsenal. You can have an RTX 3090 store 8K-16K textures, render at 1440p or 4K, upscale with DLSS and have gorgeous graphics. So yes, any RTX GPU can make use of 16 GB. Any GPU can always make use of more VRAM. This was the case for the GTX 770: look how well the 4 GB 770 holds up and how much worse the 2 GB 770 fares. Do you think Nvidia really believed 2 GB was all a 770 could make use of?
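
For a rough sense of scale, here's the kind of back-of-the-envelope maths I mean (my own illustrative numbers, assuming BC7 block compression at about 1 byte per texel and roughly a third extra for the mip chain; real engines and formats vary):

```python
# Back-of-the-envelope texture memory, assuming BC7 (~1 byte/texel)
# and ~33% extra for the full mip chain. Real engines/formats vary.
def texture_mb(resolution, bytes_per_texel=1.0, mip_overhead=1.333):
    return resolution * resolution * bytes_per_texel * mip_overhead / (1024 ** 2)

for res in (2048, 4096, 8192, 16384):
    print(f"{res}x{res} BC7 + mips ≈ {texture_mb(res):.0f} MB")

# 2048x2048 ≈ 5 MB, 4096x4096 ≈ 21 MB,
# 8192x8192 ≈ 85 MB, 16384x16384 ≈ 341 MB
```

So a handful of genuine 8K-16K hero textures eats hundreds of MB on their own, which is exactly why only the higher-VRAM cards would get to keep them resident.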

As I've said countless times, there's no game as of now that ships HIGH quality, generation-defining texture packs. But there will be, because some developers will want to make USE of higher-VRAM GPUs. And if those packs turn out to truly change the landscape of a game's graphical fidelity, you will be phased out of them not because of your chip's power, but because of VRAM.

An RTX 3070 has the grunt to push 4K at 70 fps with RT enabled in RE: Village, yet it can't; instead it drops into the 40s. This alone proves that 8 GB is not a good "amount" for the power level of a 3070/2080 Ti. 10 GB would be better, but 12 GB would be ideal. Same goes for the 3080: it should simply have been 16 GB.

Yeah, there are a lot of conditions attached to what is being said, because so much of it is subjective. We talk about complex systems in simple ways all the time without really realizing it. We say things like: is a card "4K capable", or does it have "acceptable performance", and there's a huge amount of subjectivity in there. So of course, when deciding the appropriate memory for a card, you're subject to all of those problems. You also suffer from the problem I've already mentioned, which is that you're typically limited to a handful of memory configurations for your card, and sometimes what you think is the sweet spot isn't available, so you have to make a trade-off and pick the closest approximation.

In cases like the GTX 1060, for example, they ended up with two models, a 3GB and a 6GB variant (which is quite rare), with the 6GB variant being more expensive. It's likely that the "ideal" memory for such a card was right between those values, making the decision a really difficult one, and in the end just making both variants and letting consumers decide the trade-off made the most sense. OK, so the 3GB variant ran into memory restriction problems, as you'd probably expect if the ideal was something closer to, let's say, 4.5GB. But the 6GB variant cost more, and if you're budget-constrained, as most people are, it's not necessarily clear which is better; at worst it's a trade-off.

I think I've addressed the idea of high-res textures much further back in the thread. My general response is that the reason we have this disconnect/argument on the forum is that many of us are old-school and watched VRAM grow with texture quality/size, and we still have that hangover expectation. For a very long time the paradigm in gaming was to load a "level", which meant throwing all of the level's assets into VRAM; when you finished the level you unloaded those assets and swapped in the next lot. Either the assets fit and things were fine, or they didn't, you got a "cache miss", and you'd stutter as the missing texture was fetched from disk. But around the era of the first Crysis game, or maybe even a bit before, we moved to texture streaming, the zoning of open worlds with textures streaming in and out as necessary, and a more aggressive system of LOD as we moved towards truly open-world games. High-res textures are nice, but it's a total waste having one in memory at full resolution if it's on a signpost a mile down the road covering 4 pixels on screen. You don't need to stuff the entire high-res texture pack into VRAM; modern game engines have texture pools that they dynamically stream in and out of continually as you move through the game space, and the LOD system takes care of that for you.
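
To make that concrete, the pool idea looks very roughly like this (a simplified sketch with made-up names, not any particular engine's API): each texture's resident mip level follows how big the texture appears on screen, and detail gets dropped when the pool hits its VRAM budget.

```python
# Simplified, engine-agnostic sketch of a streaming texture pool.
from dataclasses import dataclass

@dataclass
class StreamedTexture:
    name: str
    full_res: int              # e.g. 4096 for a "4K" texture
    resident_mip: int = 10     # start with only a tiny mip resident
    wanted_mip: int = 10

    def size_mb(self, mip: int) -> float:
        res = max(self.full_res >> mip, 1)
        return res * res / (1024 ** 2)   # ~1 byte/texel (BC7-ish), mip chain ignored

class TexturePool:
    def __init__(self, budget_mb: float):
        self.budget_mb = budget_mb
        self.textures: list[StreamedTexture] = []

    def used_mb(self) -> float:
        return sum(t.size_mb(t.resident_mip) for t in self.textures)

    def request(self, tex: StreamedTexture, screen_pixels: int):
        # Pick the mip whose resolution roughly matches on-screen coverage:
        # a texture covering 4 pixels never needs its full-res mip resident.
        res, mip = tex.full_res, 0
        while res * res > screen_pixels * 4 and mip < 10:
            res >>= 1
            mip += 1
        tex.wanted_mip = mip
        if tex not in self.textures:
            self.textures.append(tex)

    def update(self):
        # Stream each texture towards its wanted mip, then drop detail from
        # the biggest users until the pool fits its budget again.
        for t in self.textures:
            t.resident_mip = t.wanted_mip
        while self.used_mb() > self.budget_mb:
            biggest = max(self.textures, key=lambda t: t.size_mb(t.resident_mip))
            if biggest.resident_mip >= 10:
                break                      # nothing left to evict
            biggest.resident_mip += 1      # drop one mip level

pool = TexturePool(budget_mb=24)
sign = StreamedTexture("road_sign", 4096)
wall = StreamedTexture("wall", 4096)
pool.request(sign, screen_pixels=4)          # tiny on screen -> tiny mip
pool.request(wall, screen_pixels=2_000_000)  # fills the screen -> high mip
pool.update()
print(f"pool usage ≈ {pool.used_mb():.2f} MB of {pool.budget_mb} MB budget")
```

The point being: two "4K textures" in the scene doesn't mean two full 4K textures in VRAM, it means whatever mips the pool decides you actually need right now.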

It's interesting because this concept was shown off as part of Unreal Engine 5's early access video here https://www.youtube.com/watch?v=d1ZnM7CH-v4. Around 5:27 they talk about world partitioning, doing the same kind of optimization for their new world geometry now that they have these insane Nanite libraries of very highly detailed models. You only keep what you need on screen, at the level of detail you need it; most of those high-res models and textures only exist in VRAM as low-quality variants of the original. This kind of thing is what allowed games to go open world and eliminate a lot of loading screens, and it's also what allowed game install sizes, a lot of which is textures, to balloon way up.

And the paradigm is going to change again soon; that's where I'd put my money. DirectStorage, gamers moving to PCIe 4.0 with disks doing up to 8GB/sec sequential reads, and parallel GPU decompression of textures avoiding the CPU bottleneck are going to fundamentally change how games load assets. Asset streaming is going to be so rapid that I think traditional loading screens will basically go away entirely, and in-game zone transitions will take a few seconds rather than up to a minute. LODs will be able to be a lot more aggressive because we'll be able to sustain streaming the higher-detailed stuff in and out far more rapidly.
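
Just to illustrate why that's such a big deal, here's a back-of-the-envelope comparison (illustrative numbers only: a ~7GB/s PCIe 4.0 NVMe drive and roughly 2:1 GPU texture decompression, versus an older ~0.5GB/s SATA SSD path):

```python
# Rough streaming-time comparison (illustrative numbers only):
# how long it takes to (re)fill a working set of assets from disk.
def fill_time_s(asset_gb: float, raw_gbps: float, decompress_ratio: float = 1.0):
    # decompress_ratio > 1 means compressed on disk, expanded on the GPU,
    # so the effective fill rate into VRAM is raw bandwidth * ratio.
    return asset_gb / (raw_gbps * decompress_ratio)

working_set_gb = 8  # e.g. refilling most of a 10GB card's VRAM on a zone change

print(f"SATA SSD, CPU path       : {fill_time_s(working_set_gb, 0.5):.1f} s")
print(f"PCIe 4.0 NVMe, raw       : {fill_time_s(working_set_gb, 7.0):.1f} s")
print(f"PCIe 4.0 NVMe + ~2:1 GPU decompression: "
      f"{fill_time_s(working_set_gb, 7.0, 2.0):.1f} s")
# ~16 s  vs  ~1.1 s  vs  ~0.6 s
```

When refilling the whole working set takes well under a second rather than tens of seconds, holding huge amounts "just in case" in VRAM matters a lot less.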
 
I think I've addressed the idea of high-res textures much further back in the thread. My general response is that the reason we have this disconnect/argument on the forum is that many of us are old-school and watched VRAM grow with texture quality/size, and we still have that hangover expectation. For a very long time the paradigm in gaming was to load a "level", which meant throwing all of the level's assets into VRAM; when you finished the level you unloaded those assets and swapped in the next lot. Either the assets fit and things were fine, or they didn't, you got a "cache miss", and you'd stutter as the missing texture was fetched from disk. But around the era of the first Crysis game, or maybe even a bit before, we moved to texture streaming, the zoning of open worlds with textures streaming in and out as necessary, and a more aggressive system of LOD as we moved towards truly open-world games. High-res textures are nice, but it's a total waste having one in memory at full resolution if it's on a signpost a mile down the road covering 4 pixels on screen. You don't need to stuff the entire high-res texture pack into VRAM; modern game engines have texture pools that they dynamically stream in and out of continually as you move through the game space, and the LOD system takes care of that for you.

Nice point on the 1060. I picked up the 6GB at the time for that reason (longevity - not that it was a sensible metric for PCs, but I also mined ETH, which doubled up the benefit).

I highlighted this part, but then your last paragraph elaborated on what I was going to point out. DirectStorage and faster drives in general (M.2/SSD) are what we didn't have back in the old-school VRAM arguments, and they enable techniques, habits and dev shortcuts that totally bypass what was the norm 5-10 years back. It's certainly an interesting area to watch - I'm hoping they don't take ages to implement it, like some other features we hear about, only for it to turn out to be a damp squib.
 
Maybe ray-traced lighting and shadows at 4K? You know, the thing you've been wetting yourself about?

While everyone else wets themselves about legacy rasterisation :rolleyes: 4K RT would take a lot more GPU grunt than we have today, so it's not restricted by VRAM.

Guess we are at around 8GB since the 3070 tanks in some games at 4K. It might "explode" if we see some more games with real ray tracing and no crippling memory optimizations that make people disappear in front of you.

Does that happen in all RT games, or are you referring to the scripting in CP2077? I'm sure that was done due to the lack of system RAM on consoles, as NPC instancing would be CPU-scripted.

GPUs can run out of grunt for various reasons. I hope you've taken care of that Ivy Bridge system before you waste any more money on next-next-gen GPUs. ;)

As I said originally, forgetting about GPU grunt, why wouldn't 10GB of VRAM be enough? BTW, I'm still using the 3770K today. Was £720 really a waste of money on a 3080?

Sadly, Nvidia also placed console-like budgets/limitations on their new GPUs by choosing to match the VRAM on their flagship with the VRAM on the new consoles, even though you've tried to spin that as a positive and a reason why games won't use more than 10GB. (If you want to be limited to budget console settings on your 3080 then it sure is a positive. :cry:)

I really would have thought someone like yourself would be delighted for devs to move on from these LEGACY games and make a true NEXT GEN experience with no LIMITATIONS :cool:

Don't worry though, DLSS will save you until Nvidia decides it's time for you to upgrade and the performance benefits start to decrease. :(

I've already said that I'm happy to buy any vendor's card as long as it does what I'm looking for. This fanboy nonsense doesn't make any sense, especially when I've already said I bought a new card for RT, and no one so far is denying AMD's cards are way behind in this feature, never mind DLSS.

Again no valid reason why games would suddenly require more than 10GB of VRAM :rolleyes:
 
Different strokes for different folks. I like pretty graphics more than performance, partially because these days I play all my games with a controller, and extra frames above 60fps make little difference to the experience unless you're using a mouse, so I'd rather put the GPU grunt into something more tangible for me, like graphics.

So for reference, I now play most games with the framerate capped to 60fps, the TV set to 120Hz refresh and G-Sync enabled - that, for me, is the best balance. I don't "feel" any improvement from going over 60fps with the above settings, so I don't turn down settings to run at, say, 80fps or 90fps.
Even when playing on a controller (which I do in some games) you get better clarity in motion with higher FPS, so to me it's like playing at a higher resolution.
And yes, that applies to OLED screens as well, which are my preferred gaming screens.
But, as you said, different strokes for different folks.
 
Depends on what resolution you're playing at. I recall Jensen's showcase at launch, where a handful of gaming peeps were all playing at 8K and milking the hype. We don't have any titles yet that present the Achilles heel you seek, although my guess is it's coming. It needs to be a game that doesn't favour AMD but also isn't sponsored by Nvidia, so neither side can rig the results. One thing is for sure: if you're playing at 1080p or 1440p, it's not going to present itself.

You touch a little on my reason for asking: resolution. I don't see a sudden explosion in resolution usage, so no need for higher-resolution textures, while at the same time companies realise the challenge of delivering larger packages. Even with the GPU grunt we have today, a 3090 isn't really enough for 4K/60 RT, so DLSS is going to be used in that situation, reducing the source resolution and the VRAM requirements.
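
To put some very rough numbers on that last point (an illustrative estimate only, assuming something like 40 bytes per pixel spread across G-buffer, depth, HDR and post-processing targets; real engines differ):

```python
# Illustrative render-target memory at different internal resolutions,
# assuming ~40 bytes/pixel across G-buffer, depth, HDR and post buffers.
# (The 4K output buffers still exist with DLSS; the savings are on the
# internal render targets, which scale with the source resolution.)
BYTES_PER_PIXEL = 40

def rt_mem_mb(width: int, height: int) -> float:
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

print(f"Native 4K (3840x2160)        : {rt_mem_mb(3840, 2160):.0f} MB")
print(f"DLSS Quality (2560x1440)     : {rt_mem_mb(2560, 1440):.0f} MB")
print(f"DLSS Performance (1920x1080) : {rt_mem_mb(1920, 1080):.0f} MB")
# ≈ 316 MB vs 141 MB vs 79 MB of internal render targets
```

So rendering internally at 1440p instead of native 4K trims a decent chunk off the render-target side of the VRAM bill as well as the GPU load.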
 
I don't think I play any game at max settings on my 3080, as to me more frames are more noticeable than a slight increase in graphics quality.
I don't mind dropping texture detail by one notch if needed.
Again, I would rather have more performance than more VRAM.
I've never, ever, ever seen a real example where turning textures down "one notch" actually helped a VRAM-starved card...

Even going LOWEST OF THE LOWEST possible textures in RE: Village doesn't stop the RTX 3070 from dropping frames at 4K with RT enabled.

Texture settings are mostly placebo in modern games
 
Even with the GPU grunt we have today, a 3090 isn't really enough for 4K/60 RT, so DLSS is going to be used in that situation, reducing the source resolution and the VRAM requirements.

I will pass this on to Nvidia really, as they started this with the 2080 Ti from memory, and we are, what, nearly three years on from that release. As I said, didn't Jensen bring up 8K gaming on his soapbox? That is surely where the 3080 will tank (not that we need 8K gaming, but you asked "what do you see using more VRAM in the next 1-2 years").

I think you're coming across as justifying why the 3080 was the better buy than the 3090, but that's subjective.
 
Again no valid reason why games would suddenly require more than 10GB of VRAM :rolleyes:

But but but..... Godfall!!!!!! :cry: And yup, correct on the Cyberpunk NPC bit, nothing to do with VRAM..... Upgrading to a 5600X is what allowed me to whack NPC crowd density from medium to high, not upgrading from a Vega 56 to a 3080....

Speaking of Godfall, I had a look at some videos earlier and the FPS looked fine to me, unless they weren't using the correct settings??? Or is this another case where one random guy or one review site had weird issues/results and certain individuals have taken it as gospel that 10GB of VRAM isn't enough..... ;) :p

Will be interesting to see how much VRAM Dying Light 2 needs....
 
I will pass this on to Nvidia really, as they started this with the 2080 Ti from memory, and we are, what, nearly three years on from that release. As I said, didn't Jensen bring up 8K gaming on his soapbox? That is surely where the 3080 will tank (not that we need 8K gaming, but you asked "what do you see using more VRAM in the next 1-2 years").

I think you're coming across as justifying why the 3080 was the better buy than the 3090, but that's subjective.

No, I was just emphasising the need for GPU grunt over VRAM. I think Jensen's 8K push was an attempt to sell DLSS; otherwise he must have been inhaling when speaking to Musk.
 
No, I was just emphasising the need for GPU grunt over VRAM. I think Jensen's 8K push was an attempt to sell DLSS; otherwise he must have been inhaling when speaking to Musk.

I agree. I just remember he made a whole segment about it, yet going back to 4K we still have no hardware that can blitz RT there, so he seems to be hoodwinking us in some way. Having lots of VRAM is handy if you are not just gaming; it depends on your uses.
 
But but but..... Godfall!!!!!! :cry: And yup, correct on the Cyberpunk NPC bit, nothing to do with VRAM..... Upgrading to a 5600X is what allowed me to whack NPC crowd density from medium to high, not upgrading from a Vega 56 to a 3080....

Speaking of Godfall, I had a look at some videos earlier and the FPS looked fine to me, unless they weren't using the correct settings??? Or is this another case where one random guy or one review site had weird issues/results and certain individuals have taken it as gospel that 10GB of VRAM isn't enough..... ;) :p

Will be interesting to see how much VRAM Dying Light 2 needs....
I found that game in one of the PS5 consoles I was repairing, so I gave it a go. It was cool for the first 5 minutes but turned out to be the most boring game of 2021.
Honestly, no one should care about this game.
 
But but but..... Godfall!!!!!! :cry: And yup, correct on the Cyberpunk NPC bit, nothing to do with VRAM..... Upgrading to a 5600X is what allowed me to whack NPC crowd density from medium to high, not upgrading from a Vega 56 to a 3080....

Speaking of Godfall, I had a look at some videos earlier and the FPS looked fine to me, unless they weren't using the correct settings??? Or is this another case where one random guy or one review site had weird issues/results and certain individuals have taken it as gospel that 10GB of VRAM isn't enough..... ;) :p

Will be interesting to see how much VRAM Dying Light 2 needs....
Look at the performance hit the 3070/3080 take with RT on at 1080p vs 1440p and then 4K, in comparison to the 3090 and the 16GB 6000 series. Minimums take a big hit compared to GPUs with more memory. Better prep those excuses: “but muh Godfall”. :D
 
Look at the performance hit the 3070/3080 take with RT on at 1080p vs 1440p and then 4K, in comparison to the 3090 and the 16GB 6000 series. Minimums take a big hit compared to GPUs with more memory. Better prep those excuses: “but muh Godfall”. :D

So why is the 16GB 6900 XT so far behind the 10GB 3080 in Cyberpunk? So much so that it should have you turning green. :D
 
So why is the 16GB 6900 XT so far behind the 10GB 3080 in Cyberpunk? So much so that it should have you turning green. :D

Cyberpunk's textures look mostly mediocre; no one can deny this. It's not a proper "benchmark" for future games.

On top of having mediocre textures, the game has critical LOD issues. Constant texture streaming can be seen, and it makes the game ugly in certain circumstances.

The game literally tries to load textures as you turn the camera. It doesn't try to keep more texture data resident, since it caters to Nvidia's most-used GPUs (2070s, 2080s, 3070s).

https://www.youtube.com/watch?v=NT6denvILfo

https://www.youtube.com/watch?v=0Al9YV3kvPs

https://www.youtube.com/watch?v=Mf1UGZxgBOE

It is clear that the game engine pulls all kinds of strings to "tame" VRAM usage. This is not a good way of handling the issue.
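
A toy example of what I'm describing (purely illustrative, nothing to do with Cyberpunk's actual engine): with a small texture budget, anything behind the camera gets evicted and has to be re-streamed the moment you turn around, while a larger budget keeps it resident.

```python
# Toy illustration of why a bigger texture cache hides camera-turn streaming.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget: int):
        self.budget = budget                  # how many textures fit in VRAM
        self.resident = OrderedDict()         # LRU order: oldest first
        self.stream_ins = 0                   # visible streaming events

    def need(self, tex: str):
        if tex in self.resident:
            self.resident.move_to_end(tex)    # still resident, no disk traffic
        else:
            self.stream_ins += 1              # would show up as pop-in/hitching
            self.resident[tex] = True
            while len(self.resident) > self.budget:
                self.resident.popitem(last=False)   # evict least recently seen

front = [f"front_{i}" for i in range(6)]
back = [f"back_{i}" for i in range(6)]
for budget in (6, 12):
    cache = TextureCache(budget)
    for view in (front, back, front, back):   # the player spins the camera
        for tex in view:
            cache.need(tex)
    print(f"budget {budget:2d} textures -> {cache.stream_ins} stream-ins")
# budget  6 -> 24 stream-ins (everything re-streams on every turn)
# budget 12 -> 12 stream-ins (only the first look at each set)
```

More VRAM effectively buys the engine the bigger cache, so it doesn't have to pull those strings in the first place.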
 
So why is the 16GB 6900 XT so far behind the 10GB 3080 in Cyberpunk? So much so that it should have you turning green. :D
That has nothing to do with my post above whatsoever, but they look pretty even to me?
Guess you need to crank up RT/DLSS for the 3080 to be faster.

Speaking of which, fancy comparing 6900 XT vs 3080 performance in the Time Spy, Fire Strike, Forza, Shadow of the Tomb Raider, and Valhalla OcUK bench threads?

Would love to see how fast your 3080 is in comparison at the same settings - for a bit of fun. ;)
 
That has nothing to do with my post above whatsoever, but they look pretty even to me?
Guess you need to crank up RT/DLSS for the 3080 to be faster.

Speaking of which, fancy comparing 6900 XT vs 3080 performance in the Time Spy, Fire Strike, Forza, Shadow of the Tomb Raider, and Valhalla OcUK bench threads?

Would love to see how fast your 3080 is in comparison at the same settings - for a bit of fun. ;)

Yup. Especially after seeing how higher resolution solves the grain issue of SSR, RT reflections lost their biggest selling point for me (being able to get rid of reflection noise). Apparently pushing a 1.5x resolution scale at 1080p, or 1.25x at 1440p, is mostly enough to clear up SSR grain... And in that respect, raster-monster GPUs (RDNA2) should have an easy time pushing high FPS.
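
For reference, here's what those resolution-scale figures work out to in raw pixels (assuming the scale applies per axis, as most in-game sliders do):

```python
# Quick pixel-count check on those resolution-scale figures
# (assuming the scale applies per axis, as most in-game sliders do).
def scaled(width, height, scale):
    w, h = int(width * scale), int(height * scale)
    return w, h, w * h / 1e6

for (w, h, s) in ((1920, 1080, 1.5), (2560, 1440, 1.25)):
    sw, sh, mp = scaled(w, h, s)
    print(f"{w}x{h} @ {s:.2f}x -> {sw}x{sh} ({mp:.1f} MP, "
          f"{mp / (w * h / 1e6):.2f}x the pixels)")
# 1920x1080 @ 1.50x -> 2880x1620 (4.7 MP, 2.25x the pixels)
# 2560x1440 @ 1.25x -> 3200x1800 (5.8 MP, 1.56x the pixels)
```

So clearing SSR grain that way still means pushing 1.5-2.25x the pixels, which is exactly where strong raster performance helps.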

The raster performance of the 3070 in Cyberpunk is really startling; I feel like the Series X can actually push more frames/resolution than the 3070. Ampere feels anemic when it comes to raster performance most of the time.
 
I found that game in one of the PS5 consoles I was repairing, so I gave it a go. It was cool for the first 5 minutes but turned out to be the most boring game of 2021.
Honestly, no one should care about this game.
The only person who seems to care about that game is LtMatt. Every time the "10GB is enough" topic comes up… :p


@TNA @Nexus18 Just a heads up, as I'm also looking forward to Dying Light 2: £32.99 from cdkeys.com for the Steam key pre-order.
Got mine from there ages ago. I am set :D
 