10GB VRAM enough for the 3080? Discuss...

I am starting to think you are opethdisciple's alt account. Going around in circles about the VRAM issue :p

Also, remember, AMD are working on Super Resolution, which will be kind of like DLSS. As I understand it, it will work on a lot more titles than DLSS but might not be quite as good, at least not initially. This is why it is best for you to wait for the AMD reviews before making a choice. That said, if you are opethdisciple then we all know you will not be going AMD :p:D

There is only one instance of me. :p
 
Well that's a load of balls. I'm running it at 4K with Balanced DLSS and Ultra RT and it's not even close to saturating the 10GB; it comes in at around 8GB. Nice try, but then again, an agenda is always an agenda.

Can you post a screenshot showing that, please?
 
Guys, I know it's allocation, but 9.5GB-10GB being allocated at 4K is kind of worrying lol.

I just have one question: do cross-gen titles define the requirements for next gen, or do I need to wait for true next-gen games for that?

Like, if Watch Dogs Legion is allocating that much, does that mean it's what I can expect of next-gen games in terms of VRAM usage?

I know the usage will vary a lot from game to game; I just want to know if next-gen games can also be expected to use a similar 8GB-10GB of VRAM at 4K. Sure, there can be one or two outliers that use 10GB+, but I mean in general...

Right now all I see for 4K is 6GB-8GB for current-gen games,
and 4GB-6GB at 1440p for current-gen games.
 
Nobody can tell you for sure. As has been stated numerous times in this thread, there are so many variables in play that determine VRAM usage from one frame to the next, let alone for an entire process, that predictions for future usage can't really be based on more than trend data. This is where future-proofing comes in, arguably, as a valid strategy: VRAM usage is probably not going to go down, so you're doing nothing bad by buying a card that has as much as you can afford. There's literally no downside to it, aside from the obvious extra cost involved.

Think of it this way: in 3 years' time, will you regret buying a 16GB card? Almost certainly not. Will you regret buying a 10GB card? Possibly. I doubt anyone who buys every single generation of card actually cares, as they simply upgrade out of the problem, but for the majority of gamers who keep their hardware for multiple generations, it's an important consideration.
 
I find this topic confusing. The 3080, with its 10GB of GDDR6X, has a memory bandwidth of 760GB/s. The 6900 XT, from what specs I can find, has 16GB of GDDR6 at 512GB/s. Are people saying that lower bandwidth is more future-proof because it comes with more gigabytes?
 
Once memory is full, it's full; it doesn't matter how fast it is.
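To put rough numbers on that, here's a quick back-of-envelope sketch of my own (assuming public spec-sheet bandwidths, the theoretical PCIe 4.0 x16 peak, and a hypothetical 2GB spill) of why running out of capacity hurts far more than the bandwidth gap between the two cards:

```python
# Back-of-envelope: why spilling out of VRAM hurts more than lower bandwidth.
# Rough public spec figures; the PCIe rate is the theoretical peak.
vram_bw_3080 = 760e9   # bytes/s, 10GB GDDR6X on the 3080
vram_bw_6900 = 512e9   # bytes/s, 16GB GDDR6 on the 6900 XT
pcie4_x16_bw = 32e9    # bytes/s, PCIe 4.0 x16 theoretical peak

spill = 2e9  # hypothetical: a frame needs 2GB that didn't fit in VRAM

# Time to move that 2GB from local VRAM vs. over the PCIe bus:
print(f"from VRAM @ 760GB/s: {spill / vram_bw_3080 * 1e3:.1f} ms")  # ~2.6 ms
print(f"from VRAM @ 512GB/s: {spill / vram_bw_6900 * 1e3:.1f} ms")  # ~3.9 ms
print(f"over PCIe 4.0 x16:   {spill / pcie4_x16_bw * 1e3:.1f} ms")  # ~62.5 ms

# A ~62ms stall is nearly four whole frames at 60fps; next to that, the
# 1.5x gap between 760GB/s and 512GB/s is a rounding error.
```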
 
Unfortunately they do not list their methodology for measuring VRAM, so we don't know whether they're measuring simply what is allocated or what is actually in use. The screenshots in the original article show MSI's OSD for memory, and if they've not renamed the OSD labels then it's just measuring whatever memory is allocated.

Again, the ability to measure VRAM actually in use is extremely new. It's not in the full release of Afterburner yet, only in a specific beta build that was released recently, so it's unlikely at this stage that any reviewer or hardware site testing VRAM usage is telling you what is actually needed to run the game rather than what is allocated.

I posed this question on their Q&A form, so we'll see if we get a response.
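For anyone who wants to poke at the distinction themselves, here's a minimal sketch using NVML via the nvidia-ml-py bindings. To be clear, this is not how the Afterburner beta measures it; it only illustrates the difference between device-wide allocation and what's charged to an individual process:

```python
# Minimal sketch: device-wide vs. per-process VRAM via NVML
# (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-wide view: everything allocated by every process, plus the driver.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"device used/total: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")

# Per-process view: memory charged to each graphics client individually.
# Note: usedGpuMemory can be None under Windows' WDDM driver model.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    shown = f"{used / 2**20:.0f} MiB" if used is not None else "n/a (WDDM)"
    print(f"pid {proc.pid}: {shown}")

pynvml.nvmlShutdown()
```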


Texture Resolution: This one is pretty simple. It controls texture detail on various elements in the scene. This setting gobbles up your GPU’s dedicated VRAM. For Control, a 4GB card is sufficient for ultra at 1080p. For 1440p and 2160p, 6GB cards should do, although I suggest going with an 8GB one for the latter, especially if you are not using DLSS (which is highly recommended with ray-tracing).

Shadow Resolution: This affects the level of detail of shadows in the scene. Like texture resolution, it affects VRAM usage. I suggest sticking to ultra unless you have less than 4GB/6GB of video memory and are playing at 1080p/4K.
https://www.techquila.co.in/control-pc-performance-analysis-nvidia-rtx-on/
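Some rough arithmetic of my own (not from the article) on why the texture slider in particular gobbles VRAM, assuming uncompressed RGBA8 at 4 bytes per texel, BC7 at 1 byte per texel, and a full mip chain adding roughly a third on top:

```python
# Rough texture-memory arithmetic: why the texture slider gobbles VRAM.
def texture_mib(width, height, bytes_per_texel, mips=True):
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if mips else 1) / 2**20

# One 4096x4096 texture, uncompressed RGBA8 (4 B/texel) vs. BC7 (1 B/texel):
print(f"4K RGBA8 + mips: {texture_mib(4096, 4096, 4):.0f} MiB")  # ~85 MiB
print(f"4K BC7   + mips: {texture_mib(4096, 4096, 1):.0f} MiB")  # ~21 MiB

# Halving the resolution quarters the cost, which is why one notch on the
# texture-quality slider moves VRAM usage far more than most other settings.
print(f"2K BC7   + mips: {texture_mib(2048, 2048, 1):.0f} MiB")  # ~5 MiB
```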

https://www.kitguru.net/components/...s/nvidia-rtx-3080-founders-edition-review/24/

According to Nvidia, it’s a balancing act between cost and performance. Justin Walker, product manager for desktop GeForce GPUs, said the following: ‘We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples – if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.’

https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/
Some people in this thread want the VRAM used by the game only, not the allocated VRAM.
So...
Here's how to change Afterburner to show "GPU Dedicated Memory Usage \ Process":
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-46301972

Warzone:
1440p: 6.3GB
2160p: 5.9GB???

AC Odyssey:
1440p - 4.9GB
2160p - 5.7GB

Control:
DLSS (1440p with an internal render of 960p) - 5GB
1440p - 5.8GB
2160p - 6.6GB

Persona 4 Golden (Just to see how accurate this is)
1440p - 450MB

All tested with max settings.

I guess that's good to know? I wonder how much next-gen games will use, then.
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-46310192

At 2160p:
Detroit Become Human - 4.7GB
The Witcher 3 + HD Reworked Mod - 5.5GB
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-48224942


Control Ultimate Edition
Max Settings, Quality DLSS 3440x1440
4164MB VRAM
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-49113427

Age of Empires III Definitive Edition
Max settings. The first two are 3440x1440; the third is 6880x2880.

You can see how misleading VRAM stats can be: I have a total system allocation of 7811MB in the 2nd picture, yet only 1680MB of it is from AoE 3. When I took this screenshot, I had a massive number of Chrome tabs open and had not restarted my computer in quite some time. If someone did not use the per-process number, they would mislead people about how much VRAM AoE 3 was using.

This is also a good example of the effect resolution has. VRAM usage has almost doubled between screens 2 and 3, when using 4x the pixels.
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-49113670
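A toy model of my own (made-up round numbers, not measurements) of why 4x the pixels gave well under 4x the VRAM in those screenshots: render targets scale with pixel count, while textures and geometry do not.

```python
# Toy model: why 4x the pixels gave well under 4x the VRAM above.
# All numbers are made-up round figures for illustration only.
def vram_mib(width, height, fixed_mib=900, bytes_per_pixel=64):
    # fixed_mib: textures, meshes, etc. -- independent of resolution.
    # bytes_per_pixel: all resolution-dependent buffers summed
    # (G-buffer, depth, HDR target, post-processing chain, ...).
    return fixed_mib + width * height * bytes_per_pixel / 2**20

low = vram_mib(3440, 1440)
high = vram_mib(6880, 2880)
print(f"{low:.0f} MiB -> {high:.0f} MiB ({high / low:.2f}x for 4x the pixels)")
# 1202 MiB -> 2109 MiB (1.75x for 4x the pixels)
```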

Death Stranding
Max Settings, 3440x1440 DLSS Quality.

I've included the 1st screenshot to show you why you should not trust the in-game bars. It says my settings will use 4.1GB; RTSS reports 6684MB of system allocation; but in truth only 1594MB is being used by Death Stranding.

The 2nd screenshot looks weird because I was playing in HDR. You can see in-game that we are only using 3764MB of VRAM, despite HDR and max settings. DLSS reduces VRAM usage, much as DSR increases it. And yet all previous VRAM measurements would have reported 8665MB... is anyone seeing the pattern here?
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-49113769

The Witcher 3 with the HD Reworked texture mod and the Tweaks mod, HairWorks off, DSR'd to 8K (8192x4320) as I'm too lazy to change the EDID stuff. Using a 3080.

VRAM usage: 7579MB; VRAM allocated: 9923MB.
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-49115275


Ghostrunner 3440x1440, RTX On, Max Settings

DLSS Quality, Performance, and Off, respectively.

Please take notice of how VRAM scales with DLSS.

4.8GB - Performance
5.3GB - Quality
6.2GB - Native
https://www.resetera.com/threads/ms...display-per-process-vram.291986/post-49624399
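The scaling makes sense once you look at the internal render resolution: DLSS 2.x renders at roughly 2/3 of the output resolution per axis in Quality mode and 1/2 in Performance mode (public Nvidia figures), so every resolution-dependent buffer shrinks with it. A quick sketch of mine:

```python
# Why DLSS lowers VRAM: the game renders internally at a reduced resolution,
# so every resolution-dependent buffer (G-buffer, depth, post chain) shrinks.
# Per-axis scale factors are the standard public DLSS 2.x ones.
scales = {"Native": 1.0, "Quality": 2 / 3, "Performance": 0.5}

out_w, out_h = 3440, 1440  # Ghostrunner output resolution from the post above
for mode, s in scales.items():
    w, h = int(out_w * s), int(out_h * s)
    frac = (w * h) / (out_w * out_h)
    print(f"{mode:11s} {w}x{h} -> {frac:.0%} of native pixels")

# Native      3440x1440 -> 100% of native pixels
# Quality     2293x960 -> 44% of native pixels
# Performance 1720x720 -> 25% of native pixels
#
# The measured VRAM (6.2 -> 5.3 -> 4.8GB) falls far less steeply than the
# pixel count because textures and geometry don't scale with render resolution.
```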

Have fun with this rabbit hole.
 
Nobody can tell you for sure. As has been stated numerous times in this thread, there are so many variables in play that determine VRAM usage from one frame to the next, let alone for an entire process, that predictions for future usage can't really be based on more than trend data. This is where future-proofing comes in, arguably, as a valid strategy: VRAM usage is probably not going to go down, so you're doing nothing bad by buying a card that has as much as you can afford. There's literally no downside to it, aside from the obvious extra cost involved.

Think of it this way: in 3 years' time, will you regret buying a 16GB card? Almost certainly not. Will you regret buying a 10GB card? Possibly. I doubt anyone who buys every single generation of card actually cares, as they simply upgrade out of the problem, but for the majority of gamers who keep their hardware for multiple generations, it's an important consideration.
Fair points I suppose, but solely for DLSS I really want to go Nvidia... I'm hoping that at 1440p, 10GB won't cause me a lot of issues except in a select few games...
 
I find this topic confusing. The 3080, with its 10GB of GDDR6X, has a memory bandwidth of 760GB/s. The 6900 XT, from what specs I can find, has 16GB of GDDR6 at 512GB/s. Are people saying that lower bandwidth is more future-proof because it comes with more gigabytes?
That's why the Infinity Cache is there.
 
Fair points I suppose, but solely for DLSS I really want to go Nvidia... I'm hoping that at 1440p, 10GB won't cause me a lot of issues except in a select few games...
Thing is, DLSS is itself only in a select few games. Nvidia really need to improve that. I am lucky that it is in all three of my most anticipated games, but I know it won't be in many. For example, Resident Evil 8 likely won't have it.
 
I find this topic confusing. The 3080, with its 10GB of GDDR6X, has a memory bandwidth of 760GB/s. The 6900 XT, from what specs I can find, has 16GB of GDDR6 at 512GB/s. Are people saying that lower bandwidth is more future-proof because it comes with more gigabytes?
The 6900 XT doesn't need the same amount of bandwidth as the 3080 because of the Infinity Cache.
 
A 20GB 3080 Ti is pretty much confirmed at this point! There was a tweet linked on Reddit that has since been removed. Hopefully it'll be on TSMC 7nm.

It's going to be fun quoting a few select people here who were adamant that the 10GB 3080 would be enough for 4K for years to come, and who claimed Nvidia wouldn't release a 20GB 3080/3080 Ti variant... Wake up, sheep!
 