10GB vram enough for the 3080? Discuss..

Not even Port Royal. The leaked benchmarks showed Port Royal. The 3080 is fine at 4k. DLSS has that covered.

VRAM Usage
You can see in my testing that having DLSS 2.0 enabled cuts down the amount of VRAM used at 8K by a big chunk. Looking at the GeForce RTX 2080 Ti, it uses over 10GB of VRAM at 8K without DLSS 2.0 enabled, but under 8GB of VRAM when DLSS 2.0 is set to the Quality preset. https://www.tweaktown.com/articles/...hmarked-at-8k-dlss-gpu-cheat-codes/index.html

VRAM Usage
Something that has come up with other games tested in 8K is VRAM usage. For some, astronomical numbers that consume the 24 GB memory of a TITAN RTX are not unheard of. Once again, DLSS 2.0 triumphs on the RTX 2080 Ti. Using it in quality mode lowered VRAM consumption by nearly 2 GB. After enabling it, usage went from 10.3 GB down to 8.4 GB, well within the limits of the card’s 11 GB VRAM. By not hammering the card’s VRAM limits, DLSS 2.0 has allowed it to perform to its fullest potential. In the end, Mr. Garreffa has likened DLSS 2.0 to GPU cheat codes, and we can now easily see why. https://www.thefpsreview.com/2020/0...8k-with-dlss-2-0-using-a-geforce-rtx-2080-ti/
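
As a rough back-of-the-envelope illustration of why rendering at DLSS Quality's internal resolution frees up so much memory (my own numbers, not from either article: the 4 bytes per pixel and the buffer count are assumptions, only the ~2/3-per-axis Quality scale factor is a known DLSS property):

```python
# Rough estimate of per-render-target memory at native 8K vs the internal
# resolution DLSS "Quality" renders at (~2/3 of each axis). Illustrative only.

def buffer_mib(width, height, bytes_per_pixel=4):
    """Size of one uncompressed RGBA8 render target in MiB (assumed format)."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native_8k = (7680, 4320)
dlss_quality_8k = (5120, 2880)            # ~66.7% of each axis

native_mib = buffer_mib(*native_8k)       # ~127 MiB per target
dlss_mib = buffer_mib(*dlss_quality_8k)   # ~56 MiB per target

# A deferred renderer keeps many full-resolution buffers alive (G-buffer
# layers, depth, HDR colour, post-processing chains). 25 is an assumption.
buffers = 25
saving_gib = buffers * (native_mib - dlss_mib) / 1024
print(f"per-buffer: {native_mib:.0f} MiB vs {dlss_mib:.0f} MiB")
print(f"~{saving_gib:.1f} GiB saved across {buffers} assumed buffers")
```

That lands in the same ballpark as the roughly 2 GB saving measured above; textures and geometry still take whatever they take, so the total doesn't shrink by the full output-to-internal ratio.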

The issue with 8K is that games like Control are reported to use 20GB of VRAM. This means that the 6900 XT with 16GB of VRAM may have issues keeping up. https://www.eurogamer.net/articles/digitalfoundry-2020-dlss-ultra-performance-analysis



More Death Stranding

Graphics memory (VRAM) usage

How much graphics memory does the game utilize versus your monitor resolution with different graphics cards and respective VRAM sizes? Well, let's have a look at the chart below covering the three main tested resolutions. The listed MBs used in the chart are the measured utilized graphics memory during our testing. Keep in mind, these are never absolute values. Graphics memory usage can fluctuate per game scene and activity in games. This game will consume graphics memory once you start to move around in-game; memory utilization is dynamic and can change at any time. Often, the denser and more complex a scene is (entering a scene with lots of buildings or vegetation, for example), the higher the utilization. With our close-to-max "High" quality settings, this game tries to stay within a 5~6 GB threshold. We noticed that 4GB cards can have difficulties running the game at our settings, especially the Radeons.

[Chart: measured VRAM usage per graphics card at the three tested resolutions]


In Control, if you turn on everything at 8K, you get 19GB of VRAM usage.



[Image: Control tested at 8K, NVIDIA TITAN RTX vs AMD Radeon VII showdown]

https://www.tweaktown.com/articles/9131/control-tested-8k-nvidia-titan-rtx-uses-18gb-vram/index.html

List of games at different resolutions, including 4K. None of them goes near 10GB.
https://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html

We do not know how things will change with the next console generation. That's last gen. We are literally on the cusp of a new era for console gaming; the new consoles have double the memory compared to the last generation.
 
Well, I'm reading reports from 3080 owners playing Watch Dogs Legion at 4K, which came bundled with their graphics card. The game hits the 10GB limit and starts stuttering, and Legion is still a last-gen game :eek:
 
Let's say that's true for one minute. 10GB of GDDR6X is the equivalent of having 13GB of GDDR6 in terms of bandwidth.
That's not how it works.
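
For what it's worth, that 13GB figure presumably comes from scaling capacity by per-pin data rate (the 3080's GDDR6X runs at 19 Gbps versus the 14 Gbps GDDR6 on most current cards), which tells you about bandwidth, not about how many bytes fit. A quick sketch of the two separate quantities (the data rates and bus width are published specs; the rest is just arithmetic):

```python
# RTX 3080: 19 Gbps GDDR6X on a 320-bit bus. Typical GDDR6 today: 14 Gbps.
gddr6x_rate = 19          # Gbps per pin
gddr6_rate = 14           # Gbps per pin
bus_width = 320           # bits

bandwidth = gddr6x_rate * bus_width / 8               # 760 GB/s on the 3080
rate_scaled_capacity = 10 * gddr6x_rate / gddr6_rate  # ~13.6, likely where "13GB" comes from

print(f"3080 bandwidth: {bandwidth:.0f} GB/s")
print(f"10GB scaled by data rate: {rate_scaled_capacity:.1f}")
# But capacity is capacity: once a game needs more than 10GB resident,
# faster memory doesn't create room for the extra data.
```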

The issue is some games have that little bar at the top that lets you set IQ settings which they calculate won't exceed your VRAM limitations.
Those are useless most of the time and not accurate in my experience. Ever seen the one in Resident Evil games? I remember it telling me my 12GB was not enough on the Titan XP when it never used more than 8GB. And even then I do not know how much of that was cache and what was actually needed before the FPS tanks.
 
Well, I'm reading reports from 3080 owners playing Watch Dogs Legion at 4K, which came bundled with their graphics card. The game hits the 10GB limit and starts stuttering, and Legion is still a last-gen game :eek:

Typical Ubisoft game, but people will want to jump on "10GB is not enough". There's no game-ready driver yet, and a big patch is incoming tomorrow to fix the issues.


There is supposed to be a performance patch coming to PC on 30/10/2020, so a new Nvidia driver will most likely come late today or tomorrow.
Watch Dogs Legion October 30th Update Release Notes
  • Made several improvements to PC performance
  • Optimized performance to improve framerates for RTX GPUs
  • Fixed an issue that caused framerate drops to occur when characters walk through checkpoints
  • Improved the framerate performance when driving
  • Fixed a freeze when switching input controllers on PC
  • Fixed an issue that could cause the game to crash when launched on a console set to TURKISH, INDONESIAN, GREEK, ROMANIAN, HUNGARIAN, VIETNAMESE or THAI languages
 
That's not how it works.


Those are useless most of the time and not accurate in my experience. Ever seen the one in Resident Evil games? I remember it telling me my 12GB was not enough on the Titan XP when it never used more than 8GB. And even then I do not know how much of that was cache and what was actually needed before the FPS tanks.

It's an issue when the game stops you adjusting the settings based on that little VRAM bar.

Even if it makes no difference, if the game stops you setting higher settings (as some games do) that could pose an issue.
 
It's not just brand loyalty, I have a G-Sync enabled monitor and a compatible TV, so what choice do I have?

It's not as cut and dried as you think; there are 'matured' feature sets that Nvidia offer on their GPUs that also come into play. DLSS is an important one for me, as I want to hit 4K with more performance than native.
What choice? Bet you could sell it and get an AMD FreeSync monitor for cheaper.

Still, this is the Apple trap all over again: people pay more to be locked in and then get the angsty feels about trying other stuff when they could get just as good for cheaper. DLSS is a benefit, but it's not in all games anyway, and the AMD cards run 4K very well; combined with FreeSync, it'd be a bit weird to suggest you're going to have such a bad experience that there's no choice. I'm not going to tell anyone what to do, but people should be more open-minded, not get themselves into these traps, and consider AMD, as we need good competition in the GPU sector. The herd types need to get moving, or regardless of the quality of the competition we'll continue to see Nvidia holding back on VRAM, high prices and messy power draw. That's partly AMD's fault too, as there wasn't perfect competition before, but when they do good it's worth rewarding them. Personally speaking, I'd be much happier knowing I have more VRAM just in case it's needed than worrying it could be a bottleneck in no time. Glad I got the LG CX so I could keep my options open instead of being tied in for the sake of it.
 
It's an issue when the game stops you adjusting the settings based on that little VRAM bar.

Even if it makes no difference, if the game stops you setting higher settings (as some games do) that could pose an issue.
I doubt this will be much of an issue, to be honest. There are also ways around it if it happens.
 
Well, I'm reading reports from 3080 owners playing Watch Dogs Legion at 4K, which came bundled with their graphics card. The game hits the 10GB limit and starts stuttering, and Legion is still a last-gen game :eek:
Well that's a load of balls. I'm running it at 4K with Balanced DLSS and Ultra RT, and it's not even close to saturating the 10GB; it comes in at around 8GB. Nice try, but then again, an agenda is always an agenda.
 
What choice? Bet you could sell it and get an AMD FreeSync monitor for cheaper.

Still, this is the Apple trap all over again: people pay more to be locked in and then get the angsty feels about trying other stuff when they could get just as good for cheaper. DLSS is a benefit, but it's not in all games anyway, and the AMD cards run 4K very well; combined with FreeSync, it'd be a bit weird to suggest you're going to have such a bad experience that there's no choice. I'm not going to tell anyone what to do, but people should be more open-minded, not get themselves into these traps, and consider AMD, as we need good competition in the GPU sector. The herd types need to get moving, or regardless of the quality of the competition we'll continue to see Nvidia holding back on VRAM, high prices and messy power draw. That's partly AMD's fault too, as there wasn't perfect competition before, but when they do good it's worth rewarding them. Personally speaking, I'd be much happier knowing I have more VRAM just in case it's needed than worrying it could be a bottleneck in no time. Glad I got the LG CX so I could keep my options open instead of being tied in for the sake of it.

Sigh. OK, believe it or not, there are actually people out there who prefer Apple phones; they actually like the ecosystem, the OS, etc. They are not all trapped little lambs in a pen. Is that not even a possibility in your mind? I like G-Sync; right now it is the best option for synced frames. AMD were way behind on this, and even now it's still not up to scratch. I bought into G-Sync because the competition at the time didn't offer anything better; now that they have caught up, people cry "it's a trap". If AMD were first to market with these features, the shoe would be on the other foot.

You enjoy your extra unused RAM, mate.
 
Not even Port Royal. The leaked benchmarks showed Port Royal. The 3080 is fine at 4k. DLSS has that covered.

VRAM Usage
You can see in my testing that having DLSS 2.0 enabled cuts down the amount of VRAM used at 8K by a big chunk. Looking at the GeForce RTX 2080 Ti, it uses over 10GB of VRAM at 8K without DLSS 2.0 enabled, but under 8GB of VRAM when DLSS 2.0 is set to the Quality preset. https://www.tweaktown.com/articles/...hmarked-at-8k-dlss-gpu-cheat-codes/index.html

Unfortunately they do not list their methodology for measuring VRAM, so we don't know if they're measuring simply what is allocated or rather what is in use. The screenshots in the original article show MSI's OSD for memory, and if they've not renamed the OSD labels then it's just measuring whatever memory is allocated.

Again, the ability to measure VRAM actually in use is extremely new. It's not in the full release of Afterburner yet; it's only in a specific build of the beta that was released recently. So it's unlikely at this stage that any reviewer or hardware site testing VRAM usage is telling you what is actually needed to run the game, rather than what is allocated.

I posed this question on their Q&A form, so we'll see if we get a response.
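
As an illustration of the allocated-versus-in-use distinction (a minimal sketch, not how TweakTown or Afterburner measure it): NVIDIA's NVML exposes a device-wide "used" counter, which covers everything every process has allocated, plus per-process allocation figures; neither tells you how much of that memory a game actually touches each frame.

```python
# Minimal sketch using NVML via the pynvml bindings (pip install nvidia-ml-py).
# Both counters report memory that has been *allocated*, not the working set
# a game genuinely needs before performance falls off a cliff.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)

mem = nvmlDeviceGetMemoryInfo(handle)
print(f"device-wide used: {mem.used / 1024**2:.0f} MiB of {mem.total / 1024**2:.0f} MiB")

# Per-process allocation, closer in spirit to the new per-process Afterburner counter:
for p in nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_mib = p.usedGpuMemory / 1024**2 if p.usedGpuMemory else 0
    print(f"pid {p.pid}: {used_mib:.0f} MiB allocated")

nvmlShutdown()
```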
 
Not even Port Royal. The leaked benchmarks showed Port Royal. The 3080 is fine at 4k. DLSS has that covered.

VRAM Usage
You can see in my testing that having DLSS 2.0 enabled cuts down the amount of VRAM used at 8K by a big chunk. Looking at the GeForce RTX 2080 Ti, it uses over 10GB of VRAM at 8K without DLSS 2.0 enabled, but under 8GB of VRAM when DLSS 2.0 is set to the Quality preset. https://www.tweaktown.com/articles/...hmarked-at-8k-dlss-gpu-cheat-codes/index.html

VRAM Usage
Something that has come up with other games tested in 8K is VRAM usage. For some, astronomical numbers that consume the 24 GB memory of a TITAN RTX are not unheard of. Once again, DLSS 2.0 triumphs on the RTX 2080 Ti. Using it in quality mode lowered VRAM consumption by nearly 2 GB. After enabling it, usage went from 10.3 GB down to 8.4 GB, well within the limits of the card’s 11 GB VRAM. By not hammering the card’s VRAM limits, DLSS 2.0 has allowed it to perform to its fullest potential. In the end, Mr. Garreffa has likened DLSS 2.0 to GPU cheat codes, and we can now easily see why. https://www.thefpsreview.com/2020/0...8k-with-dlss-2-0-using-a-geforce-rtx-2080-ti/

The issue with 8K is that games like Control are reported to use 20GB of VRAM. This means that the 6900 XT with 16GB of VRAM may have issues keeping up. https://www.eurogamer.net/articles/digitalfoundry-2020-dlss-ultra-performance-analysis



More Death Stranding

Graphics memory (VRAM) usage

How much graphics memory does the game utilize versus your monitor resolution with different graphics cards and respective VRAM sizes? Well, let's have a look at the chart below covering the three main tested resolutions. The listed MBs used in the chart are the measured utilized graphics memory during our testing. Keep in mind, these are never absolute values. Graphics memory usage can fluctuate per game scene and activity in games. This game will consume graphics memory once you start to move around in-game; memory utilization is dynamic and can change at any time. Often, the denser and more complex a scene is (entering a scene with lots of buildings or vegetation, for example), the higher the utilization. With our close-to-max "High" quality settings, this game tries to stay within a 5~6 GB threshold. We noticed that 4GB cards can have difficulties running the game at our settings, especially the Radeons.

[Chart: measured VRAM usage per graphics card at the three tested resolutions]


In Control, if you turn on everything at 8K, you get 19GB of VRAM usage.



[Image: Control tested at 8K, NVIDIA TITAN RTX vs AMD Radeon VII showdown]

https://www.tweaktown.com/articles/9131/control-tested-8k-nvidia-titan-rtx-uses-18gb-vram/index.html

List of games at different resolutions, including 4K. None of them goes near 10GB.
https://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html
Thanks for the post, you just reassured me that the 3080 10GB is going to be enough for 1440p for the foreseeable 3-4 years at high/ultra, which is all I'm expecting.

I mean, it's still 2GB more than the 2080; it's not like they didn't increase the VRAM. If they release a 3080 Ti for $799 before March 2021 I'll buy that, otherwise the 3080 10GB will suffice.
 
Thanks for the post, you just reassured me that the 3080 10GB is going to be enough for 1440p for the foreseeable 3-4 years at high/ultra, which is all I'm expecting.

I mean, it's still 2GB more than the 2080; it's not like they didn't increase the VRAM. If they release a 3080 Ti for $799 before March 2021 I'll buy that, otherwise the 3080 10GB will suffice.
I am starting to think you are opethdisciple's alt account. Going around in circles about the VRAM issue :p

Also, remember, AMD are working on Super Resolution, which will be kind of like DLSS. As I understand it, it will work on a lot more titles than DLSS, but might not be quite as good, at least not initially. This is why it is best for you to wait for AMD's reviews before making a choice. That said, if you are opethdisciple then we all know you will not be going AMD :p:D
 
I am starting to think you are opethdisciple's alt account. Going around in circles about the VRAM issue :p

Also, remember, AMD are working on Super Resolution, which will be kind of like DLSS. As I understand it, it will work on a lot more titles than DLSS, but might not be quite as good, at least not initially. This is why it is best for you to wait for AMD's reviews before making a choice. That said, if you are opethdisciple then we all know you will not be going AMD :p:D
I am not a second account.

I was just having second thoughts on the VRAM...

As far as I know, Super Resolution is already in FidelityFX and it upscales to a higher res than your monitor so it looks better. Like if you have a 1080p monitor it will render at 4K and downscale it to 1080p so stuff looks sharper.

I am going to buy by March 2021, so if AMD has an answer by then I'll buy the 6800 XT; if not, I'm likely going with the 3080.
 
I am not a second account.

I was just having second thoughts on the VRAM...

As far as I know, Super Resolution is already in FidelityFX and it upscales to a higher res than your monitor so it looks better. Like if you have a 1080p monitor it will render at 4K and downscale it to 1080p so stuff looks sharper.

I am going to buy by March 2021, so if AMD has an answer by then I'll buy the 6800 XT; if not, I'm likely going with the 3080.
I was just kidding man :p

You have plenty of time to be well informed by then :D
 
I was just kidding man :p

You have plenty of time to be well informed by then :D

Oh lol

Yeah, I'm hoping that AMD gets something similar to DLSS; I'm in it for the FPS, lol. Even if it looks a bit worse I don't mind.

I acknowledge that in terms of raw performance, power efficiency etc. AMD is leading right now when you use Rage Mode and Smart Access Memory; only the lack of DLSS is holding me back.

Only time will tell what happens :p
 
We do not know how things will change with the next console generation. That's last gen. We are literally on the cusp of a new era for console gaming; the new consoles have double the memory compared to the last generation.

We do know the consoles have a total of 16GB of RAM. The Xbox is split sensibly into 6GB/10GB. It's reasonable to assume 6GB will be used by the OS, housekeeping and the game, while 10GB is used for graphics. So unless they release a 24 or 32GB console, we can assume 10GB is plenty.
 