10GB vram enough for the 3080? Discuss..

Looking at the actual vRAM breakdown in many games today shows that's not true. Many games have fixed-size texture pools that textures swap in and out of, and you can typically measure the size of that pool through engine variables, like with Doom, and see that texture memory is around, say, 2GB whereas the game takes up to 8-10GB of vRAM maxed out with ray tracing. So textures are about 1/4 to 1/5 of the total, and the same is true in many other games. You can often see the memory breakdown in the graphics settings and see this really isn't true.

This is a holdover from the past, when games couldn't stream textures and so vRAM limited the assets in the world. Once texture streaming became popular, vRAM stopped growing with texture size and started growing with other effects that need buffers in memory and also load on the GPU. The idea that you need more than 10GB to avoid blurry textures is kind of crazy; there's no evidence for that at all. High-res textures are nice, but people can't tell the difference between UHD textures and something lower quality on a surface that isn't right in front of their face, so most of the time the high-res textures are just flushed out of the memory pool, lower-quality variants are used, and the engine streams those textures in and out of memory without a problem.
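A minimal sketch of that idea, purely for illustration (not any particular engine's code): a pool with a fixed budget in MB, where making a texture resident evicts the least-recently-used entries until the new one fits, so resident texture memory never exceeds the budget no matter how many textures the game ships with.

// Toy fixed-budget texture pool with LRU eviction -- illustration only, not from any shipping engine.
#include <cstddef>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>

class TexturePool {
public:
    explicit TexturePool(std::size_t budgetMB) : budget_(budgetMB * 1024 * 1024) {}

    // Make a texture resident; evict least-recently-used textures until it fits.
    void request(const std::string& name, std::size_t bytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {                  // already resident: just mark as recently used
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        while (used_ + bytes > budget_ && !lru_.empty()) {
            used_ -= lru_.back().bytes;            // evict the least-recently-used entry
            index_.erase(lru_.back().name);
            lru_.pop_back();
        }
        lru_.push_front({name, bytes});
        index_[name] = lru_.begin();
        used_ += bytes;
    }

    std::size_t usedMB() const { return used_ / (1024 * 1024); }

private:
    struct Entry { std::string name; std::size_t bytes; };
    std::size_t budget_;
    std::size_t used_ = 0;
    std::list<Entry> lru_;                         // front = most recently used
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};

int main() {
    TexturePool pool(2048);                        // e.g. a 2GB texture pool
    for (int i = 0; i < 1000; ++i)                 // request far more than fits
        pool.request("tex_" + std::to_string(i), 16u * 1024 * 1024);
    std::cout << "resident: " << pool.usedMB() << " MB\n";  // stays within the 2048 MB budget
}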

How are you able to look at the memory allocation map for Doom and many of today's games?
Do you have the source code, or are you one of the original programmers?
The source code for these games is proprietary; they won't let you run it through a debugger or memory profiler.
Do you have some sort of hacking tool that can profile other people's game programs?
 
After playing only on a 4k monitor, 1080p looks blurry to me!
I have been saying that for around 7 years now. But people kept saying they could not see the difference. Lol. There is a massive difference. If one cannot see it they either play very far away from their monitor or they need to go to Specsavers.
 
Well my eyes are getting worse so I pre-ordered the LG 32GP850-B, a 32" 1440p panel. I think my 3080 will drive that longer than I'll keep the card. I'm still not convinced that 4k is worth it as even the 3090 struggles with RT at that resolution.
 
Well my eyes are getting worse so I pre-ordered the LG 32GP850-B, a 32" 1440p panel. I think my 3080 will drive that longer than I'll keep the card. I'm still not convinced that 4k is worth it as even the 3090 struggles with RT at that resolution.
Each to their own. We all have preferences. 1440p could be considered the sweet spot still. But as long as I am getting 40-60fps (depending on the game) then I would rather take the image quality. 120fps is nice and all, but not at the sacrifice of superior image quality for me.
 
What I mean is, if you're buying a next-gen card, get as much VRAM as possible so it lasts a few years; future games are only going to use more 4k and 2k textures.
Example: Watch Dogs Legion uses about 9GB maxed out at 4k, and games *WILL* use more texture and geometry data going forward.
NVIDIA/AMD are not putting 12/16GB on there for no reason; software houses *WILL* use it, so plan properly for the next few years. 10GB is acceptable but better to get more if you're willing to pay the price of a 3080.
I'm talking about the next few years of game releases.
Texture streaming is slow and can cause "popping" artifacts, and going over the PCIe bus to system memory is slow; if it worked so well they wouldn't put large amounts of VRAM on newer GPUs (rough transfer numbers below).
After playing only on a 4k monitor, 1080p looks blurry to me! :)
When you render a 2k or 4k texture map on a 1080p screen a *LOT* of the original pixels are lost, so the detail is gone!
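(Rough numbers, with the assumptions spelled out: a 4096x4096 BC7 texture is 8 bits per texel, so about 16 MiB for the top mip and roughly 21 MiB with a full mip chain. At a practical ~12 GB/s over PCIe 3.0 x16 that's roughly 21 / 12000 seconds, i.e. just under 2 ms per texture against a 16.7 ms frame budget at 60fps, which is why engines stream individual mip levels in the background rather than pulling whole textures on demand mid-frame.)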


4k8k, is that you?
 
Each to their own. We all have preferences. 1440p could be considered the sweet spot still. But as long as I am getting 40-60fps (depending on the game) then I would rather take the image quality. 120fps is nice and all, but not at the sacrifice of superior image quality for me.

Agree.

When I had my GTX 1080 I was getting around 40-45 fps at 1440p in AC Odyssey, but with G-Sync it was still a perfectly acceptable playing experience, and that was at max IQ.
 
10GB is acceptable but better to get more if you're willing to pay the price of a 3080.
I'm talking about the next few years of game releases.
If you are persistent you can get a 3080 FE for £650 - I'm not sure there is anything comparable in that price range tbh. And as for the next few years, well, stuff gets more demanding anyway, and maybe I will have to turn down settings for VRAM reasons on only a couple of games? Games are already demanding for my 3080 at 4K!

I'll be looking at new cards when they arrive for their relative GPU grunt and not the VRAM :)
 
Next gen hasn't really arrived anyway.

Microsoft kind of led us to believe that DirectStorage was going to come in early 2021 (that's what they said in 2020).

DirectStorage will be a thing with Windows 11, which again sounds like it won't get mass adoption until 2022 (when they enable Win 10 to Win 11 upgrades), by which point we will be ever closer to the next generation of GPUs anyway.
 
DirectStorage will be in Windows 11 at launch; it's driven by something called the I/O Ring API. But it won't do anything until a developer programs their game to call the API for disk reads.
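For a rough idea of what "calling the API" means in practice, here's a heavily simplified sketch based on the public DirectStorage SDK (dstorage.h). Error handling, the D3D12 device/resource setup and the wait on the fence are all left out, and the file name and sizes are made up for illustration.

// Minimal DirectStorage sketch: queue one read from disk straight into a GPU buffer.
// Illustration only -- a real title needs error checks, a created D3D12 device,
// a destination ID3D12Resource and a fence wait before using the data.
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void QueueAssetLoad(ID3D12Device* device, ID3D12Resource* destBuffer,
                    ID3D12Fence* fence, UINT64 fenceValue)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/level01.bin", IID_PPV_ARGS(&file));   // hypothetical asset file

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const UINT32 sizeBytes = 16 * 1024 * 1024;                       // made-up payload size

    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = sizeBytes;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = sizeBytes;
    request.UncompressedSize            = sizeBytes;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue);   // fence signals when the data has landed in VRAM
    queue->Submit();
}

Either way the point stands: it's an explicit per-title opt-in, so Windows shipping the feature doesn't change how already-released games load anything.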
 
Each to their own. We all have preferences. 1440p could be considered the sweet spot still. But as long as I am getting 40-60fps (depending on the game) then I would rather take the image quality. 120fps is nice and all, but not at the sacrifice of superior image quality for me.

Don't get me wrong. I'm not chasing crazy FPS. It's just at 1440p everything runs much cooler and quieter. There is no need to overclock. I enjoyed CP2077 maxed out at 1440p at 45 to a capped 60 FPS using this ancient 3770K. The other problem is that at 27" 4k the desktop appears just too small for me to read.
 
Don't get me wrong. I'm not chasing crazy FPS. It's just at 1440p everything runs much cooler and quieter. There is no need to overclock. I enjoyed CP2077 maxed out at 1440p at 45 to a capped 60 FPS using this ancient 3770K. The other problem is at 4k the desktop is just too small to read.

4k TVs look much worse below their native 4k.
 
How are you able to look at the memory allocation map for Doom and many of today's games?
Do you have the source code, or are you one of the original programmers?
The source code for these games is proprietary; they won't let you run it through a debugger or memory profiler.
Do you have some sort of hacking tool that can profile other people's game programs?


On a 3090, 9-10GB is allocated and 8-9GB is dedicated. The specs stated 11GB, but you can reduce the texture pool. An RTX 3080 should be on the edge at 4k.
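On the allocated vs dedicated point, monitoring tools are largely surfacing numbers the OS already tracks. Here's a minimal sketch of how any app can query its own local (on-card) video memory budget and current usage through the standard DXGI call IDXGIAdapter3::QueryVideoMemoryInfo; nothing in it is game-specific, and the error handling is reduced to early returns.

// Print the local (dedicated) video memory budget and current usage for the default adapter.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget: how much this process may commit before the OS starts demoting it to system memory.
    // CurrentUsage: how much it has actually committed right now.
    std::printf("budget: %llu MB, current usage: %llu MB\n",
                info.Budget / (1024ull * 1024), info.CurrentUsage / (1024ull * 1024));
    return 0;
}

"Used" vRAM counters in overlays are generally reporting commitment like CurrentUsage above, not the subset of memory a frame actually touches, which is why a near-full pool doesn't necessarily mean anything is about to stutter.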

 
How are you able to look at the memory allocation map for Doom and many of today's games?

Quite simply because there's an engine variable that controls the size of the texture memory pool, and it's set via the graphics settings menu when you pick the texture quality. You can access this variable through the game's console, and if you type "is_poolsize" and hit tab it auto-completes to the value it's currently set to, so you can map each menu setting to a specific size in MB.

But as I said, many other games will literally tell you in the graphics menu how much vRAM each specific setting is using; the last Resident Evil game did this, CoD Cold War did it, Doom Eternal does it, etc.

The dumb thing is that we went over this on page 7 of (the 8GB thread), looking at memory usage of Doom Eternal in depth. I posted loads of real-world results based on testing back then, when people were claiming Doom Eternal was super hard on vRAM, and all those claims turned out to be false. So 200+ pages later we're just going over the same nonsense again.

*edit*

I talked about Doom in both the 8GB and 10GB threads; it was page 7 of the 8GB thread, but I'm not sure where in the 10GB one. Either way, the claims about vRAM usage are mostly bogus.
 
No issues at all here with a 3080, everything maxed in Doom Eternal, even with DLSS off; it runs incredibly well (@3440x1440 and 4k). The game uses all 10GB of vRAM according to MSI AB, but there are no drops/slowdowns/stutters.


On the topic of 4k, I've only really started to see the benefits of 4k over the last 2 years or so, especially in games like RDR 2 and Days Gone.
 