10GB VRAM enough for the 3080? Discuss...

No issues at all here with a 3080: everything maxed in Doom Eternal, even with DLSS off, runs incredibly well (@3440x1440 and 4K). The game uses all 10GB of VRAM according to MSI Afterburner, but there are no drops/slowdowns/stutters.

On the topic of "4K", it's only over the last 2 years or so that I'm really starting to see the benefits of 4K, especially in games like RDR 2 and Days Gone.

What kind of frame rates are you getting with DLSS both on and off?
 
I haven't tested Doom Eternal since RT was added. There are videos showing it with 12GB of usage, but not using a tool that measures memory actually in use, only what has been allocated. From my own testing, the delta between allocated and used for Doom Eternal was very big. There were a bunch of claims that it used more than 10GB even before RT was added, and proper measurement confirmed that's not true. So my guess, without going back to test, would be <10GB of real usage still.

Secondly, when you max out the settings for Doom Eternal it raises the texture pool size from something fairly sensible like 2GB on High to 4.5GB on Ultra Nightmare. Countless people looking for optimised settings in Doom have done visual comparisons, myself included, and they show zero benefit from using Ultra Nightmare over High. There's no evidence at all that the additional memory is being used for anything; it just gets allocated as a static fixed block, and what happens inside that pool is, to my knowledge, invisible to the end user. Other than to say that High is not impeding the game's use of textures, since anything above it shows no benefit. So even if Ultra Nightmare settings with RT exceeded 10GB, which I doubt, it certainly isn't impacting visual quality to drop the texture pool size to High, which will reduce the memory footprint by a whopping 2.5GB.
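
For anyone unclear on the allocated-vs-used distinction being made here, a toy Python sketch of the idea (this is not id Tech's real allocator, and the texture names and sizes are made up): the pool reserves its full budget up front, which is what allocation-based overlays report, while the bytes actually occupied by resident textures can be far smaller.

```python
# Toy illustration of "allocated" vs. "actually in use" for a fixed texture pool.
# NOT id Tech's real allocator, just a model of the distinction.

GiB = 1024 ** 3

class TexturePool:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes   # reserved up front: what allocation-based overlays report
        self.resident = {}           # texture name -> bytes actually occupied inside the pool

    @property
    def used(self):
        return sum(self.resident.values())

    def load(self, name, size_bytes):
        if self.used + size_bytes > self.budget:
            raise MemoryError(f"pool budget exceeded while loading {name}")
        self.resident[name] = size_bytes

# "Ultra Nightmare"-style pool: 4.5 GB reserved regardless of what's inside it.
pool = TexturePool(budget_bytes=int(4.5 * GiB))
pool.load("bricks_albedo_2k", 5_592_405)    # ~5.3 MiB (BC7 2048x2048 + mips, hypothetical)
pool.load("demon_albedo_4k", 22_369_621)    # ~21.3 MiB (BC7 4096x4096 + mips, hypothetical)

print(f"allocated (reported by allocation-based tools): {pool.budget / GiB:.2f} GiB")
print(f"actually resident:                              {pool.used / GiB:.3f} GiB")
```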
 
If you are persistent you can get a 3080 FE for £650 - I'm not sure there is anything comparable in that price range, tbh. And as for the next few years, well, stuff gets more demanding anyway, and maybe I'll have to turn down settings for VRAM reasons on only a couple of games? Games are already demanding for my 3080 at 4K!

I'll be looking at new cards when they arrive for their relative GPU grunt and not the VRAM :)

3080 FE is like looking for a live dinosaur!
If you turn down settings you get blurry textures! You want to play on your 3080 with lo-res textures?
Textures are what's mostly in VRAM, so you're swapping the high-res tex pack for lo-res.
 
3080 FE is like looking for a live dinosaur!
If you turn down settings you get blurry textures! You want to play on your 3080 with lo-res textures?
Textures are what's mostly in VRAM, so you're swapping the high-res tex pack for lo-res.

Let me switch tack. Do you have any examples to back up the claim that textures are mostly what's in VRAM?
 
On a 3090, 9-10GB are allocated and 8-9GB are dedicated. The specs stated 11GB, but you can reduce the texture pool. An RTX 3080 should be on the edge at 4K.


That's the point I'm making: 10GB just about gets you there on a 3080. Get more VRAM if you want to use hi-res tex and not have to lower game settings to lo-res tex. If you're fine with lo-res tex in upcoming games, 10GB is okay.
Every programmer I've talked to from different software houses says more large tex will be used going forward to provide more eye candy.
 
That's the point I'm making: 10GB just about gets you there on a 3080. Get more VRAM if you want to use hi-res tex and not have to lower game settings to lo-res tex. If you're fine with lo-res tex in upcoming games, 10GB is okay.
Every programmer I've talked to from different software houses says more large tex will be used going forward to provide more eye candy.

Why would they suddenly increase texture resolution when most cards have 10GB or less and the consoles have ~10GB to allocate to graphics?
 
Let me switch tack. Do you have any examples to back up the claim that textures are mostly what's in VRAM?

I could dig up a memory profiler snapshot, but commercial game code and resource files are under NDA so I'm not allowed to show you that.
Just one 4k texture is 4096x4096 pixels (that's huge!), and with maps such as a normal map plus mipmaps it gets even larger, even after compression!
Geometry (triangle counts) takes up a lot too, but at the moment it's mostly lower poly to save VRAM.
Ask any programmer in the gaming industry and see what answer you get. Even an artist may give you an answer, as they have a texture/poly VRAM budget per level.
Other things like shaders use hardly any VRAM.
Try it yourself: get a game's code/resources from a friend/programmer and run it through a profiler to see what's taking up the most memory.
I'll see if I can get a memory map pic of something not under NDA when I have time.
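
To put rough numbers on "one 4k texture is huge", here's a back-of-the-envelope calculation. The format assumptions (uncompressed RGBA8 at 4 bytes/texel, BC7/BC5 block compression at roughly 1 byte/texel, a full mip chain adding about a third) are mine for illustration and not taken from any particular game:

```python
# Back-of-the-envelope VRAM footprint for a single 4096x4096 material.
# Assumed formats: RGBA8 = 4 bytes/texel, BC7/BC5 block compression ~= 1 byte/texel.

MiB = 1024 ** 2
texels = 4096 * 4096                    # 16,777,216 texels in the base mip level

def with_mips(base_bytes):
    # A full mip chain adds roughly one third on top of the base level.
    return base_bytes * 4 / 3

albedo_uncompressed = with_mips(texels * 4)   # RGBA8, no compression
albedo_bc7          = with_mips(texels * 1)   # BC7-compressed albedo
normal_bc5          = with_mips(texels * 1)   # BC5-compressed normal map

print(f"albedo, uncompressed RGBA8 + mips: {albedo_uncompressed / MiB:6.1f} MiB")
print(f"albedo, BC7 + mips:                {albedo_bc7 / MiB:6.1f} MiB")
print(f"normal map, BC5 + mips:            {normal_bc5 / MiB:6.1f} MiB")
print(f"albedo + normal, compressed:       {(albedo_bc7 + normal_bc5) / MiB:6.1f} MiB")
```

Under those made-up assumptions, a single 4k albedo+normal pair lands around 43 MiB compressed, or about 85 MiB for the uncompressed albedo alone.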
 
Why would they suddenly increase texture resolution when most cards have 10GB or less and the consoles have ~10GB to allocate to graphics?

Consoles are the poor man's gaming system!
They get the same game but with lo-res blurry tex, while PC gets the sharp hi-res tex!
Not anymore: NVIDIA/AMD have released plenty of 12/16GB cards, so the old 8GB standard does NOT apply anymore.
 
Why would they suddenly increase texture resolution when most cards have 10GB or less and the consoles have ~10GB to allocate to graphics?

All of the research I did into this led me to the conclusion that VRAM is much more tied to GPU speed these days, in the sense that you can push VRAM usage up through assets and effects, but at some point you run out of GPU grunt to cope. And that led me to a prediction that the consoles, while having ~10GB free for graphics-related purposes, would probably never use all of it. The APU's raw graphics speed just isn't that great, approximately equivalent to a 2060 Super, and games during the 2060 era were not really using memory above about 6GB on average, rarely 7GB (when measured properly).

And then, in another thread looking at a technical breakdown of ray tracing on the consoles, a deep dive comparing Watch Dogs Legion on PC vs console settings showed not only RT features and other graphical settings being vastly inferior on the consoles, but also that the consoles were not even using the high-res texture pack. I found that super interesting, because all the comments about how the consoles, with 16GB of memory, would lead the way in pushing massively high-res textures into the stratosphere of memory usage are so far not coming true.

My bet is that the bottleneck for high-res textures is screen resolution. I can't think of another good reason the console version wouldn't use the higher-res variants: they have the memory for it, and the higher-res textures don't have that much of an impact on the GPU, at least not compared to other things that could use VRAM. I think it's probably because variable resolution on the consoles has AAA games like WDL running at low screen resolutions most of the time, so the higher-res textures simply won't show a benefit. I use the example of Rainbow Six Siege and its UHD texture pack all the time: look at the pack on the Steam store and its user reviews, and the ratio is barely positive, with all the negative reviews basically saying they can't see a difference. Then you look at the Steam hardware survey and only 2.44% of players are actually using 4K.

The consoles just can't do 4K in graphically demanding games, so they use the lower-res textures. The PC can with decent hardware, and so sometimes gets these higher-res texture packs, and so far they all run in 10GB just fine. The prediction that we'll see even higher-res texture use in the near future doesn't make sense to me. I think we'd probably need a leap to 8K gaming before that would return any kind of benefit, and that's not going to happen for a very long time, decades for even minor adoption.
 
That's the point I'm making: 10GB just about gets you there on a 3080. Get more VRAM if you want to use hi-res tex and not have to lower game settings to lo-res tex. If you're fine with lo-res tex in upcoming games, 10GB is okay.
Every programmer I've talked to from different software houses says more large tex will be used going forward to provide more eye candy.

At the moment 10GB is fine; most cards are 8GB. At some point everything changes, and in time VRAM requirements will rise. Doom is not proof that you need 10GB, since only the highest settings may require 10GB. You've got to stick to the data. In future, current cards from both AMD and NVIDIA won't meet game requirements. Right now, a card that 100% does not have enough VRAM is the RTX 2060 6GB: in Doom Eternal at 1080p with RT on, you run out of VRAM and have to turn the texture pool down towards Low. With 10GB you can run Doom Eternal very well.

Doom Eternal, 4K, RT on, Ultra Nightmare quality, DLSS quality mode

 
Consoles are the poor man's gaming system!
They get the same game but with lo-res blurry tex, while PC gets the sharp hi-res tex!
Not anymore: NVIDIA/AMD have released plenty of 12/16GB cards, so the old 8GB standard does NOT apply anymore.

That's not what I asked.

If you're fine with lo-res tex in upcoming games, 10GB is okay.
Every programmer I've talked to from different software houses says more large tex will be used going forward to provide more eye candy.

Why would they suddenly increase texture resolution when most cards have 10GB or less and the consoles have ~10GB to allocate to graphics?
 
Try it yourself: get a game's code/resources from a friend/programmer and run it through a profiler to see what's taking up the most memory.

Well, I have looked at memory usage in depth in a load of modern games, specifically because of this thread. And as I've said, VRAM budgets for the most recent games like Doom Eternal show about a quarter of the memory being used for texture pools, verified through engine variables. That's also why budgets, especially for more open-world games, are more complex today than they've ever been: the engine doesn't keep a fixed set of textures in memory, it maintains a dynamic pool of memory for textures and streams textures in and out of that pool as you move through the game space.
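
As a rough sketch of that "dynamic pool with streaming" idea (deliberately simplified, not any specific engine's code; the budget and texture sizes are invented for the demo), a fixed-budget pool with least-recently-used eviction looks something like this:

```python
# Simplified sketch of a streaming texture pool with a fixed budget and
# least-recently-used eviction. Not taken from any real engine.
from collections import OrderedDict

class StreamingTexturePool:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()     # name -> size, oldest entry first

    def request(self, name, size_bytes):
        """Called when a material comes into view; streams it in if needed."""
        if name in self.resident:
            self.resident.move_to_end(name)        # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size_bytes > self.budget:
            evicted, _ = self.resident.popitem(last=False)
            print(f"evict {evicted}")
        self.resident[name] = size_bytes
        print(f"stream in {name}")

MiB = 1024 ** 2
pool = StreamingTexturePool(budget_bytes=64 * MiB)   # tiny budget just for the demo
for tex in ["rock_4k", "bark_4k", "road_4k", "rock_4k", "grass_4k"]:
    pool.request(tex, 22 * MiB)                      # ~BC7 4k texture with mips
```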

Yes, textures can take up a lot of memory, but most surfaces in a game do not benefit from textures that high in resolution. A 4096x4096 texture has roughly twice the pixel count of a 4K gaming monitor; the only way you can appreciate the added detail is if it literally fills the entire screen (say an FPS game where you're face to face with a wall using a 4096x4096 brick texture). If you move, and that same wall now covers a patch of your monitor that's, say, 1/4 of the size (a 1080p-sized patch at 4K), then textures above 2048x2048 won't show a benefit. And that's a worst-case scenario; it's extremely rare for a texture to fill the entire screen.

That's why we mipmap textures down to lower-resolution variants: to display those low-resolution variants on more distant objects or objects at oblique angles. That's why engines have LOD systems and texture streaming, and can throw out the really high-resolution textures once the objects that use them pass through the various LOD boundaries.

And that's IF you're using 4K, which almost no one does. The vast majority of gamers are at 1080p, for whom 2048x2048 textures are fine.

It's why, if you look at the UHD texture pack reviews here, there are loads of people saying it doesn't make a difference: https://store.steampowered.com/app/...iege__Ultra_HD_Texture_Pack/#app_reviews_hash
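
If it helps to make the "roughly one texel per pixel" argument concrete, here's a simplified, isotropic version of standard mip selection with made-up coverage numbers. It shows why a wall spanning a quarter of a 4K screen ends up sampling a 2048-class mip level rather than the full 4096x4096 one:

```python
# Simplified (isotropic) mip selection: pick the mip level whose texel density
# is roughly one texel per screen pixel. Coverage numbers are made up.
import math

def chosen_mip(texture_width, pixels_across):
    """Mip 0 is the full-resolution level; each mip halves width and height.
    pixels_across = how many screen pixels the textured surface spans (one axis)."""
    texels_per_pixel = texture_width / max(pixels_across, 1)
    return max(0, math.floor(math.log2(texels_per_pixel)))

TEX = 4096
cases = [("fills a 3840-wide 4K screen", 3840),
         ("quarter of the screen (1920 px wide)", 1920),
         ("distant object, ~100 px wide", 100)]
for label, covered in cases:
    mip = chosen_mip(TEX, covered)
    print(f"{label:38s} -> mip {mip} ({TEX >> mip}x{TEX >> mip} level sampled)")
```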

That's why we can't have better-looking games on PC; Nvidia is stopping the progress with their console-level VRAM.

But we do: the PC has always had extra high-res texture packs the consoles don't have, for the devs that can be bothered to put in the extra effort. FO4, R6, WDL, KCD. They all work inside 10GB just fine, again including WDL, which is on the next-gen consoles but doesn't use the texture pack. The biggest impediment to better games on the PC isn't memory, it's developers making a one-size-fits-all game and not putting additional effort into the PC. Some devs do, and the PC versions look way better.
 
Yes, textures can take up a lot of memory, but most surfaces in a game do not benefit from textures that high in resolution. A 4096x4096 texture has roughly twice the pixel count of a 4K gaming monitor; the only way you can appreciate the added detail is if it literally fills the entire screen (say an FPS game where you're face to face with a wall using a 4096x4096 brick texture). If you move, and that same wall now covers a patch of your monitor that's, say, 1/4 of the size (a 1080p-sized patch at 4K), then textures above 2048x2048 won't show a benefit. And that's a worst-case scenario; it's extremely rare for a texture to fill the entire screen.

That's why we mipmap textures down to lower-resolution variants: to display those low-resolution variants on more distant objects or objects at oblique angles. That's why engines have LOD systems and texture streaming, and can throw out the really high-resolution textures once the objects that use them pass through the various LOD boundaries.

And that's IF you're using 4K, which almost no one does. The vast majority of gamers are at 1080p, for whom 2048x2048 textures are fine.

It's why, if you look at the UHD texture pack reviews here, there are loads of people saying it doesn't make a difference: https://store.steampowered.com/app/...iege__Ultra_HD_Texture_Pack/#app_reviews_hash
Everything in bold is just wrong as a blanket statement and completely ignores any nuance that goes into texturing models.

You can appreciate 4k textures on a 1080p monitor without having to rub your nose in it. Depending on the model, 4k textures can give an improvement on a 1080p monitor.

The size of texture needed is a function of the model's complexity/size and the screen resolution.
 
Everything in bold is just wrong as a blanket statement and completely ignores any nuance that goes into texturing models.

You can appreciate 4k textures on a 1080p monitor without having to rub your nose in it. Depending on the model, 4k textures can give an improvement on a 1080p monitor.

The size of texture needed is a function of the model's complexity/size and the screen resolution.

You're right that the situation is more complicated, but the principle is sound. In the case you're talking about, you're texturing an object where most of the texture on the model is not visible from the point of view. A more accurate way of putting it would be that there's no point in having more than 1 texel per pixel: if you've got 3840x2160 pixels, then you see no benefit from having more than that many texels visible on screen (exceptions for MSAA/SSAA, which in a sense take subpixel samples, but are almost never used today).

But the bottom line is that I'm talking about the absolute worst-case scenario, which rarely happens. In 3D spaces, objects appear smaller at greater distances, and because the volume of a sphere increases as a function of radius cubed, you end up with vastly more objects at a distance on average than close by, assuming a roughly uniform distribution. Imagine you're in open-world Fallout 4 and you look out into the game world: almost all of the objects are tiny in your field of view and use the low-resolution mipmaps of their textures. Even objects quite close to you end up as a small proportion of the pixels on the screen. Which is why there's a limit to the texture resolution that results in improved visuals.

It's why a large percentage of gamers report no visual improvement from high-res texture packs, and why they're typically labelled or advertised as "UHD" or "4k".
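
Two quick sanity-check numbers behind that reasoning, using my own arithmetic and the uniform-distribution assumption stated above:

```python
# Two back-of-the-envelope numbers behind the argument above.

# 1) At ~1 texel per pixel, a 3840x2160 display can only show about
#    8.3 million texels' worth of detail in any one frame.
visible_texel_budget = 3840 * 2160
print(f"max useful texels on screen per 4K frame: {visible_texel_budget:,}")

# 2) If objects are spread roughly uniformly through a spherical view volume,
#    the fraction closer than half the view distance scales with radius cubed.
near_fraction = 0.5 ** 3
print(f"objects within half the view distance: {near_fraction:.1%}")      # 12.5%
print(f"objects in the far half of the volume: {1 - near_fraction:.1%}")  # 87.5%
```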
 
That's not what I asked.



Why would they suddenly increase texture resolution when most cards have 10GB or less and the consoles have ~10GB to allocate to graphics?

They aren't suddenly increasing res; it's happening with the release of the NVIDIA/AMD 12/16GB cards, and it's gradually being phased in as more people own those cards.
By the time, say, 10 months from now, a new game is released that requires more VRAM, 12/16GB cards will be in lots of people's hands, but not right now.
Don't expect to wait too long for these games to be released, though.
 
Well, I have looked at memory usage in depth in a load of modern games, specifically because of this thread. And as I've said, VRAM budgets for the most recent games like Doom Eternal show about a quarter of the memory being used for texture pools, verified through engine variables. That's also why budgets, especially for more open-world games, are more complex today than they've ever been: the engine doesn't keep a fixed set of textures in memory, it maintains a dynamic pool of memory for textures and streams textures in and out of that pool as you move through the game space.

Yes, textures can take up a lot of memory, but most surfaces in a game do not benefit from textures that high in resolution. A 4096x4096 texture has roughly twice the pixel count of a 4K gaming monitor; the only way you can appreciate the added detail is if it literally fills the entire screen (say an FPS game where you're face to face with a wall using a 4096x4096 brick texture). If you move, and that same wall now covers a patch of your monitor that's, say, 1/4 of the size (a 1080p-sized patch at 4K), then textures above 2048x2048 won't show a benefit. And that's a worst-case scenario; it's extremely rare for a texture to fill the entire screen.

That's why we mipmap textures down to lower-resolution variants: to display those low-resolution variants on more distant objects or objects at oblique angles. That's why engines have LOD systems and texture streaming, and can throw out the really high-resolution textures once the objects that use them pass through the various LOD boundaries.

And that's IF you're using 4K, which almost no one does. The vast majority of gamers are at 1080p, for whom 2048x2048 textures are fine.

It's why, if you look at the UHD texture pack reviews here, there are loads of people saying it doesn't make a difference: https://store.steampowered.com/app/...iege__Ultra_HD_Texture_Pack/#app_reviews_hash



But we do: the PC has always had extra high-res texture packs the consoles don't have, for the devs that can be bothered to put in the extra effort. FO4, R6, WDL, KCD. They all work inside 10GB just fine, again including WDL, which is on the next-gen consoles but doesn't use the texture pack. The biggest impediment to better games on the PC isn't memory, it's developers making a one-size-fits-all game and not putting additional effort into the PC. Some devs do, and the PC versions look way better.

If you throw out a large tex when it's out of LOD range, then it's going to be SLOW to get it back into VRAM over the PCIe bus when you need it again.
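
To put a rough number on "slow", here's a back-of-the-envelope transfer-time estimate. The ~12 GB/s effective PCIe 3.0 x16 figure and the ~21 MiB compressed 4k texture size are my own assumptions, not measurements:

```python
# Rough estimate of re-uploading one evicted texture over the PCIe bus.
# Assumptions: PCIe 3.0 x16 at ~12 GB/s effective, BC7 4096x4096 + mips ~= 21 MiB.

MiB = 1024 ** 2
texture_bytes = 21 * MiB            # compressed 4k texture with mip chain
bus_bytes_per_s = 12e9              # effective throughput, below the ~15.8 GB/s theoretical peak

transfer_ms = texture_bytes / bus_bytes_per_s * 1000
frame_ms = 1000 / 60                # 60 fps frame budget

print(f"~{transfer_ms:.2f} ms per texture")
print(f"~{transfer_ms / frame_ms:.0%} of a 60 fps frame budget")
```

Whether that counts as "slow" mostly depends on how many textures have to come back in during the same frame.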
 
Each to their own. We all have preferences. 1440p could be considered the sweet spot still. But as long as I am getting 40-60fps (depending on the game) then I would rather take the image quality. 120fps is nice and all, but not at the sacrifice of superior image quality for me.

My thoughts exactly. 1440p on a bigger screen is not sharp enough for me, and I would much rather have 4K at lower frame rates.
 