10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
It is very simple... it obviously lacks the kind of shadow and lighting detail that people expect from modern games. There are only very basic shadows, I can't see any GI, there are no surfaces picking up colours from nearby objects, there's definitely no self shadowing on most of the objects like buildings, and there's obviously no SSAO or ambient occlusion of any kind. That's why it looks so flat and uniformly lit. I'd need to know the game and probably test it myself to see what's going on and what settings are really used. It's not a game I've ever seen before and it looks like some amateur thing; no offence to whoever made it, it's just not really representative of a modern, production-ready game. It doesn't have much to do with screen resolution either; the primary reason it looks like one of those asset-flip games is the total lack of any real, proper lighting and shadows.

Yes, you're right about the GI: it's enabled, but I didn't set any emissive materials, so no GI is showing!

2.jpg


It has soft shadows with PCSS. Are you talking about multiple light sources and coloured lights producing lots of shadows? There is one directional light for the sun; it's an outdoor day scene.

10.jpg


Yep, SSAO is missing. I took it out because I thought it was an ugly effect, but I put it back in to test your theory.

8.jpg


SSAO On

5.jpg


SSAO with more intensity

6.jpg


SSAO only view

7.jpg



However, there's no real change to VRAM allocation in the memory snapshot.

Without SSAO

9.jpg


With SSAO

4.jpg
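The lack of VRAM movement when toggling SSAO is plausible on paper: a screen-space AO pass only needs a few small render targets. Here's a rough sketch of that arithmetic; all the resolutions and texture formats below are assumptions for illustration, not values read from any particular engine.

```python
# Rough estimate of the extra VRAM a screen-space AO pass needs.
# Formats and resolutions are illustrative assumptions, not
# measurements from any specific engine.

def buffer_mb(width, height, bytes_per_pixel):
    """Size of one screen buffer in MB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

w, h = 2560, 1440
ao_result  = buffer_mb(w, h, 1)            # single-channel R8 AO term
ao_halfres = buffer_mb(w // 2, h // 2, 1)  # half-res intermediate for the blur
depth_copy = buffer_mb(w, h, 4)            # 32-bit depth input (often shared anyway)

total = ao_result + ao_halfres + depth_copy
print(f"~{total:.1f} MB")  # on the order of tens of MB, easily lost in measurement noise
```

Under these assumptions the whole pass costs under 20 MB, which would be invisible next to gigabytes of assets in a memory snapshot.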


Some other effects used are listed below (note some are turned off because I don't like ugly effects).

1.jpg


Do you really think using more light sources/shadows would cause a huge increase in VRAM usage?
 

I showed quite clearly with RDR2's settings menu that most graphical settings have some impact on VRAM usage. Some of them are quite small individually, but if you turn them all on it adds up to a big impact, far bigger than texture usage. And you're showing me screenshots and video of an unknown game with obviously bad lighting and bad graphical fidelity as a counterexample; you can see from the screenshots that it looks like an N64 game. It's just a Unity level where someone has gone to the Unity store, bought some assets and put them all into a map without any kind of optimization.

Anyone can do this. In fact, I've joked several times about making an asset-flip game, because it would let me make an absurd game that exceeds the memory budgets of AMD cards, and then people using single games as examples of why a certain amount of memory isn't enough would either have to concede the point or acknowledge there's some kind of quality control in the games we use to test things.

I could do the very same thing: flip 10x more assets than that into a single level, drive memory usage up to 20GB, and go "ta-da, here's a game that needs 20GB, so none of the 16GB AMD cards are enough, you need a 3090 with 24GB of VRAM". In fact you could make a game any way you wanted to prove any point you wanted. I could make a trillion-poly sphere in 3ds Max, export it with no texture, just a flat colour, import it into the game world 5,000 times, and show a mesh budget of 10GB and a texture budget of zero. But so what? Showing a free Unity project crammed full of store-bought assets means basically nothing. You can achieve any balance of numbers you like, but we're obviously talking about published games by professionals who balance assets to get the best possible fidelity out of a certain budget of resources.
 

I just tried out what you said about the lack of shadows/lighting: I added 5 spot lights and 5 point lights, so that's 11 lights including the original directional light.
All light sources were set with soft shadows and different colours.

Editor pic

15.jpg


Ingame pic

13.jpg


Ingame pic

14.jpg


As you can see, it's an ugly coloured mess with shadows all over the place.

Memory allocation snapshot

12.jpg


Render texture usage has increased a bit, about 100 MB more, but that's it. I'm not able to get more VRAM usage by adding lots of different real-time coloured lights and shadows.
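A ~100 MB bump in render textures is actually in the right ballpark for the shadow maps of that light setup. Here is a back-of-envelope sketch; the per-light shadow-map resolutions and the 32-bit depth format are assumptions, not values read from the engine.

```python
# Back-of-envelope shadow-map memory for the setup described above:
# 1 directional light + 5 spot lights + 5 point lights, all casting
# soft shadows. Resolutions/formats are illustrative assumptions.

def mb(texels, bytes_per_texel=4):  # assume 32-bit depth per texel
    return texels * bytes_per_texel / (1024 ** 2)

directional = mb(2048 * 2048 * 4)      # 4 cascades at 2048x2048
spots       = 5 * mb(1024 * 1024)      # one 1024x1024 map per spot light
points      = 5 * mb(6 * 512 * 512)    # cube map: 6 faces of 512x512 per point light

total = directional + spots + points
print(f"~{total:.0f} MB")  # ~114 MB, same ballpark as the ~100 MB increase observed
```

So ten extra shadow-casting lights costing only on the order of 100 MB is consistent with how shadow atlases scale: the cost depends on map resolution, not on scene size.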

I got a grab from Windows Task Manager and GPU-Z; VRAM usage seems to tally with the memory snapshot.

18.jpg
 

Just about every object/model has an LOD and the textures are compressed with mipmaps; it's very well optimised and finely tuned.

16.jpg





17.jpg
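To put numbers on compressed textures with mipmaps: a full mip chain adds roughly a third on top of the base level, and block compression cuts the per-texel cost by about 4x versus raw RGBA8. The formats below are typical examples (4 bytes/texel uncompressed, 1 byte/texel for a BC7/DXT-class format), used purely for illustration.

```python
# Size of a 4096x4096 texture, uncompressed vs block compressed,
# with and without a full mip chain.

def texture_mb(size, bytes_per_texel, mipmaps=True):
    """Sum the sizes of all mip levels from `size` down to 1x1."""
    total, s = 0, size
    while s >= 1:
        total += s * s * bytes_per_texel
        if not mipmaps:
            break
        s //= 2
    return total / (1024 ** 2)

print(texture_mb(4096, 4, mipmaps=False))  # 64.0 MB: raw RGBA8, base level only
print(texture_mb(4096, 4))                 # ~85.3 MB with the full mip chain (+~1/3)
print(texture_mb(4096, 1))                 # ~21.3 MB block compressed, with mips
```

This is why a level full of compressed, mipmapped textures can still dominate VRAM: a few hundred unique 4K textures at ~21 MB each is multiple gigabytes on its own.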



Textures filling up VRAM aside, I'm not able to produce any significant changes in VRAM usage by removing/adding lots of effects like SSAO or extra shadows/light sources; it's just not affecting VRAM that much. Maybe supersampling AA would?
Yes, you're right that you can pack textures into VRAM to fill it up, but the special effects don't seem to be using more than a few megabytes as far as I can see.
I forgot to add that 90% of the level was designed by Michael O, who is a bit of a legend in game level design, so it's quite finely tuned. I hope he's not reading any of this, lol. Unoptimized tosh, lol!

Yeah, I get what you're saying: when you toggle options on/off or between high/ultra in the graphics menus, the VRAM meter goes up quite a bit.
Why don't you just set something up in Unity or Unreal Engine and see what results you get?
 
Yes, 10GB of VRAM is enough in 2021. You ain't getting more than that this year anyway...

Replace the GPU sometime next year if needed.

/end discussion ;)
 
Mipmaps get generated as standard by any engine, not just for the performance benefits of the LOD system but because texture filtering needs them to avoid rendering artefacts. You're not going to get significant shifts in memory usage by placing a few more lights. It's about the large number of graphical effects that go into modern rendering, each needing some additional memory. I've demonstrated this with a very nice looking game like RDR2: there's a very long list of options in the graphics menu that go into how good it looks, and I went through them all and showed that most of them have some additional memory overhead. As a counterexample you're showing low memory usage for non-texture effects, but you only need to look at the screenshots side by side; there's an extremely stark difference between something like RDR2 and whatever you're showing.

compare.jpg


The left is using things like tessellation on the ground and trees, screen-space reflections in the water/puddles, self shadowing on models, global illumination on most light sources, volumetric clouds/fog/explosions, screen-space ambient occlusion, fur rendering on animals and hair, parallax occlusion mapping, motion blur, reflection/mirror effects, anti-aliasing, soft shadows, dynamic lighting on light sources that can move including long time-of-day shadows, physics simulation on trees/grass/water, lighting from particles, etc.
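Each of those effects needs its own render targets or lookup structures. The sketch below tallies some plausible per-effect buffer sizes at 4K; every figure is a made-up but reasonable example to show how small allocations stack up, not a measurement of RDR2 or any other game.

```python
# Illustrative per-effect buffer costs at 4K (3840x2160). All numbers
# are hypothetical examples of typical render-target footprints.

W, H = 3840, 2160

def px_mb(bytes_per_pixel):
    return W * H * bytes_per_pixel / (1024 ** 2)

buffers = {
    "G-buffer (4 targets, mixed RGBA8/16)": 4 * px_mb(8),
    "depth + motion vectors":               px_mb(4) + px_mb(4),
    "SSR colour + hit buffer":              2 * px_mb(8),
    "SSAO + blur intermediates":            2 * px_mb(1),
    "volumetric froxel grid (160x90x64)":   160 * 90 * 64 * 16 / (1024 ** 2),
    "shadow cascades + local maps":         512.0,  # flat guess, resolution dependent
    "TAA history (2 frames)":               2 * px_mb(8),
}

total = sum(buffers.values())
print(f"~{total:.0f} MB before any textures or meshes")
```

Roughly a gigabyte of working buffers under these assumptions, before a single texture or mesh is loaded, and that grows further with resolution and with each extra effect enabled.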

In my post here, where I break down the additional memory usage for each of these, https://www.overclockers.co.uk/forums/posts/34927660 each item is relatively small, but add them all up when you run the game maxed out and they account for the vast majority of the memory in use. I even have the memory usage on screen in my screenshot above.

In fact, here's a comparison shot from the benchmark in roughly the same place with the lowest settings for everything except textures, which are still at Ultra, so you can see the real memory difference all those combined changes make: 6768 MB real usage for all Ultra, and 3222 MB for all low except Ultra textures. That actually demonstrates that the menu's memory estimate for the settings is pretty darn accurate.

20210703223451-1.jpg


That's a delta of 3546 MB. I'm not sure what you think is happening here. Is RDR2 some outlier? Or maybe you think they did a bad job of optimization at high settings and it's just wasting memory unnecessarily?

I think what's happening is something I come back to a lot: games did used to be like this for a long time. Games mostly used video memory to store assets for the levels; they'd cram that memory full and when they ran out, that was it. We wanted more memory because we wanted more assets and higher quality assets. Then around the era of the original Crysis (or thereabouts) we started seeing big open worlds with assets that stream in and out of memory, which let games blow past the memory limitations of video cards; if you zoned your levels well, your game map could have vastly more assets in it than fit into memory. Since then things have changed significantly: we've had huge advancements in all sorts of rendering effects, and most of them need their own buffers in memory. Textures and assets have still got better and use more memory, but their proportion relative to other effects has gone down, and that's happening at a faster pace now that we have RT, where BVH tables take up a substantial amount of memory. People have a lot of assumptions about how these things work that are holdovers from gaming 10+ years ago and just aren't true today. Which is kind of why you're seeing what you're seeing in your example: lots of models and textures, very sparse on other graphical effects, and visually it looks like a game from 10+ years ago.
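On the BVH point, the footprint scales with scene geometry rather than screen resolution, which is why it can be substantial in open worlds. A rough sketch, stressing that the per-triangle cost here is a commonly quoted ballpark and real driver-built acceleration structures vary a lot by vendor and build flags:

```python
# Rough BVH footprint estimate for ray tracing. The 64 bytes/triangle
# figure is an assumed ballpark covering nodes plus triangle data,
# not a measured value for any particular GPU or API.

def bvh_mb(triangles, bytes_per_tri=64):
    return triangles * bytes_per_tri / (1024 ** 2)

scene_triangles = 30_000_000  # hypothetical streamed-in open-world geometry
print(f"~{bvh_mb(scene_triangles) / 1024:.1f} GB for the acceleration structure alone")
```

Even at this rough level, tens of millions of resident triangles put the BVH in gigabyte territory, which is memory that simply did not exist as a category before hardware RT.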

This is why I've moved away from the conventional wisdom that future games will use all this memory for assets, and thus that the amount of memory you need is tied to what the games demand, towards a different paradigm where a lot of the memory is used for graphical effects and the amount of memory you need is tied to how powerful the GPU is. If your GPU is slow and can't run these effects at a decent frame rate, you turn them down or off and that frees up memory. What we should be asking is: is 10GB enough to service the memory needs of the GA102 GPU? The answer seems to be yes. By the time you've filled that 10GB of memory, like with say FS2020, which is one of the few games that gets close, the frame rate is in the toilet.

I appreciate that you've gone to some length to show metrics in whatever Unity game that is, and I take your point. But anyone can get free access to Unity and drop in store-bought assets, and you can do so to achieve any numbers you like. What I'm talking about is real commercial games using modern rendering techniques and leveraging all the optimization tricks to get the game looking as nice as possible for the least performance cost.
 
You probably have background stuff that uses VRAM.

Check them out.

Windows and other apps that use hardware acceleration can use up to 1.5-2 GB of VRAM if left unchecked.
 

The metrics on my OSD are per application, covering both the requested/allocated amount and the real amount actually in use. They ignore other usage.

The metrics I gave from RDR2's menu are just what the game uses. RDR2 actually specifies in the game menu both the total used and the breakdown of what the game is using vs what other apps are using (which is very cool of Rockstar), and I've always quoted purely what the game is using. So none of that is a problem with any of the numbers I've discussed.
 
Lots of mental gymnastics going on, but no examples of the 3080 running out of memory other than some amateur 3D stuff; meanwhile the 3080 is pretty much running every single commercial game on the market just fine.
 
You can just use the VRAM allocation tools in game benchmarks such as Tomb Raider; I think the Far Cry games have one, as do the Tom Clancy Ghost Recon games. Many of the settings in games tell you which ones will impact GPU utilization, particularly those that impact VRAM.
 
Why are you showing a denoiser video? It cleans up noise after rendering; it won't make in-game RT produce soft reflections or other quality effects.

A denoiser removes the need for high SPP: you get the same image but with far lower SPP. It's the whole reason real-time RT exists and why you can get great RT (path-traced) lighting in Metro Exodus Enhanced Edition with infinite bounces (with limits). It also reduces rendering time in offline renders by massive amounts. Nowadays a denoiser is a must. Current demos are better than that image; they are photorealistic already. The whole of real-time graphics is going AI as well.

This AI demo is future graphics. Note how much better it looks compared to the 3ds Max image above, and how close it looks to real life.


Two years old.


Current real-time computer graphics: Unreal Engine 5. The Unreal Engine 5 Real-Time Nanite/Lumen K-750 Soviet Moto Demo. Look how massively better this looks than the 3ds Max image, and how photorealistic it is.


https://youtu.be/xfpXN6mgfew
 
Denoisers allow you to use fewer samples than you would have needed without them. Saying they remove the need for high SPP is a vague statement that may not be correct in certain situations (what is a high sample count?). The same goes for your statement about low SPP.

If your sample count is too low you'll end up with a very poor quality image, literally rubbing-vaseline-on-the-screen quality. The more samples a denoiser has to work with, the better the results, up to a certain point.

Generally, denoisers are good for last-mile situations: 95% of the image is done and you need to remove the last fireflies in the remaining 5%. Sometimes getting rid of that last 5% can take longer than getting to 95% in the first place.
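The reason that last mile is so expensive is basic Monte Carlo statistics: noise falls off as 1/sqrt(samples), so halving the remaining error costs four times the samples. A minimal sketch with a toy pixel whose true value is 0.5 (a stand-in for a real render integral):

```python
import random

# Monte Carlo estimate of a toy pixel whose true value is 0.5, showing
# that RMS noise shrinks as 1/sqrt(spp): each 4x increase in samples
# roughly halves the error, which is why chasing the last fireflies by
# brute force is so expensive and why denoisers are used instead.

def pixel_error(spp, trials=2000, seed=1):
    """RMS error of a spp-sample mean estimator over many trials."""
    rng = random.Random(seed)
    sq_err = 0.0
    for _ in range(trials):
        estimate = sum(rng.random() for _ in range(spp)) / spp
        sq_err += (estimate - 0.5) ** 2
    return (sq_err / trials) ** 0.5

for spp in (4, 16, 64, 256):
    print(spp, round(pixel_error(spp), 4))
```

Under this model, going from "95% done" to "clean" by raw sampling alone means an order-of-magnitude jump in samples, which is exactly the gap a denoiser fills.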

P.S. "Infinite bounces (with limits)" is an oxymoron.
P.P.S. Just because something looks photoreal doesn't mean it is better, especially if that's not what the artist was trying to achieve in the first place.
P.P.P.S. Materials in CGI have advanced significantly over the past few years, so any attempts at photorealism from the past look dated compared to modern material setups.

Edit: if people are interested, I can do a comparison of different sample counts on a simple scene, so people can see how denoisers work with those different sample counts in an offline render engine.
 