
10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Yes, Nvidia has a huge market share compared to AMD, but the 10GB 3080 should last for a bit, say a year. The developers are from major software houses!
They (the bosses) are pushing for more eye candy to draw more gamers to their titles when they show off their trailers, so the old 8GB VRAM standard has to go.

Of course they are going for more eye candy. We have ray tracing, NVMe streaming and cloud. That does not mean we need more than 10GB of VRAM; such technologies all reduce the amount of VRAM required for seamless play.
 



Get a card with the most VRAM you can afford so it lasts a bit longer.
 
So would you recommend a 3060 over a 3080?

Ha ha, the 3060 doesn't have the muscle to run a 4K screen very well! You could run the 3060 at 1080p with hi-res textures on; that would work, but a lot of color detail from the large textures is lost in 1080p mode.

Get a 3080 Ti, RX 6800, RX 6800 XT or 3090 (the 3090 should last a good few years; the industry is not going to be using up 24GB that fast).
Don't buy an 8GB card. A 3080 should last, say, one or two years, but expect to turn down hi-res textures as games move on.
 
The 3090 is double the price though; you could buy a 3080 and, with the price difference, buy a 4080 next year, which will likely be 40% faster than a 3090.

I wouldn't buy an 8GB card, but I'm fine with 10GB.
 

Yep, most likely correct, but a 4080 may be £3,000 or £5,000 as commodity inflation pushes prices up!
It's your choice: a 3090 now will last a good few years. You could wait, but you're looking at unknown 4080 prices, the new commodity upcycle is only getting started, and we need more semiconductor fabs, which will take a few years to build.
 
I doubt it, as Nvidia will have to compete with all the cheap used mining cards that will flood the market next year.
 
I'm assuming we are talking about DLSS 2.0, unless you want to lay out arguments as to why DLSS 1.0 is better than FSR.

There is only one comment on that list that states FSR is better than DLSS 2.0, and that is from some random nobody. But I guess I was technically wrong; you didn't make it up.
The rest of the comments are literally irrelevant.



So what? When did Nvidia become immune from being mocked, especially on a cash-grab product?

If you are reviewing a card, mocking it leaves you open to claims of bias; it's unprofessional, that's all. Not looking to hijack the thread over it.
 
Hey guys,
I think I have a good answer to how VRAM is allocated in a typical game.

Check this out:
http://katmai3.co.uk/1/gameprofile.mp4

Download the video, or watch it if your connection is fast enough.
This is me profiling a typical game world level live; about halfway through the video I use the profiler, then after that I take a memory snapshot, all while the game is running.
The memory snapshot shows what's in VRAM and what takes the most memory.
As you can see, the 2D texture maps (with mipmaps etc.) take a huge chunk of the VRAM, then meshes; the 179 shaders don't take much VRAM.

For people claiming they will turn down a few graphical effects to cut VRAM usage: it won't work! You have to turn off hi-res textures, so you go from 4K textures (or a 4K/2K mix) to none, which looks more blurry on a 4K monitor!

The caches in there using VRAM are not significant; they are dynamically resized, or made very small if a programmer wishes to almost totally fill VRAM.

NOTES
--------
I had to drop down to a 1080p screen to record this due to file size, so it looks blurry!
The level was created by an industry pro artist, so all the proper LODs, mipmaps and texture compression are in there. This was about five years ago, so today's games will have many more large textures.
It's mostly 2K (2048x2048) textures with a few 4K ones. I put in a few 4K textures, used for the female soldier, and left them uncompressed so you can see their size in VRAM.
Some larger meshes were added by me, so triangle counts and VRAM use would be a bit lower for a game from five years ago, but not for today's games.
The 179 shaders cover a huge amount (ambient occlusion, anti-aliasing, particle effects, terrain shaders etc.) but only use 127 MB of VRAM, so people claiming they can turn down a few graphical effects to reduce VRAM usage won't get far.
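As a rough sanity check on the texture numbers, here's some back-of-the-envelope arithmetic (a sketch, not taken from the profile; the 4/3 mipmap overhead and the 4:1 block-compression ratio are standard rules of thumb):

```python
# Rough VRAM cost of a 2D texture, including the ~1/3 overhead of a full mip chain.
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True, compression_ratio=1.0):
    base = width * height * bytes_per_pixel / compression_ratio
    return base * 4 / 3 if mipmaps else base

mb = 1024 * 1024
# An uncompressed 4K RGBA texture (like the uncompressed soldier textures in the video):
print(round(texture_bytes(4096, 4096) / mb, 1))                        # 85.3 MB
# The same texture block-compressed at 4:1 (e.g. BC7):
print(round(texture_bytes(4096, 4096, compression_ratio=4) / mb, 1))   # 21.3 MB
# A typical compressed 2K texture:
print(round(texture_bytes(2048, 2048, compression_ratio=4) / mb, 1))   # 5.3 MB
```

A few hundred compressed 2K textures plus a handful of uncompressed 4K ones quickly reaches the multi-gigabyte usage shown in the snapshot.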


A 3080 or a 6900 XT won't have the firepower in future to run at 4K. Take Metro Exodus Enhanced Edition: a 6900 XT can't run that at 4K. Cyberpunk 2077: a 6900 XT can't run it at 4K with RT on.

[image: RX 6900 XT Cyberpunk 2077 ray tracing benchmark]


I am playing the start of Cyberpunk 2077 right now at 4K on a 2060 with DLSS 2.2.9 at Ultra Performance. I expect at some point I will have to lower the resolution, but its performance is better than a 6900 XT at 4K. The RTX 2060 has 6GB of VRAM.
What will happen when even more demands are made on current GPUs? The 6900 XT is not a 4K card: it has 16GB of VRAM but lacks the performance. If the future uses AMD tech and console-only type games then AMD could do 4K, but if future games are heavy on RT then the 6900 XT is not a 4K card.
Basically, Unreal Engine 5 games will run with Lumen and Nanite, and so will most PC GPUs. Top-end PC games are going to add RT on top. Very few cards are going to run future full-RT games at a high resolution like 4K.

If we talk about a future with Cyberpunk 2077-type games, then native 4K is not possible on a 6900 XT. If we talk about games like Dirt 5 or Godfall, then you can run at 4K on a 6900 XT. VRAM does not appear to be the main problem with RT here.

So what do AMD think?

The Radeon RX 6700 XT features 12GB of VRAM to give gamers "peace of mind when you click that Ultra preset", claims AMD's Scott Herkelman. Today's games use more memory, especially when you're looking for the best texture quality and visual fidelity.

List of games benchmarked with the RTX3080.
  • Marvel’s Avengers
  • Cyberpunk 2077
  • Watch Dogs Legion
  • Call of Duty Warzone
  • Crysis Remastered
  • Project CARS 3
  • Horizon Zero Dawn
  • Red Dead Redemption 2
  • Quantum Break
  • Borderlands 3
  • CONTROL
  • Assetto Corsa Competizione
  • Kingdom Come Deliverance
  • Ghost Recon Wildlands
  • Anthem
  • Metro Exodus
  • Assassin’s Creed Valhalla
  • Immortals Fenyx Rising
  • Godfall
Call of Duty Warzone (with Ray Tracing enabled) pushed VRAM usage to 9.3GB.
Watch Dogs Legion required 9GB at 4K/Ultra (without Ray Tracing).
Marvel's Avengers used 8.8GB during the scene in which our RTX 2080 Ti was using more than 10GB of VRAM.
Crysis Remastered used 8.7GB of VRAM.
Quantum Break used 6GB of VRAM. All the other games were using between 4 and 8.5GB of VRAM.
Is there a game that can't run at 4K maximum settings on a 3080?
https://www.dsogaming.com/articles/are-10gb-of-gddr6x-of-vram-enough-for-4k-ultra-gaming/
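The peak figures quoted from that article can be lined up against the 10GB budget in a couple of lines (values copied from the usage numbers above; the other titles peaked below 8.5GB):

```python
# Peak VRAM use (GB) reported for the heaviest titles in the dsogaming test.
reported = {
    "Call of Duty Warzone (RT on)": 9.3,
    "Watch Dogs Legion (4K/Ultra)": 9.0,
    "Marvel's Avengers": 8.8,
    "Crysis Remastered": 8.7,
    "Quantum Break": 6.0,
}
over_budget = [game for game, gb in reported.items() if gb > 10]
print(over_budget)             # [] -> none of the listed games exceed 10 GB
print(max(reported.values()))  # 9.3, the worst case tested
```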
 
Things may flip next gen, though, if AMD can nail the MCM design while Nvidia is stuck on monolithic. We could see AMD pulling a generation ahead in raster while also matching Nvidia in RT.

This is exactly why I wouldn't be committing to an expensive 3090/6900 XT right now: they could end up looking very weak a year from now, and while the 3080 only has 10GB of VRAM, at least it's cheap (if you got one at MSRP) and will likely only lose a couple of hundred quid in resale value when next gen arrives, while the 3090/6900 XT may be closer to £1,000 in losses.
 

It's all down to how games are developed. Unreal Engine 5 does a lot in software: Lumen is basically a software GI replacement for hardware RT, and fast streaming I/O is built into Unreal Engine 5.

https://www.techspot.com/news/85495-ps5-ssd-fast-epic-had-rewrite-parts-unreal.html

“The ability to stream in content at extreme speeds enables developers to create denser and more detailed environments, changing how we think about streaming content. It’s so impactful that we’ve rewritten our core I/O subsystems for Unreal Engine with the PlayStation 5 in mind,” he added.

Both AMD and Nvidia have upscaling tech in Unreal Engine 5. AMD can use temporal upscaling and Nvidia DLSS. This is about upscaling a 1080p image to 4K. There are rumours on websites that an AMD 6800 XT is 5-12% faster than the 3080 in the demo. https://youtu.be/C99VwDGyLg0

Unreal Engine 5 uses lots of very high-resolution textures; Quixel textures are stated to be as big as 8K.

Really, a future game on Unreal Engine 5 would be a good place to start asking the question about VRAM size. We know the game is likely to be fine because of the amounts of VRAM available on consoles, but in the PC space, once you add in very high-quality Quixel textures, how do VRAM size and the streaming abilities of features like RTX IO fit into the calculus?

Will performance in software RT (I know it's not quite RT) using Lumen GI be the most used system? As it looks at the moment, that benefits AMD. Or will the PC space use more hardware RT effects, likely to the benefit of Nvidia? As we know Quixel textures are used, would streaming via something like RTX IO really help reduce the VRAM needed? Or will RTX IO only help with loading new areas and loading in general, but not with reducing VRAM requirements? With Unreal Engine 5, FSR is not really an issue (because of its poor image quality); temporal upscaling is used, and the demo uses this method for both Nvidia and AMD. How will a future DLSS version affect this outcome? Will it reduce the 5-12% higher performance of an AMD 6800 XT over an RTX 3080 seen in the demo? Will Nvidia pull ahead if hardware RT is used to improve quality? Will Nvidia cards not have enough VRAM for quality settings above console? Will DLSS mean that Nvidia produces a better 1080p-to-4K upscaled image, and thus better quality than AMD?

Reality is more complicated than just one number. Raster performance means little when games are RT-based; it's what future games use that matters. If it's more Cyberpunk 2077 and Metro Exodus Enhanced Edition, then raster performance is pointless. If it's more like the Unreal Engine 5 demo, then things could go the other way: Dirt 5 and Godfall, for example.

Yes, Nvidia has a better balance of performance between raster and RT, but if games are optimised for performance on AMD cards then the extra RT performance is worthless. If Unreal Engine 5 games use RT hardware to improve quality above Lumen, then Nvidia could well offer better image quality than AMD.
 
Yep I got a very good example.
http://katmai3.co.uk/1/gameprofile.mp4


What game is this? It looks like some weird Unity asset-flip game you'd find on the Steam store for two pounds. The visuals are awful; there are obviously no advanced rendering effects being used at all. There are shadows on some objects, but it's almost completely uniformly lit. What settings are being used here in terms of lighting, filtering, anti-aliasing, global illumination, dynamic lighting, etc.?

Way back near the start of the 8/10GB threads, someone posted results of memory used in (I think) Doom Eternal, based on a tool that can hook the game's exe and inspect the entire rendering pipeline, what textures are in memory and stuff like that. I downloaded and tested it back when checking the Resident Evil memory-usage claims, but I must have uninstalled it and I can't for the life of me remember what it's called. I want to grab it again and run through some games to make a more detailed texture-memory comparison, because something doesn't add up here: many of these AAA games are getting away with relatively small texture pools relative to total memory usage, and I want to test whether that holds when setting these via engine vars.

If someone can remember the name then PM or @ me, please.
 
Let me switch tack. Do you have any examples to back up the claim that textures are most of what is in VRAM?
The Order: 1886, an early PS4 game:

This leaves about 4.5-5 GB available for developers. The Order: 1886 had the following memory budget.

  • 2 GB texture budget shared between environments, characters, and props
  • Of this, 600 MB or a little more than 25% is dedicated to Character Textures
  • 700 MB level geometry
  • 700 MB animation
  • 250 MB Global Textures (FX, UI, light maps. Characters even had light map data)
  • 128 MB Sound
That comes to a grand total of a 3.8 GB memory budget. It's a pretty large budget, and one that fits within the PS4's available RAM.

Breaking Down The Order: 1886, by Daniel Rose (GameTextures, Medium)
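The budget above adds up, and it also puts a number on the texture share (a sketch using the article's figures, taking 2 GB as 2048 MB; note the 600 MB of character textures is part of the 2 GB texture budget, not extra):

```python
# The Order: 1886 memory budget from the article (MB).
budget = {
    "textures (env/chars/props)": 2048,
    "level geometry": 700,
    "animation": 700,
    "global textures (FX, UI, light maps)": 250,
    "sound": 128,
}
total = sum(budget.values())
texture_mb = (budget["textures (env/chars/props)"]
              + budget["global textures (FX, UI, light maps)"])
print(total)                         # 3826 MB, close to the article's ~3.8 GB
print(round(texture_mb / total, 2))  # ~0.6 -> textures are roughly 60% of the budget
```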

Textures have always been known to be the biggest user of VRAM. What else would it be? Frame buffers? Geometry? Sound?

The last one is a joke.
 

The single largest use of VRAM, by category of visual effect, is probably textures in a lot of cases. But that doesn't mean they are the majority of VRAM usage: many other graphics effects, each using a bit, can add up to a lot.

You can tinker with which settings add to VRAM usage by looking at games that give VRAM metrics in their graphics menus: FC5, Doom Eternal, RDR2, GTAV, RE3, CoD Cold War and others. I'll run through RDR2 quickly, as it's the only example I have installed. This starts with settings at the absolute lowest the game will allow, then goes through them one by one, turning each to max and noting the ones that increase the video memory number.

Initial usage 1539 MB
Change Screen resolution to 4k -> 2778 MB
Change Triple Buffering on -> 2816 MB
Change Texture Quality to Ultra -> 3891 MB
Change Global Illumination quality to Ultra -> 3884 MB
Change Shadow quality to Ultra -> 4282 MB
Change Far Shadow Quality to Ultra -> 4284 MB
Change Screen Space Ambient Occlusion to Ultra -> 4334 MB
Change Reflection Quality to Ultra -> 4938 MB
Change Water Quality to High -> 5182 MB
Change Volumetrics Quality to Ultra -> 5333 MB
Change TAA to High -> 5340 MB
Change Unlocked Volumetric Raymarch Resolution On -> 5363 MB
Change Particle Lighting Quality to Ultra -> 5381 MB
Change Full Resolution Screen Space Ambient Occlusion On -> 5427 MB
Change Water Physics quality to Max -> 5547 MB
Change Reflection MSAA to x8 -> 6416 MB

So we start with 1539 MB and end with 6416 MB, an increase of 4877 MB. The Ultra textures account for 1075 MB of the increase; the biggest single increase was the screen resolution. We do have to factor in that some of the initial 1539 MB will also be texture data, but some will also be from the other settings, which bottom out at low rather than off entirely. This is what I mean when I say textures are about 1/4 of the VRAM usage; that's a pretty reasonable estimate in this case.

Just for transparency and accuracy, I skipped two settings: MSAA and resolution scale. Both cause insane leaps in the amount of VRAM required when combined with high resolutions like 4K, because their impact is multiplied by the number of pixels. You can actually get usage up to 39,048 MB with these maxed, but performance would be trash, so I left them out for obvious reasons.
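If anyone wants to poke at these numbers, the per-setting contributions fall straight out of the cumulative readings (a sketch using the measurements above; menu estimates are noisy, so GI actually reads slightly lower after the texture bump):

```python
# Cumulative VRAM readings (MB) from the RDR2 menu, in the order settings were raised.
steps = [
    ("baseline (all lowest)", 1539),
    ("4K resolution", 2778),
    ("triple buffering", 2816),
    ("ultra textures", 3891),
    ("ultra global illumination", 3884),
    ("ultra shadows", 4282),
    ("ultra far shadows", 4284),
    ("ultra SSAO", 4334),
    ("ultra reflections", 4938),
    ("high water", 5182),
    ("ultra volumetrics", 5333),
    ("high TAA", 5340),
    ("unlocked volumetric raymarch", 5363),
    ("ultra particle lighting", 5381),
    ("full-res SSAO", 5427),
    ("max water physics", 5547),
    ("8x reflection MSAA", 6416),
]
# Delta each setting adds over the previous reading.
deltas = {name: now - prev for (_, prev), (name, now) in zip(steps, steps[1:])}
biggest = max(deltas, key=deltas.get)
print(biggest, deltas[biggest])      # 4K resolution 1239
print(deltas["ultra textures"])      # 1075
print(round(deltas["ultra textures"] / (6416 - 1539), 2))  # 0.22 of the total increase
```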

*edit*
Regarding The Order, I don't know how to verify the accuracy of those claims. They do mention in the article that whatever RAM was left was probably used for other tasks within the game... that's awfully vague. Presumably we just don't have the real usages here; we just have budgets you need to stay inside.
 
@PrincessFrosty
Your results would depend on what section of the game you were in.
Resolution can also change other internal settings; e.g. low at 1080p may not be the same as low at 4K.
A 4K image with a bit depth of 24 is about 23 MB. Even with multiple frame buffers you would have a hard time reaching the value you have there. :|
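The 23 MB figure checks out (3840x2160 at 3 bytes per pixel):

```python
# Size of a single 4K frame buffer at 24-bit color (3 bytes per pixel).
width, height, bytes_per_pixel = 3840, 2160, 3
frame_mb = width * height * bytes_per_pixel / (1024 * 1024)
print(round(frame_mb, 1))      # 23.7 MB per buffer
print(round(frame_mb * 3, 1))  # ~71.2 MB even triple-buffered
```

So resolution's large VRAM jump must mostly come from resolution-dependent render targets (G-buffer, shadow, reflection and post-process buffers), not the swap chain alone.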

I think 1/4 usage is probably on the low end, and not the standard for the majority of games.
 

You can get any value of VRAM usage you like, really, because games are designed to live within hardware limits. Most GPUs are going to be 6-8GB, so for now all games will live within those limits; the new consoles have around 10GB. Given these limits, a 10GB card will have no issues running new or future games for some time. Given that upscaling is a thing, 1080p to 4K is the most likely arrangement, and most games will likely be fine for textures, if I can upscale from 720p to 4K in Cyberpunk 2077 on a 6GB card with all the effects turned on and RT set to medium. It really does come down to game design, the features used, the internal resolution and the final upscaled resolution.

All this thread will profit you is the conclusion that more is better, because people are simply programmed to accept that, and that will be the conclusion of the ignorant. You can't avoid it: more must be better.

You would have to check every game in development and in that poll to find the target VRAM size for the highest settings of each game. It can be the case that the highest settings are not meant for current hardware. You will always, 100% of the time, find that 10GB is enough for the game.

Basically, you will all be head-banging this forever because you don't have the information to answer even basic questions.

The truth is 10GB will be enough, because that is the limit on current hardware and it's a design requirement for the game to function. The question itself is born of pure ignorance, will be argued in ignorance, and will die in ignorance.
 

This is all wrapped up in a dynamic LOD system, which they talk about in the Unreal Engine 5 tech preview here https://youtu.be/d1ZnM7CH-v4?t=410 at 6:50, where assets are loaded/unloaded as necessary.

Nanite uses a dynamic LOD system that aggressively adjusts the quality of assets based on your distance from them. I was learning about this recently; there's a good breakdown of how cluster culling works, with an awesome visual illustration, here: https://youtu.be/P65cADzsP8Q?t=98 at 1:38.

We also know that for the PS5's super-fast custom drive they actually overhauled the engine's I/O because it was old and limited, so that'll be something they look to push on PC through DirectStorage, to make streaming that content faster and more seamless. The faster you can do it, the tighter you can be with the LODs.
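To illustrate the basic idea (a toy sketch, not how Nanite actually works; Nanite makes this decision per cluster of triangles rather than per object): halving detail each time the camera distance doubles looks like this:

```python
# Toy distance-based LOD selection: each LOD level halves the triangle count,
# and the level rises by one each time the viewing distance doubles.
import math

def lod_level(distance, base_distance=10.0, max_lod=8):
    """0 = full detail; each level halves the triangle count."""
    if distance <= base_distance:
        return 0
    return min(max_lod, int(math.log2(distance / base_distance)))

def triangles(distance, full_detail=1_000_000):
    return full_detail >> lod_level(distance)

for d in (5, 20, 80, 640):
    print(d, lod_level(d), triangles(d))
# 5   -> LOD 0, 1,000,000 triangles
# 80  -> LOD 3,   125,000 triangles
# 640 -> LOD 6,    15,625 triangles
```

The point is that distant assets cost almost nothing in geometry budget, which is why streaming speed (how fast you can swap detail in as the camera approaches) matters more than keeping everything resident.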
 