10GB VRAM enough for the 3080? Discuss...

@ Both of you
I understand that they allocate memory in chunks; however, in the FS2020 example it's an extra 32%, which seems far too excessive not to be intentional.
Are they not able to dynamically change the memory allocation, or do they have to allocate it all up front? I'm assuming it's dynamic (up to a certain point).

Is there a performance hit if this extra memory cannot be allocated? For example with FS2020: let's say you have two otherwise identical GPUs, one with 10GB of VRAM (enough for the 9.5GB that is actually used) and the other with 13GB of VRAM (enough to hold the extra 3GB of bloat as well). Would these two GPUs have identical performance?

Do the engine engineers and the game devs not talk to each other? It seems to me that it would be in everyone's best interest to ensure optimum performance of games (ignoring time and budget constraints). This sounds like an inefficient way to collaborate. I really hope that this isn't standard practice in the industry.

How do you know this? Based on what?
Define "playable": 60fps or 30fps?
While it will take more processing power to handle the extra data, how do you know that the frame-rate reduction from the increased demand on the GPU is greater than the frame-rate reduction from running out of VRAM?

The memory allocation is purely down to the developer's implementation of the game engine; they're free to request more or less whatever they want from the GPU and driver stack, and then apply their own complex internal rules about how that memory is used. Yes, you can dynamically scale memory allocation up and down based on demand, but it's better that this isn't done very frequently, mostly because other apps/processes can also use VRAM: if you're designing a game engine that intends to use most of what is available, you don't want to faff about increasing and decreasing the allocated memory, as there's a risk that any memory you release gets assigned somewhere else. It also forces the GPU to do a bunch of memory management, which is best done infrequently.
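
To make the "ask once, don't keep resizing" idea concrete, here's a minimal sketch of how an engine can query the VRAM budget the OS is currently willing to give it, using DXGI's QueryVideoMemoryInfo on Windows (Vulkan's VK_EXT_memory_budget exposes the same idea). The 80% pool target at the end is purely an illustrative assumption, not any engine's real rule:

```cpp
// Minimal sketch: ask the driver/OS for the current VRAM budget so the engine
// can size its resource pool once, instead of repeatedly growing and shrinking
// it. Windows/DXGI only; error handling trimmed and adapter 0 assumed to be
// the discrete GPU for brevity.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // "Budget" is what the OS thinks this process can use right now; other
    // apps sharing the GPU shrink it. An engine would typically size its
    // texture/streaming pool against this once per level load, not per frame.
    const double gib = 1024.0 * 1024.0 * 1024.0;
    std::printf("VRAM budget: %.2f GiB, currently used: %.2f GiB\n",
                info.Budget / gib, info.CurrentUsage / gib);

    // Illustrative heuristic only (not any real engine's rule): target ~80%
    // of the budget for the streaming pool and keep the rest as headroom.
    std::printf("Example streaming-pool target: %.2f GiB\n",
                0.8 * info.Budget / gib);
    return 0;
}
```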

In the case of, say, two hypothetical identical GPUs, one with 10GB and one with 13GB, you'd find the performance is the same as long as the "bloat" is basically memory that has been allocated but not filled. Two things can happen in that circumstance: the developer's memory-allocation rules can simply notice there's less available VRAM on the 10GB card and be more conservative, i.e. not over-provision as much. Or, because we now have unified memory, the engine can attempt to allocate that much memory anyway and let the GPU itself handle swapping assets from disk to VRAM. As long as that over-provisioned memory isn't filled with useful stuff, there's no performance penalty there either.
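
A hypothetical version of that allocation rule, just to show why the 10GB and 13GB cards end up behaving the same while the "bloat" stays unfilled. The 9.5 GiB working set and ~32% over-provision factor are taken from the FS2020 numbers quoted earlier in the thread; the rule itself is invented for illustration:

```cpp
// Hypothetical sketch of the allocation rule described above: the engine asks
// for more than it actually fills ("over-provisioning") and trims the request
// when the card's budget is smaller. The 9.5 GiB working set and ~32% extra
// mirror the FS2020 numbers quoted earlier in the thread; the rule itself is
// invented for illustration.
#include <algorithm>
#include <cstdio>

double plannedAllocationGiB(double budgetGiB, double workingSetGiB,
                            double overProvisionFactor) {
    const double wanted = workingSetGiB * overProvisionFactor;
    // Never ask for more than the budget, never for less than what is actually used.
    return std::clamp(wanted, workingSetGiB, budgetGiB);
}

int main() {
    const double used   = 9.5;   // GiB of assets actually resident
    const double factor = 1.32;  // ~32% over-provision

    for (double budget : {10.0, 13.0}) {
        const double alloc = plannedAllocationGiB(budget, used, factor);
        const bool paging  = used > budget;  // only then would performance differ
        std::printf("%4.1f GiB card: allocate %5.2f GiB, paging needed: %s\n",
                    budget, alloc, paging ? "yes" : "no");
    }
    return 0;
}
```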

In most cases game engines are so sophisticated that a dedicated team of engineers works on them, and the engine is often just licensed to game development studios, who are mostly artists using the tools it provides. Memory management is largely abstracted away from them; it's not as efficient in terms of raw performance, but it means games are cheaper, since not everyone needs to write their own engine. So, for example, an engine like Unreal Engine handles memory allocation and then provides mapping tools for the devs to make use of it: a mapper can add transition/loading zones in which a new area loads in, with all its assets, while the old one is dumped out of memory. The devs don't deal with that process themselves, it's just too complex; they get a set of tools. The hardware, drivers, API, engine, game and design are all so complex these days that you need massive amounts of abstraction to let them work together seamlessly, and that has performance implications.
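
This is not Unreal's actual API, just a toy sketch of the transition/loading-zone pattern described above: the next area's assets are loaded on a background thread while the game keeps running, and the old area is released once the swap happens. The zone names and sizes are invented:

```cpp
// Toy sketch of the "transition zone" pattern described above: when the player
// hits a trigger, the next area's assets load on a background thread while the
// current one keeps rendering, and the old area is released afterwards. This
// is NOT Unreal's API; names and sizes are invented.
#include <cstdio>
#include <future>
#include <memory>
#include <string>
#include <vector>

struct ZoneAssets {
    std::string name;
    std::vector<char> data;  // stand-in for textures/meshes resident in memory
};

// Pretend disk -> memory load (a real engine would also upload to VRAM here).
std::shared_ptr<ZoneAssets> loadZone(std::string name) {
    auto z = std::make_shared<ZoneAssets>();
    z->name = std::move(name);
    z->data.resize(64 * 1024 * 1024);  // pretend 64 MB of assets
    return z;
}

int main() {
    std::shared_ptr<ZoneAssets> current = loadZone("zone_A");
    std::printf("Playing in %s\n", current->name.c_str());

    // Player crosses the loading zone: start fetching the next area
    // asynchronously so frames keep rendering from the current one.
    auto pending = std::async(std::launch::async, loadZone, std::string("zone_B"));

    // ... game keeps running here while the load happens ...

    current = pending.get();  // swap: old zone_A assets are dumped from memory
    std::printf("Now playing in %s\n", current->name.c_str());
    return 0;
}
```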

"Playable" for me is really an average of 45fps minimum, that's pretty much an acceptable floor for performance and I don't think that's terribly out of the norm, most people like to have 60fps average if they can, there's almost always input/control issues in games below 45fps. 30fps on the PC is very much a no go for almost all gamers, that's way outside the expected frame rate for most gamers, I don't feel that's very controversial. These games I'm talking about being unplable are running at like 17fps or 24fps.

You can test the frame-rate reduction due to VRAM limitations by using other cards that have a lot more VRAM to begin with but similar GPU grunt, like a Titan RTX, which also gets slayed in FS2020 at 4K Ultra. It's also worth looking at frame averages: running out of VRAM causes stuttering as assets are swapped between disk and VRAM, with short pauses where one single frame may take a second or so to load, but the FPS average after that will be close to normal. It's not an across-the-board drop in average frame rate; it's more a case of stuttering starting to happen as you move through the game world and assets are needed in different places. So you can normally tell which is happening.
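
A small sketch of that kind of frame-time analysis: VRAM thrashing shows up as a handful of very long frames rather than a uniformly lower average, so counting outliers against a median baseline separates the two cases. The frame times and the 4x-median threshold below are made up for illustration:

```cpp
// Sketch of that frame-time analysis: running out of VRAM tends to show up as
// a few very long frames (stutter while assets are swapped in), not as a
// uniformly lower average. Counting outlier frames separates the two cases.
// The sample frame times and the 4x-median threshold are made up.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Frame times in ms: mostly ~16 ms (about 60 fps) with two big hitches of
    // the sort a disk-to-VRAM swap would cause.
    std::vector<double> frameMs = {16, 17, 16, 15, 16, 250, 16, 17, 16, 16,
                                   15, 16, 17, 180, 16, 16, 15, 17, 16, 16};

    double total = 0.0;
    for (double t : frameMs) total += t;
    const double avgFps = 1000.0 * frameMs.size() / total;

    std::vector<double> sorted = frameMs;
    std::sort(sorted.begin(), sorted.end());
    const double median = sorted[sorted.size() / 2];  // robust "normal" frame time

    int spikes = 0;
    for (double t : frameMs)
        if (t > 4.0 * median) ++spikes;  // arbitrary stutter threshold

    std::printf("Average FPS: %.1f, median frame: %.1f ms, stutter spikes: %d\n",
                avgFps, median, spikes);
    // A purely GPU-bound game would instead show uniformly higher frame times
    // with few or no spikes.
    return 0;
}
```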
 
Has anyone tested them directly?

I know the 3080 can run into trouble with 4K texture packs in some games, like Doom Eternal or Skyrim mods.
 
Not happening. The point is to go and test a game using real 4K texture packs and see if 10GB is enough.
“real 4k texture packs” - you mean a 3rd party mod? Never used one and never will so not interested. My 3080 will be fine at 4K for my gaming use for as long as I have it. It’s a really impressive GPU - absolutely love it.
 
+1 Well said.

But be ready: as soon as a game comes out that cannot run at maximum texture settings at 4K, Grim and Poneros will go into overdrive mode spreading the word and saying "I told you so!" :p
 
Has anyone tested them directly?

I know the 3080 can run into trouble with 4K texture packs in some games, like Doom Eternal or Skyrim mods.

It does? Do you have any sources for that? It's entirely what I'd expect, though: vanilla settings are fine for now, but install some hi-res texture mods and you can expect problems. VRAM is the big limitation with those, and it's why I'll be steering well clear until the 20GB versions drop later this year or next. You can expect people invested in these cards to defend them to the nth degree, though.
 
I still don't see the big deal about these texture packs ...

For one, high-res textures look out of place on low-poly models. But also, if you're properly implementing higher-res textures you'd make sure there's some LOD going on, so a texture far away that doesn't need to be so large isn't kept at full size and consuming all the VRAM, making it necessary to buy a much more expensive GPU than you'd need in a saner scenario. I don't think texture pack mods can achieve that?
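
For what it's worth, the "a texture far away doesn't need to be so large" point is exactly what mip-based texture streaming does: pick a mip level from how much of the screen the texture actually covers, so only that level needs to be resident in VRAM. A rough sketch of the standard log2 footprint idea, with made-up numbers:

```cpp
// Rough sketch of the LOD point above: a texture that only covers a small
// patch of the screen doesn't need its full-resolution mip resident in VRAM.
// Streaming systems pick a mip level from the on-screen footprint, roughly
// 0.5 * log2(texture texels / screen pixels covered). Numbers are illustrative.
#include <algorithm>
#include <cmath>
#include <cstdio>

int desiredMipLevel(int textureSize, double screenPixelsCovered) {
    // Each mip level halves the resolution in both axes (quarters the texels).
    const double texels = double(textureSize) * textureSize;
    const double ratio  = texels / std::max(screenPixelsCovered, 1.0);
    const int level     = int(std::floor(0.5 * std::log2(std::max(ratio, 1.0))));
    const int maxLevel  = int(std::log2(double(textureSize)));  // down to 1x1
    return std::clamp(level, 0, maxLevel);
}

int main() {
    const int tex = 4096;  // a "4K" texture: 4096 x 4096
    for (double covered : {4096.0 * 4096.0, 512.0 * 512.0, 64.0 * 64.0}) {
        std::printf("covers ~%.0f px -> stream mip level %d\n",
                    covered, desiredMipLevel(tex, covered));
    }
    return 0;
}
```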

Besides that though, when do you actually see all these textures? Are you going around taking and looking at screenshots, or playing the game? Most of the time when playing a game I hardly have time to "appreciate" the textures (though yes, you do easily notice when the textures are really low-res).
 
PGCH or however you spell it did a bunch of 8K review testing.

The 3090 is up to 300% faster than the 3080 at 8K; the cause of this gap is that 10GB of VRAM sucks.
 
More tosh being spread by someone without a clue. The 8K has already been proven not to even be 8K as claimed by Nvidia (as Gamers Nexus said, "Nvidia lied"), and the 3080 is not built to run 8K even if you could afford the 30k to buy an 8K panel... Get a grip on reality!


The amount of stupid still being peddled about 10GB of 6X not being enough baffles me. OK, if it's not enough for you, wait for the 20GB version, pay extra £££ for it and get the same performance; meanwhile us suckers will happily game away on anything up to 4K on this 3080 for its product life, and perhaps buy in for 8K when the 5080 comes around...



The real question is why Nvidia even used 6X, when they could have used normal-speed VRAM with 16GB to achieve slightly better performance at a much lower cost AND way less power draw!
Was using 6X just a marketing tool to try and trick/hype people into staying away from AMD and buying into the 3080??
 
As for using GDDR6X, I'm not sure, but I'd say its big numbers would have wowed peeps. I can't tell if it helps with performance yet, as my 3080 isn't being pushed by any game I have at 4K except...

One outlier game which I have found much smoother and with fewer graphical glitches is Star Citizen (I know, don't laugh!). I wonder if the incredible bandwidth of GDDR6X just helps smooth over such glitches. Pure guess, but SC has always been a bugger to run, i.e. you need an SSD and 32GB of RAM to get anything like nice performance. As time goes on it might start showing how GDDR6X helps in more scenarios, but at the moment so few people have a 3080/3090 that it's tricky to get a good view.
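
The bandwidth side of the GDDR6X question is easy to put rough numbers on: bandwidth = per-pin data rate × bus width / 8. The 3080's 19 Gbps / 320-bit figures are its published spec; the 16 Gbps / 256-bit GDDR6 card used for comparison is just an assumed example, not a specific product:

```cpp
// Back-of-the-envelope check on the bandwidth side of the GDDR6X question:
// bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
// The 3080's 19 Gbps / 320-bit figures are its published spec; the 16 Gbps /
// 256-bit GDDR6 card is an assumed comparison, not a specific product.
#include <cstdio>

double bandwidthGBps(double gbpsPerPin, int busWidthBits) {
    return gbpsPerPin * busWidthBits / 8.0;
}

int main() {
    std::printf("RTX 3080 (GDDR6X, 19 Gbps, 320-bit): %.0f GB/s\n",
                bandwidthGBps(19.0, 320));   // ~760 GB/s
    std::printf("Hypothetical GDDR6 card (16 Gbps, 256-bit): %.0f GB/s\n",
                bandwidthGBps(16.0, 256));   // ~512 GB/s
    return 0;
}
```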
 
24 GB is sometimes better than 10 GB
While the GeForce RTX 3080 review assumed that 10 GB is always sufficient compared to 8 GB across the benchmark suite, the GeForce RTX 3090 shows in individual cases that this is not true for the percentile FPS.
The gain in Ghost Recon Breakpoint is surprisingly large: there the GeForce RTX 3090 is a clear 22 percent ahead of the GeForce RTX 3080. Evidently this game needs more than 10 GB of memory in Ultra HD with maximum details, but it is the only title in the suite that reacts this way.

https://www.computerbase.de/2020-09...si-test/2/#abschnitt_benchmarks_in_3840__2160
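
For anyone unsure what "percentile FPS" means in reviews like that one, it's usually something like the 1% low: sort the frame times, take the slowest ~1%, and report the frame rate they correspond to, so occasional stutters pull the figure down even when the plain average looks fine. A rough sketch with made-up frame times:

```cpp
// Rough sketch of a "percentile FPS" / 1% low calculation: take the slowest
// ~1% of frame times and report the frame rate they correspond to, so a few
// stutters drag the figure down even when the plain average looks fine.
// Frame times here are made up.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // 200 frames at ~16 ms with a couple of 100+ ms hitches mixed in.
    std::vector<double> frameMs(200, 16.0);
    frameMs[50]  = 120.0;
    frameMs[150] = 140.0;

    double total = 0.0;
    for (double t : frameMs) total += t;
    const double avgFps = 1000.0 * frameMs.size() / total;

    std::vector<double> sorted = frameMs;
    std::sort(sorted.begin(), sorted.end());

    // 99th-percentile frame time -> "1% low" FPS.
    const std::size_t idx = std::min(sorted.size() - 1,
                                     static_cast<std::size_t>(0.99 * sorted.size()));
    const double onePercentLowFps = 1000.0 / sorted[idx];

    std::printf("Average FPS: %.1f, 1%% low FPS: %.1f\n", avgFps, onePercentLowFps);
    return 0;
}
```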
 
Told you both Grim5 and Poneros would be on a crusade, scouring the web for any tiny bit of info to support their viewpoint that 10GB is not enough and spreading the word. They proved me right within hours :D


More tosh being spread by someone without a clue
+1 :D
 
Over double the cost of my 3080 for 11% more performance?! The 3080 is still the 4K gaming king when cost is taken into account. In 2 years I'll sell my 3080 and put the money towards a 4080, which will blow the 3090 away, all for less than a 3090 costs today.
Exactly. But 10GB not enough!

Who gives a ****. What a pointless thing to go on a crusade about. Lol. First world problems :p
 
Poor 3080's minimum framerate 300% worse
 