I don't think there's anyone I've read who thinks it's impossible to go over 10GB of usage. That's a straw man of the far more reasonable positions that people have actually taken on this issue, which is that right now we're not using more than 10GB for anything and likely won't in the near future, because you become GPU-bottlenecked long before you're vRAM-bottlenecked.
Kind of like the straw man where people state that having more VRAM doesn't provide more performance, even though nobody has made that argument. But that didn't stop you or others from parroting it, now did it? I guess I shouldn't stoop to the level of others by exaggerating in my posts.
But let's address your actual post. As I remember, there were people pointing to current games and proclaiming that we wouldn't need any more, and this was before you started waving your banner about VRAM usage being hard tied to GPU horsepower. That is what I'm talking about.
vRAM of course is tied to performance.
Ironic that you start your paragraph with a straw man.
vRAM of course is tied to performance. The purpose of vRAM is to hold data that the GPU can do calculations on; that's 100% the purpose of having vRAM. If the GPU doesn't need to do work on something that is in vRAM, then it's pointless for that thing to be loaded into vRAM; it's just wasting space. If the GPU has to do calculations based on what is in vRAM, then it makes perfect sense that as you put more assets into vRAM you increase the workload on the GPU and the performance goes down.
Pointless waffle
That's intuitively the case for pretty much all gamers. If you've ever played a game before that shows you estimated vRAM usage in the video settings menu, you'll see any time you increase visual fidelity by turning up settings, the estimated vRAM usage also goes up, and performance goes down. To say the 2 aren't hard tied together is some weird fantasy world, it's just not true in the slightest.
I would describe that as a soft tie, not a hard tie, since both the change in VRAM usage and the total VRAM consumption for a given setting can differ significantly between games and game engines.
However, considering that my original statement was in reference to you stating that a 3080 cannot use more than 10GB without being bottlenecked (let me know if I have misconstrued your position), I consider this a tangent.
The reality today is that with the GPUs we have and the workload they can sustain, we cannot make use of more than 10GB. Any games that are close to using all 10GB of a 3080 are slideshows; we demonstrated this quite conclusively in the "is 10Gb enough" thread, which had hundreds of pages of examples. FS2020 uses about 9.5GB at peak (4K ultra) measuring real usage, and even on a 3090 runs at like 25fps at those settings; same for Watch Dogs Legion at 4K Ultra, where you can get real memory usage pretty close to 10GB but performance is like 5fps or something really low. Avengers at 4K ultra is the same thing: about 9GB of memory used but, I think, 17fps if memory serves.
As has been mentioned and ignored: textures have a small effect on performance compared to other settings but consume the most VRAM. The difference between texture settings can also be the most noticeable, depending on the game.
Most of the performance-intensive effects have a smaller effect on VRAM consumption compared to textures (from Nvidia's own specs guide that I posted).
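As a rough back-of-the-envelope illustration of why textures dominate the VRAM budget without costing much GPU time, here is a minimal sketch. The sizes and formats are my own illustrative assumptions, not measurements from any particular game:

```python
# Sketch: approximate VRAM footprint of a single texture.
# Illustrative assumptions only, not numbers from any real game.

def texture_vram_bytes(width, height, bytes_per_texel, mipmaps=True):
    """A full mip chain adds roughly 1/3 on top of the base level."""
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mipmaps else base

# 4K-class texture: uncompressed RGBA8 (4 bytes/texel) vs BC7 (1 byte/texel)
rgba8 = texture_vram_bytes(4096, 4096, 4)
bc7 = texture_vram_bytes(4096, 4096, 1)
print(f"RGBA8: {rgba8 / 2**20:.0f} MiB, BC7: {bc7 / 2**20:.0f} MiB")
# -> RGBA8: ~85 MiB, BC7: ~21 MiB. A few hundred textures like these fill
# multiple GB of VRAM, yet merely sampling them adds little GPU work
# compared to effects like shadows, volumetrics, or ray tracing.
```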
You seem hung up on 4K, except screen resolution does not have a significant direct effect on VRAM usage. There may be indirect effects, such as engines lowering settings when switching screen resolution, but that is on a game-by-game basis. It is therefore possible for a game to exceed 10GB of VRAM use at a lower resolution, where framerates may still exceed 60fps.
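The direct cost of resolution is just as easy to estimate. A hedged sketch, assuming a simplified deferred renderer with a made-up render-target count, shows why bumping the output resolution moves the needle by hundreds of MB, not GB:

```python
# Sketch: direct VRAM cost of render targets at different resolutions.
# The target count and bytes per pixel are illustrative assumptions.

def render_target_bytes(width, height, targets=6, bytes_per_pixel=8):
    # e.g. G-buffer layers + depth + HDR colour, ~8 bytes/pixel each
    return width * height * targets * bytes_per_pixel

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {render_target_bytes(w, h) / 2**20:.0f} MiB")
# -> 1440p: ~169 MiB, 4K: ~380 MiB. Going from 1440p to 4K adds a
# couple of hundred MB directly, small next to a multi-GB texture pool.
```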
If VRAM usage is hard tied to performance as you are arguing, why does Doom Eternal get nearly 200fps on a 3080 while using over 8GB of VRAM? Or is the game going to crater to sub-45fps once it hits the magical 10GB number?
This is a rhetorical question. I'm certain that there are other games using above 8GB of VRAM and exhibiting very good performance (maybe not as good as Doom's). It is clear that VRAM isn't hard tied to performance, or to word it another way: you cannot predict the performance of a game on a specific GPU by looking at VRAM usage (this is what hard tied means). Games using 10GB of VRAM, or close to it, aren't going to crater down to sub-45fps levels as you and others like to believe. It is going to depend on the game engine and the game.
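Nobody has to take either of us at our word here; you can log VRAM use yourself while a game runs and compare it against a framerate counter. A minimal sketch using NVIDIA's NVML bindings (pip install pynvml); note that this reports total allocation on the card, not the per-process "real usage" figure that monitoring tools distinguish:

```python
# Sketch: poll VRAM usage once per second while a game is running.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # NVML reports bytes allocated on the device, not bytes actively in use
        print(f"VRAM: {mem.used / 2**30:.2f} / {mem.total / 2**30:.0f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```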
Let's also not forget that these are computers, and people often multitask with their PCs. *Awaits the drivel about how XYZ closes everything down when gaming*
I'm getting dizzy, almost as if I'm going round in circles.
And more to the point, the console APUs aren't very good; they're kinda like mid-range video cards from last generation on PC. They cannot meaningfully make use of even 10GB of vRAM; if you loaded the games up with 10GB worth of assets, then the frame rate would crawl to a halt.
You should go work for Sony and MS and show them how to design a proper console. They clearly could have saved a ton of money by putting less RAM in, but someone there didn't think of it. If they had you on the team, this would not have happened.
Edit: Feel free to continue this without me. I've put more than enough time into the previous thread on this matter.