Not sure who they'd blame if they (AMD) just stopped making consumer cards. Nvidia, maybe... nah, you can't blame the company that chooses not to do something that would benefit the consumer rather than the profit-making corporation.
Arguing about putting an extra $30 of VRAM on a $700 card is absolutely hilarious: roughly 4-5% extra cost for what might mean a 20% longer lifespan. And that's 8GB extra, not 4GB, so per 4GB it works out to a little over 2%.
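A quick back-of-the-envelope check of those percentages, purely as a sketch using the figures assumed in the post above (~$30 BOM cost for the extra 8GB, $700 card price):

```python
# Rough VRAM cost maths. The $30 and $700 figures are the poster's
# assumptions, not confirmed BOM/MSRP numbers.
card_price = 700            # assumed card price in USD
extra_vram_cost = 30        # assumed cost of the extra 8GB of VRAM
extra_vram_gb = 8

cost_share = extra_vram_cost / card_price              # share of card price for 8GB
cost_share_per_4gb = cost_share * (4 / extra_vram_gb)  # equivalent share for 4GB

print(f"Extra 8GB: {cost_share:.1%} of the card price")   # ~4.3%
print(f"Per 4GB:   {cost_share_per_4gb:.1%}")             # ~2.1%
```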
I do love reading the comments as well; it's almost as funny as anti-social media sometimes.
If we are referring to now with 40xx and going forward? Completely agree.
But as noted before, it simply wasn't possible back when the 3080 and 3090 released.
If you want proof of what happens when there's a lack of competition and then very good competition, just look at the CPU market. Whether people like it or not, we are ultimately in this boat because there is little to no competition in the GPU market. Simply undercutting the competitor's products by £50-100 at launch, with only more VRAM and significantly better performance in 1-3 raster-based games such as COD, while being severely lacking in a number of other ways, is not the way to be competitive.
HUB reported on this as well; their conclusion was basically that VRAM needs to match the current generation of consoles, or you're going to run into a shortage, because that's the baseline for game devs. Anyone arguing anything else is just ******* in the wind out of sheer bloody obstinacy. Nvidia do it for reasons of planned obsolescence, amongst other things, to force people onto new generations of cards, especially at the lower end.
They've covered it in the past as well:
16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit
Don't disagree, although it's such a stupid thing to "only" point fingers at Nvidia for skimping on VRAM when games are obviously released in a **** state, only for them to get "fixed" and magically work well on lower-VRAM GPUs...
PC gamers always complain about how **** games run on PC, needing upscaling etc., yet think the only way forward is to simply brute-force past such issues. This is why DF are by far the best in the business: they point out these flaws, which exposes the issues with games and forces/helps devs to fix them, TLOU again being the perfect example. I dread to imagine what games would be like if it weren't for Alex and his in-depth insight into what causes the performance issues.
Essentially, yes, point fingers at Nvidia and give them **** for it now, but the main culprits are game devs. The fact that bang4buck still notices textures loading in on Hogwarts further proves that even with all the VRAM in the world, the game still exhibits issues usually associated with running out of VRAM; it's obviously not the GPU/VRAM, it's how the devs have designed the game.
"Planned obsolescence" is always a silly argument these days; it applies to basically every company and product out there (even AMD). Capitalism is a bitch.