It's an unoptimised mess, though; I wouldn't put too much value on one or two games.
What do developers need more VRam for?
Death of 8GB GPUs, RX 7600 XT VRAM, AI Taking Jobs, Ray Tracing | UE5 Developer | Broken Silicon 199
An Unreal Engine 5 dev joins to discuss what’s happening with VRAM usage in modern games.[SPON: Get 6% OFF Custom PCs & GPUs at https://www.silverknightpcs.c...youtu.be
Typically, MLID, who has no idea what he's talking about, tries to dumb it down and gets it completely wrong; the UE5 dev politely explains it over and over again while MLID keeps trying to look clever.
Agreed. As in The Last of Us at 1440p Ultra settings, in the chart only the 3080 Ti 12GB and above can maintain 60fps in the 1% lows, and if I had any card below the 6750 XT (I have an RX 6600, nice card, but I'm realistic about what the card is) I would not play at these settings. I wouldn't be playing that game on those settings on a mid-range card.
He goes into detail why texture sizes are growing:
It's quite clear the visual fidelity of landscape models and more detailed character models is going to increase VRAM usage, especially with UE5. The consoles, due to their fast SSDs, can leverage texture streaming more efficiently than a lot of PCs.
This is why the RTX 4060/RTX 4060 Ti/RX 7600 XT, if they only come with 8GB of VRAM, need to be closer to £300. A year or two from now, especially if we get console refreshes in the same period, things might not bode well for "premium" 8GB VRAM dGPUs.
At least in my case, my RTX 3060 Ti is getting on for two years old now, so at least it's had some decent usage. But buying a £300+ 8GB VRAM dGPU in 2023 seems like a bad idea IMHO.
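To put rough numbers on why growing texture detail eats VRAM so quickly, here's some napkin math. It assumes BC7 block compression (1 byte per texel) and a full mip chain (roughly 1.33x the base level); those are simplifying assumptions, not engine-accurate figures.

```python
def texture_mib(width, height, bytes_per_texel=1.0, mips=True):
    """Approximate VRAM footprint of one texture in MiB.

    bytes_per_texel=1.0 assumes BC7 block compression
    (16 bytes per 4x4 block = 1 byte per texel).
    """
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base  # full mip chain adds ~1/3
    return total / (1024 * 1024)

# Doubling texture resolution quadruples the memory cost:
for size in (1024, 2048, 4096):
    print(f"{size}x{size} BC7: ~{texture_mib(size, size):.0f} MiB")
```

With thousands of textures resident at once, each step up in base resolution multiplies the whole budget by four, which is why 8GB fills up fast.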
MLID has no business doing what he does, but people eat up his ****. Look at his work experience on LinkedIn; he has no business talking about PC hardware, game development or software.
Yeah, I saw that.
If you want games that are ever prettier and more complex, they will require more resources. We have been stuck on 8GB for far too long, and maybe game devs are just sticking their fingers up at PC gamers, because consoles, ###### game consoles, don't have this problem! So let's just make what we want to make, and if you're stuck on an Nvidia GPU, that's your bad-choice problem. I honestly think that is their attitude now.
The R9 390 was $329 in 2015. Again: 2015, 8 years ago, a $300 GPU with 8GB.
Consoles, ##### game consoles, are 16GB; they use at least 12 for the game.
GTX 1070: 8GB, 2016.
RTX 2070: 8GB, 2018.
RTX 3070: 8GB, 2020.
That's just pure planned obsolescence.
We talk a lot about how both AMD and Nvidia are not your friend, and I hold to that. But at least AMD don't flog you expensive GPUs that they know will start to struggle to run the latest games, even at 1080p, just 2 to 3 years later. AMD would never get away with that, and what narks me is that Nvidia know they will, because they are the ones with an army of white knights.
8GB is dead. We should have been on a 16GB RTX 2070 by now.
My main concern is within the next 18 months, let alone a few years!
Oh, it's going to get glorious... Games are going to get ever more beautiful, and Reddit is going to get ever more packed with crying RTX 3070 and 3080 owners...
------------
Intel said "you only need 4 cores" the white knights parroted "you only need 4 cores"
Then 8 cores became mainstream, and soon after, games exploded in complexity. They got better because they had more resources to work with.
It would be ironic if the RTX 3060 12GB ended up lasting longer than the RTX 4060 8GB?!
I'm not sure if this could be an issue with the upcoming 8GB cards, given the low bus width on the 4060/4060 Ti/7600 XT and the cache being used to make up the shortfall. As time passes and game demands grow, not only is the VRAM a weakness, but the memory bandwidth/cache could also get overwhelmed and cripple performance.
The issue is that Nvidia is still trying to sell £300+ 8GB cards in 2023 (and it could be that the RX 7600 XT from AMD is also 8GB). Considering that many people keep their dGPUs for at least a few years, I'm not sure what the point of these dGPUs is. If we do get a PS5 refresh and an Xbox refresh between late 2023 and late 2024, these cards will look like rubbish with newer games.
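The napkin math on bus width shows why this bites: peak memory bandwidth is just the bus width in bytes times the per-pin data rate, so halving the bus halves the ceiling no matter how fast the chips are. A quick sketch (the specific data rates are illustrative assumptions for the comparison):

```python
def bandwidth_gbs(bus_bits, gbps):
    """Peak memory bandwidth in GB/s: bus width (bytes) x per-pin rate (Gbps)."""
    return bus_bits / 8 * gbps

# A 128-bit bus at 18 Gbps vs a 256-bit bus at a slower 14 Gbps:
narrow = bandwidth_gbs(128, 18)  # 4060-class narrow bus
wide = bandwidth_gbs(256, 14)    # e.g. an RTX 3070-style 256-bit GDDR6 setup
print(f"128-bit @ 18 Gbps: {narrow:.0f} GB/s")  # 288 GB/s
print(f"256-bit @ 14 Gbps: {wide:.0f} GB/s")    # 448 GB/s
```

The on-die cache hides some of that gap while the working set fits, but once game demands outgrow the cache, the raw figure is what's left.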
Yeah, but I wonder how much of this is because Nvidia insists on using GDDR6X for desktop graphics cards instead of lower-spec, cheaper VRAM (like AMD does).
The difference is pretty small, comparing GDDR6 at up to 20 Gbps (for the RX 7900 XT/XTX) with 21 Gbps for GDDR6X (e.g. the RTX 4090). Personally, I'd say 6X is required for higher-end cards.
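Working that out with the same peak-bandwidth formula (bus width in bytes times per-pin rate), and assuming the 384-bit buses both flagship cards use, the gap is only about 5%:

```python
def bandwidth_gbs(bus_bits, gbps):
    """Peak memory bandwidth in GB/s: bus width (bytes) x per-pin rate (Gbps)."""
    return bus_bits / 8 * gbps

gddr6 = bandwidth_gbs(384, 20)   # RX 7900 XTX-style: 384-bit GDDR6 @ 20 Gbps
gddr6x = bandwidth_gbs(384, 21)  # RTX 4090-style: 384-bit GDDR6X @ 21 Gbps
print(f"GDDR6:  {gddr6:.0f} GB/s")   # 960 GB/s
print(f"GDDR6X: {gddr6x:.0f} GB/s")  # 1008 GB/s
print(f"Delta:  {(gddr6x / gddr6 - 1) * 100:.0f}%")
```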