RTX 4070 12GB, is it worth it?

What do developers need more VRAM for?

Typically MLID, who has no idea what he's talking about, tries to dumb it down and gets it completely wrong; the UE5 dev politely explains it over and over again and MLID keeps trying to look clever... :rolleyes:
 
He goes into detail on why texture sizes are growing:

It's quite clear the visual fidelity of landscape models, and the increased detail of character models, is going to increase VRAM usage, especially with UE5. The consoles, thanks to their fast SSDs, can leverage texture streaming more efficiently than a lot of PCs.

This is why the RTX 4060/RTX 4060 Ti/RX 7600 XT, if they only come with 8GB of VRAM, need to be closer to £300. A year or two from now, especially if we get console refreshes in the same period, things might not bode well for "premium" 8GB VRAM dGPUs.

At least in my case, my RTX 3060 Ti is getting on for two years old now, so it's had some decent usage. But buying a £300+ 8GB VRAM dGPU in 2023 doesn't seem like a great idea IMHO.
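As a back-of-envelope illustration of why higher texture detail chews through VRAM so quickly, here is a rough sketch using my own assumed figures (BC7-style block compression at about 1 byte per texel, uncompressed RGBA8 at 4 bytes, and roughly a third extra for mip chains):

```python
# Rough texture VRAM estimate. Assumptions (mine, illustrative only):
# BC7-style compression ~1 byte/texel, uncompressed RGBA8 = 4 bytes/texel,
# and a full mip chain adds roughly a third on top of the base level.

MIP_OVERHEAD = 4 / 3

def texture_mb(width, height, bytes_per_texel, mips=True):
    """Approximate VRAM footprint of one texture, in megabytes."""
    size = width * height * bytes_per_texel
    if mips:
        size *= MIP_OVERHEAD
    return size / (1024 ** 2)

for res in (2048, 4096, 8192):
    compressed = texture_mb(res, res, 1)   # block-compressed
    raw = texture_mb(res, res, 4)          # uncompressed RGBA8
    print(f"{res}x{res}: ~{compressed:.0f} MB compressed, ~{raw:.0f} MB raw")

# With ~6GB of an 8GB card left for textures after buffers and render targets:
budget_mb = 6 * 1024
print(f"~{budget_mb / texture_mb(4096, 4096, 1):.0f} unique 4K textures fit in ~6GB")
```

Double the texel density across a game's material set and that budget evaporates, which is roughly the point being made above.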
 
MLID has no business doing what he does, but people eat up his ****. Look at his work experience on LinkedIn; he has no business talking about PC hardware, game development or software.
 
I wouldn't be playing that game on those settings on a mid-range card.
Agreed. In The Last of Us at 1440p Ultra settings, in that chart only the 3080 Ti 12GB and above can maintain 60fps in the 1% lows, and if I had any card below the 6750 XT (I have an RX 6600, a nice card, but I'm realistic about what it is) I would not play at these settings.
 
The issue is Nvidia is still trying to sell £300+ 8GB cards in 2023 (and it could be that the RX 7600 XT from AMD is also 8GB). Considering that many people keep their dGPUs for at least a few years, I am not sure what the point of these dGPUs is. If we do get a PS5 refresh and an Xbox refresh between late 2023 and late 2024, these cards will look like rubbish with newer games.
 
Yeah, I saw that :)

If you want games that are ever prettier and more complex, they will require more resources. We have been stuck on 8GB for far too long, and maybe game devs are just sticking their fingers up at PC gamers because consoles, ###### Game Consoles, don't have this problem! So let's just make what we want to make, and if you're stuck on an Nvidia GPU that's your bad-choice problem. I honestly think that is their attitude now.

The R9 390 was $329 in 2015. Again... 2015, 8 years ago: a ~$300 GPU with 8GB, in 2015.

Consoles, ##### Game Consoles, are 16GB, and they use at least 12GB for the game.

GTX 1070: 8GB, 2016.
RTX 2070: 8GB, 2018.
RTX 3070: 8GB, 2020.

That's just pure planned obsolescence.

We talk a lot about both AMD and Nvidia not being your friend, and I hold to that, but at least AMD don't flog you expensive GPUs that they know will start to struggle to run the latest games, even at 1080p, just 2 to 3 years later. AMD would never get away with that, and what narks me is Nvidia know they will, because they are the ones with an army of white knights.

8GB is dead. We should have been on a 16GB RTX 2070 by now.
 
MLID has no business doing what he does, but people eat up his ****. Look at his work experience on LinkedIn; he has no business talking about PC hardware, game development or software.

Yeah, he doesn't; he's one of those people who thinks he knows but in fact knows nothing.
 
My main concern is what happens within the next 18 months, let alone a few years!
 
Oh, it's going to get glorious... Games are going to get ever more beautiful, and Reddit is going to get ever more packed with crying RTX 3070 and 3080 owners...
------------

Intel said "you only need 4 cores" and the white knights parroted "you only need 4 cores".

Then 8 cores became mainstream, and soon after games exploded in complexity. They got better because they had more resources to work with.
 
It would be ironic if the RTX 3060 12GB ended up lasting longer than the RTX 4060 8GB?! :cry:
 
The issue is Nvidia is still trying to sell £300+ 8GB cards in 2023 (and it could be that the RX 7600 XT from AMD is also 8GB). Considering that many people keep their dGPUs for at least a few years, I am not sure what the point of these dGPUs is. If we do get a PS5 refresh and an Xbox refresh between late 2023 and late 2024, these cards will look like rubbish with newer games.
I wonder if this could also be an issue with the upcoming 8GB cards, given the low bus width on the 4060/4060 Ti/7600 XT and the reliance on cache to make up the shortfall. As time passes and game demands grow, not only is the VRAM a weakness, but the memory bandwidth/cache could also get overwhelmed and cripple performance.
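To put rough numbers on that bus-width point, here's a quick sketch of how much cache hit rate a narrow bus needs just to break even with an older, wider one. The figures are my own illustrative assumptions, not confirmed specs for any of these cards:

```python
# Break-even cache hit rate for a narrow-bus card vs an older wide-bus card.
# All figures are illustrative assumptions, not confirmed specs.

def raw_gbs(bus_bits, gbps):
    """Peak VRAM bandwidth in GB/s: bus width (bits) / 8 * per-pin rate (Gbps)."""
    return bus_bits / 8 * gbps

old_wide = raw_gbs(256, 14)     # e.g. a 256-bit card at 14 Gbps -> 448 GB/s
new_narrow = raw_gbs(128, 18)   # e.g. a 128-bit card at 18 Gbps -> 288 GB/s

# If a large cache absorbs hits, only misses touch VRAM, so the narrow card
# matches the wide one when miss_rate <= new_narrow / old_wide.
break_even_hit_rate = 1 - new_narrow / old_wide

print(f"raw bandwidth: {new_narrow:.0f} GB/s vs {old_wide:.0f} GB/s")
print(f"cache hit rate needed just to break even: {break_even_hit_rate:.0%}")
```

On those assumed numbers the cache has to keep hitting around a third of the time just to stand still; anything that blows past the cache leaves the narrow bus fully exposed.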
 
Yeah, looks like they won't age well. Such a poor generation the lower down the stack we go.

And the higher-end ones are poorly priced.
 
8GB is dead. We should have been on a 16GB RTX 2070 by now.
Yeah, but I wonder how much of this is because Nvidia insists on using GDDR6X for desktop graphics cards instead of lower-spec, cheaper VRAM (like AMD does).

I think either Nvidia doesn't want to admit that any advantage from 6X is small (clearly seen when comparing the RTX 3070 and 3070 Ti), or they believe it's a good selling point (in terms of marketing RTX cards).

Consoles use 16GB of GDDR6 at 'just' 14 Gbps with no issues there (although they do use >192-bit memory buses).

As far as I know, Nvidia has only released one card with 12GB of GDDR6, which was the RTX 3060 12GB version.

I'd take 16GB of GDDR6 over 12/8GB of GDDR6X any day. 6X also has the problems of running significantly hotter and using more power, no doubt having an impact on the overall card design.
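For reference, bandwidth is just bus width times per-pin data rate, so the GDDR6 vs GDDR6X gap is easy to put numbers on. The specs below are from memory and approximate, so treat them as a sketch rather than a datasheet:

```python
# Memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# Specs are quoted from memory and approximate.

configs = {
    "RTX 3070 (GDDR6, 256-bit @ 14 Gbps)":      (256, 14),
    "RTX 3070 Ti (GDDR6X, 256-bit @ 19 Gbps)":  (256, 19),
    "Console (GDDR6, 256-bit @ 14 Gbps)":       (256, 14),
    "RTX 3060 12GB (GDDR6, 192-bit @ 15 Gbps)": (192, 15),
}

for name, (bus_bits, gbps) in configs.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```

Which is the point above: on those assumed figures the 6X card only gains around a third more bandwidth for the extra heat and power, while capacity is what actually runs out first.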
 
Personally I'd say 6X is required for higher-end cards - regular GDDR6 on higher-end stuff is going to rely heavily on cache performance, and sooner or later that will be found wanting, and it requires more developer resources to make sure it covers real-world use cases.

Consoles tend to rely heavily on caches in a way the PC does not, and games/engines built for console tend to be optimised around that, because it is always going to be that hardware configuration - something a PC game/engine can't rely on.

nVidia drivers also tend to be relatively bandwidth-heavy, in both VRAM and system RAM, especially where they've done optimisations in software, like with DX11.
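As a rough illustration of the 'found wanting' point, here's a toy model of how effective bandwidth decays once a frame's hot data outgrows the cache. The hit-rate formula (cache size over working set) and all the figures are naive assumptions of mine, not how any real GPU behaves:

```python
# Toy model: effective bandwidth vs working-set size for a cache-reliant card.
# Naive assumption: hit_rate ~= cache_size / working_set once the working set
# exceeds the cache. Figures are illustrative, not real hardware behaviour.

def effective_gbs(vram_gbs, cache_gbs, cache_mb, working_set_mb):
    hit_rate = min(1.0, cache_mb / working_set_mb)
    miss_rate = 1.0 - hit_rate
    if miss_rate == 0:
        return cache_gbs                             # everything served from cache
    return min(cache_gbs, vram_gbs / miss_rate)      # only misses reach VRAM

VRAM_GBS, CACHE_GBS, CACHE_MB = 288, 2000, 32        # assumed narrow-bus card

for working_set_mb in (48, 96, 192, 384):
    eff = effective_gbs(VRAM_GBS, CACHE_GBS, CACHE_MB, working_set_mb)
    print(f"{working_set_mb:4d} MB of hot data: ~{eff:.0f} GB/s effective")
```

The exact numbers don't matter; the shape does - once the per-frame working set is several times the cache, you're back to living on the raw bus.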
 
TLOU1, for whatever reason, instead of scanning your PC and finding out how much VRAM you have available, assigns a random number to it. The result? It supposedly requires 14GB of VRAM for 720p.

[Screenshot: 720p Ultra settings showing the reported VRAM requirement]
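For contrast, here's what "scan first, then budget" looks like in the abstract. This is purely a hypothetical sketch: query_dedicated_vram_mb is a made-up helper (on Windows you'd get the figure from DXGI adapter info), and none of this reflects what TLOU1 actually does internally:

```python
# Hypothetical sketch: budget the texture pool from detected VRAM rather than
# assuming a number. query_dedicated_vram_mb() is a made-up placeholder; a real
# implementation would use something platform-specific (e.g. DXGI on Windows).

def query_dedicated_vram_mb() -> int:
    raise NotImplementedError("platform-specific query goes here")

def texture_pool_budget_mb(reserved_mb: int = 2048) -> int:
    """Leave headroom for render targets, buffers and the OS, then give the
    rest to the texture pool instead of hard-coding a requirement."""
    try:
        total_mb = query_dedicated_vram_mb()
    except NotImplementedError:
        total_mb = 8192   # conservative fallback assumption, not a detected value
    return max(1024, total_mb - reserved_mb)

print(f"texture pool budget: ~{texture_pool_budget_mb()} MB")
```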
 