RTX 4070 12GB, is it Worth it?

A VRAM problem, in that the general enthusiast knows that 8GB is not enough for decent resolutions in modern games and is holding the rest of the card back. Nvidia have proved this themselves, with the 3060 12GB outperforming the 3070 8GB in modern titles above 1080p.
I'm with you on this, but in general when this does happen the 3060 and 3070 can't get 60fps anyway, so why would anyone really play at these settings?
Increases should be expected as tech advances and time moves on. Price and VRAM have increased, but the memory bus has shrunk considerably. It's obvious you love Nvidia, but you can't keep blindly defending them.
Yep, and I agree with you, but it's a shame the same standard doesn't get applied to game devs when games get pushed out in a bad state.

On the subject of blindly defending, why are people not calling out the blind attacking?
 
Even Nvidia admits VRAM is a problem for them - look at the last line.

At least they cut the price, the 4070 is $599

Ooh, the cat's out of the bag! Not that the cat was ever still in the bag, really... it's woken up, yawned, stretched its legs, had its breakfast, and is out on cat patrol round the neighbourhood at this point...

:cry:
 
The XTX looks strong with Lumen, comparable to if not faster than the 4080 according to some reviews I've seen. If the uptake for UE5 is strong, it could be a good indicator going forward. Nvidia are obviously starting to push full path tracing, but even the 4090 can't manage it without all the tech behind DLSS 3.

I believe this benchmark uses software-accelerated RT, because if you enable Lumen with hardware RT the 4080 is ahead.
 
Lumen is less demanding and doesn't look as good as the RT we've seen in other games, and the 4080 is still faster in it anyway.
 
Unreal Engine 5 is not a magic hack; it's still extremely hardware-demanding, regardless of what Epic Games' marketing wants you to believe.
 
I think the RTX 4070 would be better if priced around the £600 mark, and Nvidia really should be doing 16GB cards at that price point now. Hopefully they wake up soon.
 
I think the RTX 4070 would be better if priced around the £600 mark, and Nvidia really should be doing 16GB cards at that price point now. Hopefully they wake up soon.

It's a cut-down card again, and of course NV are still cutting corners with VRAM, it's been their MO for years. If you're spending over £500 now the minimum VRAM should be 16GB, not 12GB. And this 4070 is a £500 card - at best.
 
I'm defending them by quoting... facts? Okay :D
Not really; historically, tech has always come down in price. Adjusted for inflation, Intel's 4004 would cost something like $400 today. If prices had increased as tech advanced, we'd need the GDP of a small country to buy a GPU/CPU.

If transistors still cost what they cost back then (more than 1 cent each afaik) a 4090 would cost something like $80 billion.
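To put rough numbers on those two back-of-the-envelope claims, here's a quick sketch in Python. The ~$60 launch price of the 4004, the ~7.5x inflation multiplier since 1971, and the ~76 billion transistor count of the 4090 are public ballpark figures; the early-1970s per-transistor cost is the big unknown, so a range of assumed values is shown:

```python
# Check 1: the 4004's launch price adjusted for inflation.
# ~$60 in 1971; cumulative US inflation 1971 -> today is roughly 7.5x.
price_4004_1971 = 60
inflation_multiplier = 7.5  # approximate CPI ratio, assumed
print(f"4004 in today's money: ~${price_4004_1971 * inflation_multiplier:,.0f}")  # ~$450

# Check 2: what would a 4090 (~76 billion transistors) cost at a fixed
# per-transistor price? The assumed cost per transistor is the real variable.
transistors_4090 = 76_000_000_000
for cost_per_transistor in (0.01, 0.10, 1.00):  # dollars each, assumed
    print(f"at ${cost_per_transistor:.2f}/transistor: "
          f"${transistors_4090 * cost_per_transistor:,.0f}")
# $0.01 -> $760 million; $0.10 -> $7.6 billion; $1.00 -> $76 billion,
# which is roughly the "$80 billion" figure quoted above.
```

The headline number swings by two orders of magnitude depending on the assumed per-transistor cost, which is why estimates like this vary so wildly; the underlying point that per-transistor cost has collapsed holds either way.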

Yeah, that's exactly my point. Bencher seemed to be justifying the price increase due to the extra VRAM.

@Murphy just to clarify I meant increases in VRAM, not cost :)
 
For a rumoured 25-30% faster than my current GPU I'd really not be bothered :s It would have to be more like £450 max for me to even consider it. EDIT: In fact, probably less than that.
 
Clearly it wouldn't, as a 4070 doesn't even match a 3090 Ti.
What do you mean, clearly it wouldn't? It's maths, man: divide the transistor count by the node's density and there you have it; the 4070 on the 8nm node would be more than 750mm² in die size, much bigger than the 3090 Ti.
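For anyone who wants to check that figure, here's a minimal sketch of the same arithmetic in Python, assuming the commonly cited spec-sheet numbers: ~35.8 billion transistors for the 4070's AD104 die, with Samsung 8nm density derived from GA102 (the 3090 Ti's die, ~28.3 billion transistors in ~628mm²):

```python
# Sanity check: die area = transistor count / density.
# Samsung 8nm density is estimated from GA102 (3090 Ti die):
# ~28.3 billion transistors in ~628 mm^2 (assumed spec-sheet figures).
ga102_transistors = 28.3e9
ga102_area_mm2 = 628.0
density_8nm = ga102_transistors / ga102_area_mm2  # ~45 million transistors/mm^2

ad104_transistors = 35.8e9  # RTX 4070's die, assumed spec-sheet figure
hypothetical_area = ad104_transistors / density_8nm

print(f"Samsung 8nm density: ~{density_8nm / 1e6:.1f} MTr/mm^2")
print(f"AD104 ported to 8nm: ~{hypothetical_area:.0f} mm^2")  # ~790 mm^2
```

At roughly 790mm² that's past the quoted 750mm² and well beyond GA102's 628mm², so under these assumptions the claim holds.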
 
For a rumoured 25-30% faster than my current GPU I'd really not be bothered :s It would have to be more like £450 max for me to even consider it. EDIT: In fact, probably less than that.
Yup, my thoughts on it too. Either significantly faster or significantly more cost-appropriate, and it doesn't look like it's either.

I'm not struggling too much with my 3070 FE; it would have been nice to upgrade, but this increasingly appears to be a generation worth skipping entirely.
 
Yup, my thoughts on it too. Either significantly faster or significantly more cost-appropriate, and it doesn't look like it's either.

I'm not struggling too much with my 3070 FE; it would have been nice to upgrade, but this increasingly appears to be a generation worth skipping entirely.

The only reason I'm even thinking about replacing the 3070 FE is that I do have a 43" 4K HDR display I use for some casual gaming. Normally I play at 1440p, and these days I'm mostly playing older games anyway.

With some tweaking, Hogwarts Legacy actually plays quite well at 1440p even on my 3070 laptop (unless they patch it, the ray tracing implementation in the game just isn't worth bothering with) - though if they released proper path-traced versions of some of these recent games I might be more interested in a 4090, etc.
 
What do you mean, clearly it wouldn't? It's maths, man: divide the transistor count by the node's density and there you have it; the 4070 on the 8nm node would be more than 750mm² in die size, much bigger than the 3090 Ti.
Well, the Ada architecture must really suck then, considering it's seeing a 25% performance deficit to a 628mm² Ampere chip and would be even worse at the same clock speeds.
 
Well, the Ada architecture must really suck then, considering it's seeing a 25% performance deficit to a 628mm² Ampere chip and would be even worse at the same clock speeds.
Sure. But in your previous post you seemed to suggest that transistor count should play a role in pricing, in which case it makes sense for the 4070 to be more expensive than the 3070, right? :D
 
It's a cut-down card again, and of course NV are still cutting corners with VRAM, it's been their MO for years. If you're spending over £500 now the minimum VRAM should be 16GB, not 12GB. And this 4070 is a £500 card - at best.
£500 at best, yes, but Nvidia will not go that low.
 
I think Nvidia should give us 24GB of VRAM on the 4050, because developers are uber lazy. Makes sense for everyone :D

Optimization? Nah, just buy more VRAM.
 