
RTX 4070 12GB, is it Worth it?

No, but there comes a point where you keep waiting and never buy anything!
Yep, that's why I still have a Vega 64. I wanted to buy ages ago, but Covid crapped all over prices and they have not recovered. I keep going to buy, but common sense kicks in to say don't support this crap, and do I really need it? I need it lol, but I don't need it at these prices until I break.
 
Just an honest question: what do you people consider a VRAM problem?

For example, if a card at 250€ or below has to turn down textures from day one, I assume that's fine.

If a card at 500€ or below has to do it only in super heavy games (RT + full ultra settings), I assume that's also fine?

But above that price point, when should it be okay to drop textures? 2 years later? 5 years later? Never? Is it only okay to drop every other setting but not textures?
 
Isn't that a price increase of $100, or +20%, for a card that's only likely to be 25% faster? To put that into context, the 3070 was 50% faster than the 2070 for the same price.
It's also a 50% increase in VRAM, which the 3070 did not get.


But regardless, that method of comparison is kinda flawed. If Ampere was a 200% increase over Turing, does that mean that Ada would be a failure if it gave a 120% increase over Ampere? Of course not.
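
For what it's worth, here's the performance-per-dollar arithmetic the two posts above are trading, as a quick Python sketch (the MSRPs are the commonly quoted US launch prices, and the +25% for the 4070 is this thread's speculation, not a benchmark):

    # Rough performance-per-dollar comparison using the figures quoted above.
    # The 4070's +25% is speculation from this thread, not a measured result.
    cards = {
        # name: (performance relative to the 2070, launch price in USD)
        "2070": (1.00, 499),
        "3070": (1.50, 499),         # +50% over the 2070 at the same MSRP
        "4070": (1.50 * 1.25, 599),  # assumed +25% over the 3070, +$100
    }
    baseline = cards["2070"][0] / cards["2070"][1]
    for name, (perf, price) in cards.items():
        ratio = (perf / price) / baseline
        print(f"{name}: {ratio:.2f}x the 2070's performance per dollar")

On those numbers, the 3070 was a 50% jump in performance per dollar, while the speculated 4070 would only add a few percent on top of that.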
 
Just an honest question: what do you people consider a VRAM problem?

For example, if a card at 250€ or below has to turn down textures from day one, I assume that's fine.

If a card at 500€ or below has to do it only in super heavy games (RT + full ultra settings), I assume that's also fine?

But above that price point, when should it be okay to drop textures? 2 years later? 5 years later? Never? Is it only okay to drop every other setting but not textures?

A VRAM problem in that the general enthusiast knows that 8GB is not enough for decent resolutions in modern games and is holding the rest of the card back. Nvidia have proved this themselves, with the 3060 12GB outperforming the 3070 8GB in modern titles above 1080p.
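
For a rough sense of scale (my own back-of-envelope numbers, not from the post above): uncompressed 4K textures with full mip chains run to tens of MB each, so a game streaming a few hundred of them fills an 8GB card quickly, before the framebuffer and geometry even get a look in:

    # Back-of-envelope texture memory footprint (illustrative assumptions).
    # A full mip chain adds roughly 1/3 on top of the base level.
    MIP_OVERHEAD = 4 / 3

    def texture_mb(width, height, bytes_per_texel):
        return width * height * bytes_per_texel * MIP_OVERHEAD / 2**20

    rgba8 = texture_mb(4096, 4096, 4)  # uncompressed RGBA8
    bc7 = texture_mb(4096, 4096, 1)    # BC7 block compression, 1 byte/texel
    print(f"4K texture: {rgba8:.0f} MB uncompressed, {bc7:.0f} MB as BC7")
    print(f"300 BC7 textures: {300 * bc7 / 1024:.1f} GB")  # against 8 GB of VRAM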
 
It's also a 50% increase in VRAM, which the 3070 did not get.
Don't forget a 25% cut in bus width and die size.

But regardless, that method of comparison is kinda flawed.. If ampere was a 200% increase over turing, does that mean that ADA would be a failure if it gave a 120% increase over ampere? Of course not.
Even looking at this generation, the 4090 was 70% faster than the 3090 for +$100, so if the 4070 only delivers a +25% increase I would deem that a failure, as the gap in performance between it and the top card has more than doubled while the price gap has remained unchanged.
 
Don't forget a 25% cut in bus width and die size.


Even looking at this generation, the 4090 was 70% faster than the 3090 for +$100, so if the 4070 only delivers a +25% increase I would deem that a failure, as the gap in performance between it and the top card has more than doubled while the price gap has remained unchanged.
The die size is irrelevant. They moved to a different node. If the 4070 was on the 8nm Samsung node it would actually be much bigger than the 3090 Ti :)
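
The density arithmetic behind that, as a quick sketch (the transistor counts and die area are the commonly published spec-sheet figures for GA102 and AD104):

    # Estimate how big AD104 (the 4070's die) would be at GA102's
    # (3090 Ti's) density. Figures are the published spec-sheet numbers.
    GA102_TRANSISTORS_B = 28.3  # billions, Samsung 8nm
    GA102_AREA_MM2 = 628.4
    AD104_TRANSISTORS_B = 35.8  # billions, TSMC 4N

    density_8nm = GA102_TRANSISTORS_B * 1000 / GA102_AREA_MM2  # MTr per mm^2
    ad104_on_8nm = AD104_TRANSISTORS_B * 1000 / density_8nm
    print(f"Samsung 8nm density: ~{density_8nm:.0f} MTr/mm^2")
    print(f"AD104 on Samsung 8nm: ~{ad104_on_8nm:.0f} mm^2 vs GA102's {GA102_AREA_MM2} mm^2")

At Ampere's density the 4070's transistor budget would need roughly 795 mm², comfortably bigger than the 3090 Ti's die.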
 
It's also a 50% increase in VRAM, which the 3070 did not get.


But regardless, that method of comparison is kinda flawed. If Ampere was a 200% increase over Turing, does that mean that Ada would be a failure if it gave a 120% increase over Ampere? Of course not.

Increases should be expected as tech advances and time moves on. Price and VRAM have increased, but the memory bus has shrunk considerably. It's obvious you love Nvidia, but you can't keep blindly defending them.
 
I'm defending them by quoting... facts? Okay :D
Just a question... imagine AMD had given up on the desktop market and never launched the 7900 series. And for good measure, Intel decided not to go ahead with the B series. Nvidia is the only game in town and is only competing with its own previous generation.

Would you consider the 4070 Ti and 4070 to be well priced and well specified?

I'm genuinely interested, because I don't. I expect my other half will end up with a 4070 to replace her 3060 Ti, but even at £600, if it hits that price, it still feels either overpriced or underspecified to me.
 
Just a question... imagine AMD had given up on the desktop market and never launched the 7900 series. And for good measure, Intel decided not to go ahead with the B series. Nvidia is the only game in town and is only competing with its own previous generation.

Would you consider the 4070 Ti and 4070 to be well priced and well specified?

I'm genuinely interested, because I don't. I expect my other half will end up with a 4070 to replace her 3060 Ti, but even at £600, if it hits that price, it still feels either overpriced or underspecified to me.
Of course not. I've never considered any of the current cards to be well priced, regardless. My opinion was always that the 4070 Ti is the best-value card based on current market prices. Not in a vacuum. In a vacuum, it's atrociously priced. I just never understood all the complaints about the 70 Ti when, e.g., the 7900 XT was worse priced.

Now that the 7900 XT is much cheaper, it's hands down the better card. Although I would still skip them both and go for the 4080. I can't sacrifice DLSS, sadly. But if someone doesn't mind not having DLSS, the 7900 XT is good.
 
The die size is irrelevant. They moved to a different node. If the 4070 was on the 8nm Samsung node it would actually be much bigger than the 3090 Ti :)
Clearly it wouldn't, as the 4070 doesn't even match a 3090 Ti.

In fact, the XX70 has normally delivered previous-gen halo performance, yet this card looks to have a deficit of up to 25% to the 3090 Ti, so I'd consider that a big failure, especially when you consider the cutting-edge node being used.
 
Increases should be expected as tech advances and time moves on. Price and VRAM have increased, but the memory bus has shrunk considerably. It's obvious you love Nvidia, but you can't keep blindly defending them.
Not really; historically, tech has always come down in price. Adjusted for inflation, Intel's 4004 would cost something like $400 today. If prices had increased as tech advanced, we'd need the GDP of a small country to buy a GPU/CPU.

If transistors still cost what they cost back then (more than 1 cent each, afaik), a 4090 would cost something like $80 billion.
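
As a sketch of that arithmetic (the ~76.3 billion transistor count for the 4090's AD102 die is the published figure; the per-transistor price is the assumption that drives the total, since estimates for early-1970s silicon vary wildly):

    # What a 4090-sized transistor budget would cost at early-IC prices.
    # The count is AD102's published figure; the prices are assumptions.
    AD102_TRANSISTORS = 76_300_000_000

    for dollars_per_transistor in (0.01, 0.10, 1.00):
        total = AD102_TRANSISTORS * dollars_per_transistor
        print(f"${dollars_per_transistor:.2f}/transistor -> ${total / 1e9:.1f} billion")

The $80 billion in the post corresponds to roughly a dollar per transistor; at the "more than 1 cent each" figure it would be closer to $0.8 billion.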
 
It depends on the title. Worst case it's around 3080 levels, but it's also comparable to the 3090 Ti in some other games. I wouldn't call its RT capabilities very limited in that sense. In UE5 with Lumen the XTX is as fast as a 4080, and if plenty of games move forwards with UE5, those titles should play very well.
Well, it is limited if you see the whole picture. The 4000 series are superior not only because they perform better in RT, but because they can use DLSS + FG with RT, and that is a game changer. Lumen is not as demanding and isn't looking as good as the RT we have seen in other games, and still the 4080 is faster in that.
 
Well, it is limited if you see the whole picture. The 4000 series are superior not only because they perform better in RT, but because they can use DLSS + FG with RT, and that is a game changer. Lumen is not as demanding and isn't looking as good as the RT we have seen in other games, and still the 4080 is faster in that.

The XTX looks strong with Lumen, comparable to if not faster than the 4080, according to some reviews I've seen. If the uptake for UE5 is strong, it could be a good indicator moving forwards. Nvidia are obviously starting to push full path tracing, but even the 4090 can't manage it without all the tech behind DLSS3.

 