RTX 4070 12GB, is it worth it?

Prices are already coming down. Compared to what else is on the market, the 4070 actually offers a good cost per frame; not good historically, but still. Nvidia and AMD may want to keep prices, and profits, high, but if people aren't buying what they're selling they have no choice but to lower prices.
The 4080s have a £200 discount waiting to be priced in, which will bring them to the £1,000 that Nvidia wants to price them at. I think AMD will want to keep that £200 difference, so the 7900 XTX will be £800. And please don't think that Nvidia are being charitable here with the 4070's performance: it might be priced OK for the performance, but not for the chip. It's a 60-class chip at more than 70-class pricing (they have raised the prices of the 70-class GPUs as well).
 
Only 227mm2 of the 295mm2 AD104 die is enabled for the 4070, which works out at even less than the 3060 12GB, which had 256mm2 of its 276mm2 die in use. So technically you're getting an Ada card with a lower active die area than the 3060 12GB.

Going back to Turing, even a GTX 1650 Super had a larger die area in use than the 4070 does.
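
To put the die-utilisation argument in numbers, here is a minimal Python sketch using only the areas quoted above (these are the poster's figures, not official Nvidia die specs):

    # Active die area as a fraction of the full die, using the figures quoted above.
    # The mm^2 values are the poster's estimates, not official Nvidia numbers.
    cards = {
        "RTX 4070 (AD104)": (227, 295),
        "RTX 3060 12GB (GA106)": (256, 276),
    }
    for name, (active, full) in cards.items():
        print(f"{name}: {active}/{full} mm^2 -> {active / full:.0%} of the die enabled")
    # RTX 4070 (AD104): 227/295 mm^2 -> 77% of the die enabled
    # RTX 3060 12GB (GA106): 256/276 mm^2 -> 93% of the die enabled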
 
They've truly done us the dirty.
 
In terms of performance, yes, but in terms of efficiency the 40 series is genuinely considerably better than the 30 series (and the 20 series). A 4080 draws less power than a 3080, for example; that's 100W or more in power savings, and it runs quieter and cooler. Some will say you can undervolt a 3080, and that's true, but then you lose a percentage of performance, even if it's only 3-6fps, and you still don't get the low power draw of the 40 series.

So for me it comes down to knowing that the 40 series and the upcoming 50 series will be very efficient, and all that's left to judge is the price. The performance of all of them appears to be good for their class; it's the pricing that's all wrong, which we knew anyway.
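
For scale, here is a rough sketch of what that claimed 100W saving is worth in running costs; the 3 hours/day of gaming and the £0.30/kWh tariff are assumptions, not figures from the thread:

    # Rough annual saving from a 100W lower draw, purely illustrative.
    saving_watts = 100      # claimed 4080-vs-3080 difference under load (from the post above)
    hours_per_day = 3       # assumed gaming time
    price_per_kwh = 0.30    # assumed UK tariff, GBP/kWh
    kwh_per_year = saving_watts / 1000 * hours_per_day * 365
    print(f"{kwh_per_year:.0f} kWh/year, about £{kwh_per_year * price_per_kwh:.0f}/year")
    # 110 kWh/year, about £33/year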
It’s so true.

My 4070 Ti runs cooler and draws less power, with great performance. My previous cards (3080 FE and MBA 6950 XT) were heaters, which was not great in my SFF build.
 
The energy savings are exaggerated because, other than the 4090, all the other cards are using dies from a tier below their 3000 series counterparts.

The 4090, on the other hand, uses a similar-sized die and has similar power consumption to a 3090, but the extra energy is going into performance, which is why it leads the 3090 by a much larger percentage than any of the other 40 series cards lead their 3000 series equivalents.
 
I raised the energy consumption of cards once and was laughed at. Who cares about energy costs for a GPU? Yet now it seems to be the main reason people want a 4070. I kinda think they are just looking for a reason to get their two-yearly NVIDIA fix.
 
This is what people seem not to get, and Nvidia keeps doing it. This is what happened with Kepler too. The GTX680 was basically a GTX560TI replacement upsold, because AMD's "top end" HD7970 was not only underclocked but was well under 400mm2. But everyone was going on about the lower power draw compared to the GTX580. The GTX780/GTX780TI were the true replacements, and the Titan was Nvidia using the window of exclusivity to double pricing (that ended when AMD's Hawaii was soon to be released). Then the same happened with the GTX980, with people going on about power draw against the GTX780TI, but the true replacement was the GTX980TI. It then also happened with Pascal: the GTX1080 was not the true replacement for the GTX980TI, but people were going on about the power draw. The true replacement was the GTX1080TI.

Turing and Ampere at least had Nvidia release the top-end models at the same time. But sadly with Ada Lovelace, they rejigged the lower tiers upwards! :(
 
Energy in = heat out. That's usually my reason; that said, I wouldn't buy something just because it was lower powered. I did run 7970 GHz Crossfire and that used to cook me! I'm at 300W max, more or less, now.
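
A quick sketch of the "energy in = heat out" point: essentially all of a GPU's power draw ends up as heat in the room. The 500W figure is an assumed ballpark for two 7970 GHz cards in CrossFire, not a measurement from the thread:

    # Convert GPU power draw into heat dumped into the room per hour.
    for watts in (300, 500):
        print(f"{watts}W -> {watts / 1000:.1f} kWh of heat per hour (~{watts * 3.412:.0f} BTU/h)")
    # 300W -> 0.3 kWh of heat per hour (~1024 BTU/h)
    # 500W -> 0.5 kWh of heat per hour (~1706 BTU/h)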
 
Yes, heat and noise. The fact is the 40 series as a whole uses less power, thus generates less heat, and runs quieter because the fans don't need to spin as fast. Average fan speed seems to be around 1500rpm for most of the range, whilst my 3080 Ti, with a power limit set, still spins at over 1700rpm with the core temp at 79-81 degrees depending on the game, drawing ~301 watts. In the same situation, as an example, a 4080 is at 65 degrees drawing around 250 watts. Those are numbers that people are showing in their RTSS overlays playing at similar settings in the same games, which I can compare against on my end with the 3080 Ti.

Obviously a frame cap solves the issue in most games, but on the 30 series that only applies if your frame cap is below 100fps; once you go above that, the GPU is stressed more than the equivalent 40 series card, so it just ends up hot and loud again. The 40 series doesn't need any power limit, undervolting or frame caps; it's just quiet, cool and fast out of the box. That's the bigger selling point IMO for those of us who appreciate power without the noise and heat that usually comes with it.
 
The RTX4070 is basically 44% faster than an RTX3060TI for 60% more money. The RTX3060TI was 40% faster than an RTX2060 Super for similar money (a tiny power increase from 184W to 200W). The RTX2060 Super was around 35% faster than a GTX1070 for slightly more money (power went from 145W to 200W). The GTX1070 was around 61% faster than a GTX970 4GB for slightly more money. The GTX970 was 40% faster than the GTX770 for slightly less money. Up until the RTX4070, all these dGPUs were around $300~$400.
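
Putting those generational figures side by side, here is a minimal sketch using only the uplift and price numbers above; the "slightly more money" steps are approximated at ~5% and are assumptions:

    # Value change per generation: perf and price are multipliers vs the previous card.
    # Uplift figures are the poster's; ~5% price steps are assumed approximations.
    gens = [
        ("GTX970 -> GTX1070",       1.61, 1.05),
        ("GTX1070 -> RTX2060S",     1.35, 1.05),
        ("RTX2060S -> RTX3060TI",   1.40, 1.00),
        ("RTX3060TI -> RTX4070",    1.44, 1.60),
    ]
    for step, perf, price in gens:
        print(f"{step}: {perf / price - 1:+.0%} performance per unit of money")
    # GTX970 -> GTX1070: +53%
    # GTX1070 -> RTX2060S: +29%
    # RTX2060S -> RTX3060TI: +40%
    # RTX3060TI -> RTX4070: -10%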

As a person who has a 12.7 litre SFF PC (NCase M1), I don't buy all the excuse-making for paying 60% more for 44% more performance, just to save on "power consumption" and heat. It's an important consideration, but if people cared so much about power consumption, then most of them last generation would have gotten an RX6600XT or an RX6800. The RX6600XT can be pushed down to 95W because it is essentially an overvolted mobile GPU! But the RTX3060TI, despite consuming more power, was a better buy for me at the time.

Radeon Chill existed for years before Nvidia bothered to do something similar. But I remember during the GTX200 series, when Nvidia had better power consumption than the HD4000 series, people justified that as a reason to buy one. But when AMD won convincingly with the HD5000 series, suddenly it wasn't important anymore.

Sounds more like people rationalising a tech company trying to pull a fast one, just like all the people saying they were "holding their iPhone wrong" just because Apple screwed up. People have to get out of the FOMO mindset - just because companies release new products does not mean you need to buy one at any price.
 
Where are you getting that 40% figure from?

TPU - it actually works out at about 44% at QHD. If you look through all their reviews, 35% to 45% was about what you got in the other examples, but the price stayed roughly between $330 and $400.

Techspot/Hardware Unboxed are much less charitable:

They arrive at 24% for their test suite, but TPU's average also includes older titles.

Yet the RTX3060TI FE was £360~£370. It's a Turing V1 level of stagnation.

It's even worse when you compare it directly with an RTX3070, which is around 15% faster than an RTX3060TI.
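
Chaining the two figures quoted above gives the implied 4070-versus-3070 uplift; a one-line sketch using the thread's own numbers:

    # Implied 4070-vs-3070 uplift from the two figures quoted above.
    uplift_4070 = 1.44  # 4070 vs 3060 Ti (TPU figure from the post)
    uplift_3070 = 1.15  # 3070 vs 3060 Ti (figure from the post)
    print(f"Implied 4070 vs 3070: {uplift_4070 / uplift_3070 - 1:.0%} faster")
    # Implied 4070 vs 3070: 25% faster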
 
Why is it 44% on the 4070 review but 154% relative performance on the 3060 Ti page?
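
One likely source of the confusion, sketched below with a hypothetical chart value: "relative performance" percentages depend on which card is the 100% baseline, so they don't invert symmetrically; any remaining gap would come from the two pages using different test suites at different dates:

    # Why "X% relative performance" depends on which card is the 100% baseline.
    rel = 0.69  # hypothetical: 3060 Ti shown at 69% with the 4070 as the baseline
    print(f"4070 faster by {1 / rel - 1:.0%}")  # ~45%, not 31%
    # On the 3060 Ti's own page, the baseline flips:
    print(f"154% relative performance = {1.54 - 1:.0%} faster")  # 54%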

 
Also, your maths is off: the fake MSRP of the 3060 Ti was $399, so the 4070 is 50% more expensive, not 60%.

Everyone I knew got one at RRP; not a single person I knew paid more than RRP for a dGPU. Stop rationalising price increases. Not even reviewers agree this generation is great.

The UK price was £369, and the RTX4070 price is £589. That is 60%, so my maths is fine.
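
Both percentages check out in their own currency; a minimal sketch using the $399/$599 US MSRPs ($599 being the 4070's US launch price) and the UK prices quoted above:

    # Both price-hike figures are arithmetically right in their own currency.
    prices = {"USD (MSRP)": (399, 599), "GBP (UK)": (369, 589)}
    for currency, (old, new) in prices.items():
        print(f"{currency}: {new / old - 1:.0%} increase")
    # USD (MSRP): 50% increase
    # GBP (UK): 60% increase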
 
The price is set in USD; using GBP isn't a good comparison due to currency fluctuations.

That is not my issue, and it's still excuse-making for a 50% to 60% price hike when you don't even get that level of performance increase. Sounds worse than Apple.
Let's face it, how many people who just bought a 4070 for £600 would have done so had it been called a 4060 12GB?

Exactly, and they are so emotionally entangled with these companies that they defend all these moves. Told you PCMR are pretty weak-willed.
 
I'm not excusing the rubbish prices, just pointing out that the GBP price is not a fair comparison. It's like saying the price increase is 89% because you bought them with Bitcoin; it's just not accurate, because the price is set in USD.
 