RTX 4070 12GB, is it worth it?

Possibly. From Nvidia, the performance uplift from the 3070 to the 4070 just wasn't enough. For me, the only Nvidia card that would have been worth it from a performance perspective was the 4080, but there is no way on earth I am paying £1,200 for a graphics card.

It's like this: if I had wanted to buy a 3080 in 2020 for around £700, I would have bought a 3080 for around £700 in 2020. I didn't, so what makes you think I would buy a 3080 under another name in 2023 for around £650?

Do you think I'm mad? Or just plain stupid?
 
I think the fact that the stock TDP is just 200W seemed like a warning sign in terms of performance, even if the efficiency is great.

These cards probably could've had a TDP of ~260W, with higher boost clocks.
 
Right, the RX 6800 is 223 watts board power consumption, compared to 201 watts. The 6800 has two more 32-bit memory channels and memory ICs, which will account for about 15 watts.

The 4070 is 16% faster, a whole 16%. That makes the 4070 about 25% more efficient, and much less if you take the memory architecture differences into consideration.

One is on 7nm, the other on 5nm. When you apply some critical thinking to it, it's not great. I would not have excused it with "but it's really efficient" like some of these tech journos did. Maybe if it was 150 watts; to me it should be closer to that on such an advanced node, given its performance.
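For what it's worth, the arithmetic behind that roughly checks out. A minimal sketch in Python, using the 201W / 223W / 16% figures above (the ~15W memory adjustment is the poster's own estimate, not a measured value); depending on whether you adjust for memory, the perf-per-watt gap lands somewhere between ~20% and ~29%, so "about 25%" is fair:

```python
# Perf-per-watt, using the figures quoted above: 201 W for the 4070,
# 223 W for the RX 6800, with the 4070 ~16% faster.
rtx_4070_power = 201.0   # average gaming board power, watts
rx_6800_power = 223.0    # average gaming board power, watts
perf_ratio = 1.16        # 4070 performance relative to the 6800

# Efficiency = performance / power, so the 4070's advantage is:
eff_gain = perf_ratio * (rx_6800_power / rtx_4070_power) - 1
print(f"4070 perf/W advantage: {eff_gain:.0%}")        # ~29%

# Subtract ~15 W for the 6800's extra memory channels/ICs (the
# poster's estimate) and the gap narrows considerably:
eff_gain_adj = perf_ratio * ((rx_6800_power - 15) / rtx_4070_power) - 1
print(f"Adjusted for memory: {eff_gain_adj:.0%}")      # ~20%
```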
 
That power consumption though, and just as summer is coming. Bad deal IMO.

[Charts: average gaming power consumption and video playback power]

If you game on average 2 hours a day and watch videos 1 hour a day, and assuming you keep the GPU for 2 years before upgrading, then at current electricity prices a 6950 XT would cost you an extra £105 over a 4070.
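The sum behind a figure like that is easy to redo yourself. A minimal sketch, with illustrative power deltas and a UK tariff I've assumed rather than taken from the charts above, so the total will differ from the £105 quoted depending on your inputs:

```python
# Rough two-year running-cost difference, 6950 XT vs 4070.
# All figures below are illustrative assumptions, not measured values.
gaming_delta_w = 160.0   # assumed extra gaming draw of a 6950 XT, watts
video_delta_w = 30.0     # assumed extra video-playback draw, watts
gaming_hours = 2.0       # hours per day
video_hours = 1.0        # hours per day
price_per_kwh = 0.34     # assumed UK tariff, £/kWh
days = 2 * 365           # two years of ownership

extra_kwh = (gaming_delta_w * gaming_hours
             + video_delta_w * video_hours) / 1000 * days
extra_cost = extra_kwh * price_per_kwh
print(f"Extra over two years: ~£{extra_cost:.0f}")  # ~£87 with these inputs
```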
 
The RX 6800 is 250W board power at stock.

If you are using the sensor-reported power consumption, it ignores memory.

Sustained FurMark is measured at 254W, which is within margin of error of AMD's stated TDP.

When not CPU-bound it will reach that.

edit:

I'm guessing you might be using the gaming average, which is a fair comparison. But it does need to be the same setup between the 6800 and the 4070 to be a completely fair comparison.

I would look for more recent like-for-like tests. It might still be 25%.
 
When people cry foul and reach straight for the power bug slides...

Good job you edited that :D

Yes, gaming average; no one sits playing FurMark all day. The RTX 4070's 201 watts is also a gaming average.

TDP is meaningless; Intel's 220-watt CPUs have a 95-watt TDP.
 

But it depends on the game used for that test, and also on the CPU used. I think there are fairer, more up-to-date comparisons out there.

Here you can see it is 35% more efficient than the RTX 3080 with RT off. RDNA2 was something like 20% more efficient than Ampere.

So 0.65/0.8 ≈ 0.81, which is something like 20%.
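Spelling out that last step (a quick sketch; it treats "X% more efficient" loosely as drawing 1 - X times the power per frame, which is how the post gets its 0.65 and 0.8):

```python
# Chain the two relative-efficiency claims: 4070 vs 3080 (Ampere),
# and RDNA2 vs Ampere. "X% more efficient" is treated loosely as
# drawing (1 - X) times the power per frame, as the post does.
ada_vs_ampere = 0.65     # 4070 power per frame relative to a 3080
rdna2_vs_ampere = 0.80   # RDNA2 power per frame relative to Ampere

ada_vs_rdna2 = ada_vs_ampere / rdna2_vs_ampere
print(f"4070 power per frame vs RDNA2: {ada_vs_rdna2:.2f}")  # 0.81
print(f"i.e. ~{1 - ada_vs_rdna2:.0%} better")  # ~19%, 'something like 20%'
```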
 
FFS... slide wars.

Just stick to averages. Don't worry, Jensen doesn't care; he's too busy baking EVGA effigies to even notice you not defending him hard enough.
 