
OcUK RTX4070 review thread

TBF it fits the trend: the 10 series was an absolute blinder of gen-to-gen performance, the 20 series was meh, the 30 series was an absolute blinder again, and the 40 series, excluding the 4090, is meh gen to gen. If you go by performance per dollar, the 4090 re-enters at meh too, because while it's an absolute monstrosity for performance you pay 50% more for 50% more, so there's no gen-to-gen increase, it's just more for more.
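That "more for more" point can be put as a quick perf-per-dollar calculation (the numbers below are made-up round figures for illustration, not benchmark results):

```python
# Hypothetical round numbers for illustration, not measured benchmarks.
def perf_per_dollar_gain(old_perf, old_price, new_perf, new_price):
    """Percent change in performance-per-dollar between two generations."""
    return (new_perf / new_price) / (old_perf / old_price) * 100 - 100

# 50% more performance for 50% more money: no perf/$ gain at all.
print(perf_per_dollar_gain(100, 1000, 150, 1500))  # 0.0
```

By that measure a card that's 50% faster for the same money would show as +50%, which is what made the 10 and 30 series feel like blinders.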
Agreed, 4090 the only smart choice...

 
I will be buying the 4070. I have a GTX 1660 and an upgrade is long overdue, so I can play again.

I might have considered the 7900 XT, but I can't find a model that runs cool and quiet (all AMD stuff runs hot).

It’s a good upgrade without a doubt, but plenty of XTs run very cool and very quiet. Even the reference cooler is more than adequate on the XT.

But the efficiency of the 4070 is very impressive, the XT draws a lot more power, but then in many respects it’s in a class above, competing with the Ti and sometimes the 4080.
 
AMD's mistake is allowing software to read the hot spot temp, the sensor that is reading hottest. It's a really odd and unnecessary move, given that's what people concentrate on, and they panic when it goes over 80°C, while the edge temp, the only sensor Nvidia allows software to read, is barely over 60°C.

Jokes have been made about AMD kicking themselves in the... or dropping the ball, but really... YEAH!
"AMD runs hot"
AMD: Oh, you think so? Watch this... there you go, now you have something to complain about and you can post actual data about it. You're welcome.
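For anyone wondering what the two sensors actually are: on Linux, amdgpu exposes both through hwmon, labelled "edge" and "junction" (junction being the hot spot). A rough sketch of the difference, with made-up values standing in for the real sysfs readout:

```python
# Sketch only: amdgpu's hwmon interface labels its sensors "edge" and
# "junction" (the hot spot). The readings below are invented for
# illustration; a real readout would come from /sys/class/hwmon.
def hotspot_delta(sensors_millic):
    """Return (edge C, junction C, gap) from label -> millidegree readings."""
    edge = sensors_millic["edge"] / 1000
    junction = sensors_millic["junction"] / 1000
    return edge, junction, junction - edge

fake_readout = {"edge": 61_000, "junction": 83_000}  # hypothetical values
edge, junction, gap = hotspot_delta(fake_readout)
print(f"edge {edge:.0f}C, junction {junction:.0f}C, gap {gap:.0f}C")
```

A 20°C-odd gap between edge and hot spot is normal; the panic comes from people comparing AMD's junction number against Nvidia's edge number.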
 

It's like... they do this and then spend the next three years explaining what it is and that it's nothing to worry about.

No one is listening; that's not how this works, AMD. Don't you have people telling you "this is a really stupid idea"?
 
And yet, in engineering terms (not how it looks), the hottest point is precisely what you should report. But these days engineering must always defer to marketing :(
 
https://www.techpowerup.com/review/sapphire-radeon-rx-7900-xt-pulse/35.html

Looks pretty good to me. Cards like the Nitro+ XTX are hardly what I'd call hot either, although sure, they're not as efficient as the 4000 series.

Any GPU running over 65°C in TechPowerUp's reviews I don't consider to have a good cooler, and the Pulse is running at 69°C.

Also, thinking about it more, do I really need a 7900 XT if I'm never going to run 4K? I would much rather get an OLED 1440p monitor than a 4K one in future.

The 7800 XT will probably be a better bet than the 4070, but I'm fed up of waiting.

I'm not impressed with the 4070, it is really cut down, but there's just nothing else at its price point. I don't feel the 6800 XT is a competitor because the 4070 has far superior power efficiency.
 
Any GPU running over 65°C in TechPowerUp's reviews I don't consider to have a good cooler, and the Pulse is running at 69°C.
That's an absurd line of thinking without also considering fan speed and noise output. I'd much prefer a more relaxed fan curve that lets the GPU run at 70 degrees (which is a perfectly fine temperature for a GPU to operate at) and be practically silent, rather than a needlessly aggressive fan curve that tries to keep the GPU at a stupidly low temperature for absolutely no reason. At best you might gain one speed bin via Nvidia's boost algorithm, equivalent to about 15MHz or 0.1fps, and if you really must have that you can set a more aggressive curve manually. The Founders Edition 4070 runs at 69 degrees according to TPU's review, so I guess that must be crap too and Nvidia obviously have no idea what they're doing with their own GPU.
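The "one speed bin" point can be sanity-checked with back-of-envelope numbers (the ~2.7 GHz boost clock here is my own rough 40-series figure, not taken from any review):

```python
# Rough figures for illustration; assumes fps scales at best linearly
# with core clock, which is generous (games are rarely purely core-bound).
boost_clock_mhz = 2700   # ballpark 40-series boost clock (assumed)
bin_mhz = 15             # one Nvidia boost bin
gain_pct = bin_mhz / boost_clock_mhz * 100
print(f"best case ~{gain_pct:.2f}% fps from one extra bin")
```

Well under 1% best case, which is why chasing low temperatures with an aggressive fan curve buys you noise and essentially nothing else.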
 