
HUB: Intel 13700K vs AMD 7800X3D

Hardware Unboxed gaming CPU comparison.

It's changed somewhat since the initial review; with more games now making use of the 3D V-Cache, the results are better than they were at launch.

With pricing quite similar here on the day of posting, £400 for the 13700K and £429 for the 7800X3D, the results are quite surprising: 11% faster overall, with quite a few games where the 7800X3D is 15% or more ahead.

The 13700K also uses quite a lot more power. This is total system consumption with a 4090, so for one CPU to draw over 100 watts more than the other in these games is quite staggering: the 13700K is using 2x to 3x more power than the 7800X3D while gaming, with the largest gap at 196 watts. With today's electricity prices you will notice that.
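To put that 196-watt gap in perspective, here is a rough sketch of the annual running-cost difference. The gaming hours per day and the ~£0.30/kWh unit price are assumptions for illustration, not figures from the video:

```python
# Rough annual cost of a sustained power-draw gap while gaming.
# All inputs below are assumptions for illustration only.
def extra_annual_cost(delta_watts: float,
                      hours_per_day: float = 2.0,
                      price_per_kwh: float = 0.30) -> float:
    """Extra cost per year (GBP) of drawing `delta_watts` more power."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Worst-case 196 W gap from the charts, at an assumed 2 h gaming per day:
print(round(extra_annual_cost(196), 2))  # roughly £43 a year
```

Scale the hours and unit price to your own usage; the cost grows linearly with both.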

[benchmark charts: relative gaming performance and total system power consumption]
I will admit the near-200-watt gap over the 7800X3D does look a bit odd, with another result at 150 watts.

The 13700K is also running 7200MT/s RAM, which is quite high; both the memory sticks and the IMC on the 13700K will be drawing a lot more power than they would at the stock 5600MT/s.

The 7800X3D sips power, but I'm sceptical of the 13700K drawing more than 100 watts over it.

With that said, let's move on :)
 
This is even more remarkable when you consider the 7800X3D is making the 4090 work 10-15% harder. The GPU is doing more work and using more power, and the delta is still massively in AMD's favour.
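A quick perf-per-watt sanity check of that point. All the numbers below are illustrative assumptions, not measurements from the video:

```python
# Illustrative frames-per-watt comparison: if one system renders ~11% more
# frames while drawing ~150 W less at the wall, its efficiency lead
# compounds. All figures here are assumed for the sake of the sketch.
def frames_per_watt(fps: float, system_watts: float) -> float:
    return fps / system_watts

intel_eff = frames_per_watt(fps=100, system_watts=550)  # assumed baseline
amd_eff = frames_per_watt(fps=111, system_watts=400)    # +11% fps, -150 W
print(round(amd_eff / intel_eff, 2))  # ~1.5x the frames per watt
```

The exact ratio depends entirely on the assumed figures, but the direction holds: more frames out for less power in.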

On reflection I think on this occasion @Bencher has a point: the 13700K would only draw that much power if it was fully loaded up. It pulls up to about 260 watts in heavy threaded workloads, and with the memory also being so heavily overclocked, that would exaggerate the delta; a heavily overclocked IMC can draw a substantial amount of power.
 
I will also say this: shader compilation and overclocked memory or not, power draw up around 300 watts or more is utterly ridiculous, especially in this climate, and it seems to be ignored. Bulldozer was relentlessly ridiculed for pulling 140 watts at most in heavy MT workloads while never being more than 30% slower in games than the same-generation i7 pulling 95 watts.

This is WAY beyond that.
 
The card is at 98% usage, man... it has nothing to do with the CPU, else DLSS wouldn't help, would it?

Frame Chasers is irrelevant here; last time we tested, a stock 12900K had better lows than your 7950X3D.

What? That's everything to do with the CPU: it means the CPU is fast enough to drive the GPU to its fullest extent. If you're getting less than that with a similar-performance GPU, it's because you have a CPU bottleneck.
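The bottleneck logic here can be sketched as a toy model: the delivered frame rate is capped by whichever component prepares frames more slowly. The frame-rate numbers below are made up for illustration:

```python
# Toy model of a CPU/GPU bottleneck: delivered frame rate is capped by
# whichever side is slower. Numbers are hypothetical.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate the system can actually deliver."""
    return min(cpu_fps, gpu_fps)

# A GPU capable of 140 fps at these settings (assumed):
print(delivered_fps(cpu_fps=160, gpu_fps=140))  # GPU-limited: 140
print(delivered_fps(cpu_fps=110, gpu_fps=140))  # CPU bottleneck: 110
```

This is also why near-100% GPU usage indicates a GPU limit, and why DLSS (which lowers the GPU's per-frame workload) helps only until the CPU becomes the slower side.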
 
But you didn't seem to like the answer. I can post you a video in Warzone 2 and you can see that it really uses those E-cores instead of HT when you have them on.

Your answer was no.
I can show you the way, but you may not like the answer... :cool:
Step 1 - Never, I repeat never, listen to Frame Chasers on anything AMD. :p
Step 2 - Buy a 7950X3D, and tune memory, FCLK and curve optimiser. Cheap Hynix M-die is all you need; 5200MT/s or higher will do. No need to overclock the CPU using BCLK, stock will do. ;)
Step 3 - Result!

To add to this.

Step 4 - Never, I repeat never, listen to Frame Chasers on anything Intel.
:p
 
Isn't Intel rumoured to be getting another CPU on LGA1700?

It would be interesting to see the power difference if Intel were on TSMC 5nm. Everyone thought AMD's GPU architecture was much more efficient than Nvidia's when they had the node advantage, but when Nvidia jumped to the superior node the tables turned quite dramatically.

Intel are supposed to be promoting their own fabs; is this them admitting they can't even get their own products to work efficiently when built in their own fabs?
Also, Intel can't turn a profit on their CPUs when they build them themselves; how's that going to work when they pay TSMC to fabricate them? Do they think they can ramp the price up?
 