• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

14th Gen "Raptor Lake Refresh"

At the high end, the 7950X3D beats the 13900K/KS in games while consuming half the power. IMO that's not competitive, especially given the price of electricity these days. Zen 4 X3D = obvious choice.

At the low/medium end, Intel is currently better value, due to DDR4, Z690 motherboard prices and the core counts of the i5.
At the high end, the 13900K beats the 3D in productivity while consuming half the power (AutoCAD, Photoshop, Premiere, etc.). That's not competitive.
 
> At the high end, the 13900K beats the 3D in productivity while consuming half the power (AutoCAD, Photoshop, Premiere, etc.). That's not competitive.
TPU show in Photoshop and Premiere Pro at least the 7950X3D is drawing less power vs the 13900K, though I agree it appears to be a bit slower there.

[chart: 7950X3D power draw]

[chart: 13900K power draw]

EDIT: A direct comparison shows it better: [image: TyYgNVZ.png]

I was shocked to see the 7700X is faster than both of them in Photoshop though, wtf. :cry:
 
> At the high end, the 7950X3D beats the 13900K/KS in games while consuming half the power. IMO that's not competitive, especially given the price of electricity these days. Zen 4 X3D = obvious choice.
>
> At the low/medium end, Intel is currently better value, due to DDR4, Z690 motherboard prices and the core counts of the i5.
In games the X3D is superior due to the lower power consumption, but in productivity 13th gen wins because of the core count.
 
> Whose benches are you using?
Technotice. He's basically an expert in content creation workloads and he has done multiple videos comparing Zen 4 with 13th gen.



 
My system is on 24/7 so that’s one of the reasons I use adaptive since most of the time it’s idle. My 4090 also consumes a lot less idle wattage than my 3090 before it so that was a nice side benefit.

I like that he showed actual running costs for a year, since that's a nonsense argument that gets used repeatedly.
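The "it's idle most of the time" point is easy to put numbers on. A minimal sketch of a time-weighted average for a machine left on 24/7; the wattages and hours below are illustrative assumptions, not measurements from the thread:

```python
# Duty-cycle weighted average power for a machine that is on 24/7.
# All wattages and hours are illustrative assumptions, not measurements.

def average_power(idle_w: float, load_w: float, load_hours_per_day: float) -> float:
    """Time-weighted average wall power over a 24-hour day."""
    idle_hours = 24 - load_hours_per_day
    return (idle_w * idle_hours + load_w * load_hours_per_day) / 24

# Example: a system that idles at 15 W and draws 150 W for 2 hours a day
avg = average_power(idle_w=15, load_w=150, load_hours_per_day=2)
print(f"{avg:.2f} W average")  # (15*22 + 150*2) / 24 = 26.25 W
```

With a duty cycle like that, idle draw dominates the total, which is why a lower idle floor matters so much for an always-on box.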
 
> My system is on 24/7 so that's one of the reasons I use adaptive since most of the time it's idle. My 4090 also consumes a lot less idle wattage than my 3090 before it so that was a nice side benefit.
>
> I like that he showed actual running costs for a year, since that's a nonsense argument that gets used repeatedly.
If you go to around the 20-minute mark in the first video, where he measures power from the wall, look at Ryzen's HWiNFO stats. They are scary. The 13900K draws as little as 8 W while actually running workloads; Zen's package power was like 50+. Holy cow. It ended up using 20% more power while being... slower. YIKES
 
> If you go to around the 20-minute mark in the first video, where he measures power from the wall, look at Ryzen's HWiNFO stats. They are scary. The 13900K draws as little as 8 W while actually running workloads; Zen's package power was like 50+. Holy cow

Well, if the platform idles a lot higher then it's naturally not going to dip below that floor. I posted running results of power draw a while back showing a bunch of things open/active while I was around 11 W. Crickets.
 
> TPU show in Photoshop and Premiere Pro at least the 7950X3D is drawing less power vs the 13900K, though I agree it appears to be a bit slower there.
>
> [chart: 7950X3D power draw]
>
> [chart: 13900K power draw]
>
> EDIT: A direct comparison shows it better: [image: TyYgNVZ.png]
>
> I was shocked to see the 7700X is faster than both of them in Photoshop though, wtf. :cry:
> Technotice. He's basically an expert in content creation workloads and he has done multiple videos comparing Zen 4 with 13th gen.

That's not the 7950X3D/7900X3D; where are those results? You've quoted the non-X3D parts, @Bencher.
 
He hasn't tested them yet. Are you saying the normal 7950X is actually less efficient than the 13900K but the 3D isn't?
I don't know, but the TPU results of the 7950X3D vs 13900K are pretty one-sided in power draw, and not just in games, looking at their app testing too. Sure, there were a few apps where the X3D had similar or slightly higher power draw, but largely it's pretty one-sided on that front.

I wonder what the yearly cost would be in those games/apps where the 13900K is drawing three times the power, never mind 20% more in some scenarios.

I agree the Intel chips with their monolithic design appear to draw less power at idle, but it's another level of copium to make a big deal out of a 20% difference in some scenarios whilst ignoring the abundance of scenarios where the roles are reversed, and then some.
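The yearly-cost question is just kWh arithmetic. A quick sketch; the extra wattage, daily hours and price per kWh are all assumed placeholder numbers, not figures from any review:

```python
# Yearly electricity cost of a CPU's extra power draw under load.
# Wattage, hours and the price per kWh are illustrative assumptions.

def yearly_cost(extra_watts: float, hours_per_day: float,
                price_per_kwh: float, days: int = 365) -> float:
    """Cost of `extra_watts` of additional draw over `days` days."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# Example: 100 W extra draw, 3 h of gaming a day, at 0.30 per kWh
print(f"{yearly_cost(100, 3, 0.30):.2f}")  # 0.1 kW * 3 h * 365 * 0.30 = 32.85
```

Plugging in whichever wattage gap a given review measures shows whether the difference is pocket change or real money for your usage pattern.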
 
> I don't know, but the TPU results of the 7950X3D vs 13900K are pretty one-sided in power draw, and not just in games, looking at their app testing too. Sure, there were a few apps where the X3D had similar or slightly higher power draw, but largely it's pretty one-sided on that front. […]
Proper efficiency comparisons are done with identical power limits. TPU doesn't test with the same power limits, so their numbers are kinda irrelevant. Case in point: if stock power limits were actually what mattered in efficiency comparisons, then Intel has by far the most efficient CPUs, with their T line of SKUs being limited to 35 W. I would be amazed if a 13900T (or even a 13700T, 12900T, etc.) did not absolutely smash the 7950X and the 7950X3D in every single workload in terms of efficiency.

At similar power levels AMD has a 10 to 20% lead in efficiency in heavy MT workloads, depending on the application, but it loses badly in more mixed workloads like AutoCAD, Premiere and the like. Gaming is a wash between the 7950X and the 13900K, but the 3D does much better thanks to lower clock speeds.

For my workloads, a 12900K is far more efficient than any AMD CPU; it's not even a contest. After 5-8 hours of actual work I end up at 8 W to 15 W average power draw. That's less than what AMD consumes at idle.

EG1. Also, TPU's results are, as usual, completely flawed when it comes to power draw. We know that Zen 4 idles at 20 to 50 W depending on the motherboard, so how did TPU manage to have an application running while drawing less than idle power? Should I assume that running WebXPRT makes the CPU draw less than idle? :D
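"Identical power limits" boils down to comparing performance per watt at the same cap. A toy sketch of that comparison; the 125 W cap and both scores are made-up placeholders, not benchmark results:

```python
# Performance-per-watt comparison at an identical (ISO) power limit.
# The cap and scores below are made-up placeholders, not benchmark data.

def perf_per_watt(score: float, package_watts: float) -> float:
    """Benchmark score divided by sustained package power."""
    return score / package_watts

# Hypothetical multithreaded scores with both CPUs capped to 125 W
cpu_a = perf_per_watt(score=38000, package_watts=125)
cpu_b = perf_per_watt(score=33000, package_watts=125)
lead = (cpu_a / cpu_b - 1) * 100
print(f"CPU A leads by {lead:.1f}% in perf/W")  # CPU A leads by 15.2% in perf/W
```

At a shared cap the watts cancel out and the comparison reduces to the scores themselves, which is exactly why fixing the power limit makes the efficiency claim testable.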
 
> My system is on 24/7 so that's one of the reasons I use adaptive since most of the time it's idle. My 4090 also consumes a lot less idle wattage than my 3090 before it so that was a nice side benefit.
>
> I like that he showed actual running costs for a year, since that's a nonsense argument that gets used repeatedly.

I still have that bug where my 3090 consumes 100 W at idle when connected to multiple screens. I cannot for the life of me get it to idle any lower, even after a fresh Windows install.
 
> Proper efficiency comparisons are done with identical power limits. TPU doesn't test with the same power limits, so their numbers are kinda irrelevant. […]

If absolute power efficiency is the goal then you picked an awful system to work with.

Don't leave your system sat idle, and jump in the Ryzen pool. The water is lovely.
 

Upcoming Intel Raptor Lake Refresh should still be part of the 13th Gen Core series


 
> If absolute power efficiency is the goal then you picked an awful system to work with.
>
> Don't leave your system sat idle, and jump in the Ryzen pool. The water is lovely.
I guess those who run their machines flat out for rendering or encoding etc. would benefit from Ryzen. It now makes sense why all the office machines I've seen at various businesses run Intel: the PCs will be on all the time doing basic stuff like web, Excel, Word, etc., and @Bencher has shown Intel to offer far better power efficiency at those sorts of loads.
 
> I guess those who run their machines flat out for rendering or encoding etc. would benefit from Ryzen. […]
Yeah, in flat-out all-core heavy workloads AMD is better. Not by as much as people think, but it is: around 10-15% in most workloads at ISO wattage.
 
> I guess those who run their machines flat out for rendering or encoding etc. would benefit from Ryzen. […]

AMD offer better options in the majority of workloads. You are talking about a very biased view of the 12900K's Atom-core performance in very specific scenarios, on a chip that is just about EOL and a pretty poor choice for those kinds of workloads.
 