
14th Gen "Raptor Lake Refresh"

AMD offers better options in the majority of workloads. You are presenting a very biased view of the 12900K's Atom-based E-core performance in very specific scenarios, on a chip that is just about EOL and a pretty poor choice for those kinds of workloads.
If it works for him, then what's wrong with that?
I, for example, love using emulators, and since switching to AMD CPUs the experience hasn't been great. Even now, Intel CPUs are quite a bit faster than AMD in this regard.
If it weren't for the fact that LGA1700 is a dead platform now, I would go with a 13700K instead of a 7800X3D; it would cost me less and give me more multithreaded performance.

emulation-ps3.png


https://www.techpowerup.com/review/intel-core-i9-13900k/15.html
 
People are heavily biased towards AMD; they refuse to accept that Intel is very efficient in mixed workloads. What can you do?
 
My 5800X never goes below 30 W at idle and constantly jumps up.
When playing The Last of Us it sits at around 110-120 W, and that is with a negative Curve Optimizer offset, so undervolted.
A 13900K draws about the same, maybe slightly more, when running this game while watching YouTube videos.
Here is mine:
 

I’m not sure what they want. It seems I get quoted every five minutes by the same few users... From what I gather, they are looking at idle power use and championing the 12900K while ignoring everything else with better low-load power consumption. Most enthusiast systems today probably pull 20 watts in RGB alone.
 

Zen 4 X3D chips do well in emulation; I'm pretty sure a 7800X3D will do it using less power.

aeyA5kw.jpg
 
Oh wow, that's a massive difference compared to the non-3D chips.
PBO even seems to boost performance further, which isn't the case with normal gaming most of the time.
It's strange how much the 3D cache helped the 7000 series in this emulator and didn't help the 5000 series at all.
 
So about the same power consumption while pushing way more frames. Clearly Intel is not as bad as people make it out to be.
Yeah, in idle/light/mixed workloads they are very efficient, except the 13900K, which is a power hog in gaming unless you underclock it. Even in full heavy MT usage, when testing at iso-wattage, AMD is around 10-15% more efficient, with the exception of V-Ray, where the difference is 19%.
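An "iso-wattage" comparison just means capping both chips at the same power and comparing score per watt. A quick sketch of that arithmetic; the scores below are made-up placeholders purely to illustrate how a 10-15% figure falls out, not real benchmark results.

```python
# Perf-per-watt math behind an iso-wattage comparison.
# Both CPUs are power-limited to the same cap, so the efficiency
# ratio reduces to the score ratio. Scores are hypothetical.
def efficiency(score: float, watts: float) -> float:
    """Benchmark points per watt."""
    return score / watts

amd_score, intel_score, cap_w = 34500, 30600, 125.0  # placeholder numbers

amd_eff = efficiency(amd_score, cap_w)
intel_eff = efficiency(intel_score, cap_w)
advantage = (amd_eff / intel_eff - 1) * 100
print(f"AMD advantage at {cap_w:.0f} W: {advantage:.1f}%")  # ~12.7% with these numbers
```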
 
I think that emulator takes advantage of AVX-512, which Zen 4 has and Raptor Lake doesn't. Another mistake on Intel's part.
It appears so, yes. Apparently it was possible to enable AVX-512 on early 12900K samples by disabling the E-cores, and that put it way ahead of Ryzen, something like 70 fps in RPCS3.
 
Yes, early BIOS versions and first-batch 12900Ks like mine can do it. But the E-cores give a huge boost in recent games, so I don't advise anyone to turn them off.
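You can check whether your own CPU exposes AVX-512F (the foundation subset emulators like RPCS3 build on) without any third-party tools. A minimal sketch that parses /proc/cpuinfo, so it's Linux-only; on Raptor Lake, or Alder Lake with E-cores enabled, the flag will be absent.

```python
# Check whether the running CPU advertises the avx512f feature flag.
# Linux-only sketch: the kernel lists CPU features in /proc/cpuinfo.
def has_avx512f(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    # The flags line is a space-separated feature list.
                    return "avx512f" in line.split()
    except OSError:
        pass  # not Linux, or /proc unavailable
    return False

print("AVX-512F:", "yes" if has_avx512f() else "no")
```

On a Zen 4 chip like the 7800X3D this prints "yes"; on any Raptor Lake part it prints "no".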
 

No, Intel is in a terrible spot on performance per watt, hence the financial trouble they find themselves in.

If AMD keeps opening up the lead and Intel keeps rehashing while gambling on future nodes, Intel could have some serious trouble ahead.
 
In this example, @Bencher's 12900K is much more efficient than my 5800X.
 

Am I right in thinking AVX-512 increases power draw a lot? It would be interesting to see how much less the 7800X3D uses.
v8o0EPs.jpg
 

A single chip, and information from a guy who openly admits he hates them and will say anything to undermine them. That will hardly save Intel from itself…

The reality is Intel is in deep trouble, and this isn't peak trouble.
 
Is someone actually trying to save Intel here? I’m looking for the best CPU for my use case, regardless of brand.
 

Probably not many, TBH, but I wouldn’t like to see Intel go under, and Intel can’t continue as they are.

See if Bencher will sell you his setup. Hehe.
 
If you are not doing anything remotely heavy in terms of multithreading, go Intel. Technotice has an interesting comparison using power measured at the wall for creators: the 13900K slaughters the 7950X in both performance and efficiency. That's because Photoshop, Premiere, and the rest of the Adobe suite are mixed workloads, and in those Intel thrives. If, on the other hand, you are running Prime95 or Cinebench on a loop, go for AMD; they have a 10-15% advantage there.
 