Have Intel's Efficiency cores made a difference to you?

It still uses too much power when you need multithreaded performance. You'd think it would be less, considering it's called hybrid and efficiency. It will be interesting to see the next instalment of it with MTL.
 
They're not called efficiency cores because of power efficiency, though; they're die-space efficient. The Golden Cove (P) cores are actually more efficient than the Gracemont (E) cores at the same wattage.
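To put rough numbers on the die-space argument: the figures below are approximate, widely cited estimates (about four Gracemont E-cores fitting in the area of one Golden Cove P-core, with each E-core delivering around half a P-core's throughput), not measured data, so treat this purely as an illustrative sketch.

```python
# Rough illustration of the "die-space efficient" argument.
# Assumed (approximate) figures -- not measured data:
#   - one Gracemont E-core occupies ~1/4 the area of a Golden Cove P-core
#   - one E-core delivers ~0.5x the multithreaded throughput of a P-core

P_CORE_AREA = 1.0        # normalised die area of one P-core
E_CORE_AREA = 0.25       # ~4 E-cores fit in one P-core's footprint
P_CORE_THROUGHPUT = 1.0  # normalised MT throughput of one P-core
E_CORE_THROUGHPUT = 0.5  # assumed per-core throughput of one E-core

def throughput_per_area(n_cores, area, throughput):
    """Total throughput divided by total die area for a block of cores."""
    return (n_cores * throughput) / (n_cores * area)

p_density = throughput_per_area(1, P_CORE_AREA, P_CORE_THROUGHPUT)
e_density = throughput_per_area(4, E_CORE_AREA, E_CORE_THROUGHPUT)

print(f"P-core throughput per unit area: {p_density:.2f}")  # 1.00
print(f"E-core throughput per unit area: {e_density:.2f}")  # 2.00
```

Under these assumptions, the same silicon area gives you roughly double the multithreaded throughput from an E-core cluster, which is exactly why they exist for MT workloads rather than for power savings.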
 
Whoever says that doesn't understand what E-cores do and why they exist. Long story short, E-cores allow Intel to create a big, beefy core for single-threaded/gaming workloads without having to sacrifice multithreaded performance for it.

It's because the Ryzen 5 7600X doesn't have them, so they deliberately try to say the E-cores make no difference. So suddenly gamers don't stream or video encode. Suddenly productivity benchmarks don't matter, even though a lot of people were saying Ryzen was better because of moar cores, meaning better productivity and streaming performance. Suddenly it went from Ryzen having moar cores than Intel, meaning better multi-tasking performance is important, to nobody caring anymore. Then comes the sudden ignoring of reviews which show the Core i5 13600K holding its own or being better in games. Suddenly the weak IGP is of paramount importance, but it wasn't important when Intel had an IGP and Ryzen didn't. Suddenly, lower-priced motherboards and lower-priced RAM are not important.

Anybody who is being objective about this can see the 8 E-cores on the Core i5 13600K act like a downclocked Core i7 9700, because of their Skylake-level performance. So that helps immensely in general performance. Intel is just offering you more for the money with a £300+ Core i5, all because AMD priced the six-core Ryzen 5 7600X above £300 and the Ryzen 7 7700X above £400.

Almost all the reviewers are recommending the Core i5 13600K over the Ryzen 5 7600X. Ever since Zen3, and the whole flip-flop about pricing, it seems all the talking points which were Intel's became AMD talking points, and the AMD talking points became Intel ones.

It gets even worse when AMD is charging more per core for a Ryzen 5 and Ryzen 7 than for a Ryzen 9. The Ryzen 5 7600X should be £290 and the Ryzen 7 7700X should be £385, just going by OcUK Ryzen 9 pricing.
 
And it probably needs refining. It will be interesting to see what they do with Meteor Lake.

I suspect, as with Zen4, capping the TDP and undervolting might help drop power consumption a decent amount:

AMD will have an advantage because they are on a better node, but it's quite clear both AMD and Intel are pushing through too much voltage to justify the highest pricing they can get away with. Essentially, these CPUs are overclocked out of the box.
 
Yeah, same as GPUs. With a better node plus hybrid, it will be interesting to see.
 
Change the bloody name then, it's misleading.
Well, technically they are more efficient, since they are clocked way lower. The point is they didn't put them there to increase efficiency, else they would just replace them with an equal number of P-cores, also clocked low. Anyway, I think hybrid is the way forward.
 
The problem is that they're nigh-on useless in gaming, and the extent of their usefulness in production programs is debatable (compared to just more regular cores, like AMD's), all of which is compounded by the scheduler not being perfect at maximising their use during multi-tasking (e.g. I remember them tripping up quite easily when running Handbrake in the background; fixable if you know what you're doing, but you shouldn't have to).
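For reference, the "fixable if you know what you're doing" part usually amounts to pinning the background encode away from the cores the game is using. A minimal Linux-only sketch using Python's standard library (`os.sched_setaffinity`); the core numbering here is a hypothetical example and depends on your actual topology (check with `lscpu --extended` first):

```python
import os

# Hypothetical topology for illustration only: assume logical CPUs 12-19
# are the E-cores on a 13600K-style chip. Verify your real layout with
# `lscpu --extended` before hard-coding anything.
E_CORES = set(range(12, 20))

# Keep only CPUs that actually exist on this machine, so the call can't
# fail on smaller systems; fall back to the full mask if none match.
available = os.sched_getaffinity(0)
target = (E_CORES & available) or available

# Restrict this process (and any children it later spawns, e.g. a
# HandBrake encode) to the target cores, so the background work stops
# contending with the game on the P-cores.
os.sched_setaffinity(0, target)  # pid 0 == the calling process

print("pinned to CPUs:", sorted(os.sched_getaffinity(0)))
```

On Windows the same idea is Task Manager's "Set affinity" dialog or launching the encoder with `start /affinity <hex mask>`; the scheduler plus Thread Director is supposed to make this unnecessary, but manual pinning is the reliable fallback.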

To me they will make a lot of sense when Intel can cram a lot more of them into the CPU for MT gains, so RPL looks more like what it should've been. But ultimately they still can't unseat AMD's 16-core behemoth, and they're still hopelessly lost on PPW. So what's the point? It's a nice gain for when you need better MT in the mid-range (12600K/13600K), but at the bottom end they're MIA and at the high end they're still not good enough. So it's just too niche in usefulness right now. If AMD decides to release a 7950X3D, then they will absolutely obliterate Intel on the high end, and I don't see what Intel can respond with.

It wouldn't be such a glum situation if Intel could execute their roadmap better, but alas, it's all just coming in too slowly.
Here we go again. What’s this got to do with AMD? Is it killing you that much that Intel are on top?

By “obliterate” do you mean 3fps at 4K? Because that’s what it will most likely be. Do you honestly think the 3D chip is not going to bottleneck a 4090? No chance. If anything it will be a few fps, unless of course you wait around in Cyberpunk in a specific street between the hours of 8pm and 8am at 720p. The 7000 series is absolutely a waste of time, never mind Intel’s E-cores, and look at the price of it and also of motherboards.

What CPU do you have? AMD by any chance and an older one at that I bet.
 
I think people expected Raptor Lake to be just an overclocked Alder Lake. It's quite clear why AMD itself is looking at hybrid cores too:

So I wonder: when AMD does it, suddenly it's acceptable?!

The E-cores make the Core i5 13600K/Core i5 13600KF easily beat the Ryzen 5 7600X in a lot of common applications, so much so that it can match or beat Ryzen 7 7700X or Ryzen 9 5900X/5950X CPUs:

When you literally have a bolted-on and slightly downclocked Core i7 9700 on the side, what did people expect would happen?

Then there are all the people who bought Ryzen 7 and Ryzen 9 CPUs because they stream or game capture. Those 8 Skylake meme cores are going to make this much better than a Ryzen 5 7600X. Then you have the fact you can use a Core i5 13600K/Core i5 13600KF in a B660 DDR4 motherboard and not lose much performance either. Yes, you could use a Ryzen 7 5800X3D in a new AM4 build with DDR4 too, but the Core i5 13600KF seems faster in productivity.
 
Whoever says that doesn't understand what E-cores do and why they exist. Long story short, E-cores allow Intel to create a big, beefy core for single-threaded/gaming workloads without having to sacrifice multithreaded performance for it.
I cannot agree, respectfully. I am sorry, but I think you are mistaken. First off, let's just summarise really quickly here (or at least I'll give you my take on the situation)... the E-core principle being deployed by Intel in its current flagship processor line is nothing more than a scam to give the false "impression" that Intel is competing with AMD on a per-core (core count)/thread basis. The reason I say scam is because gamers can't benefit from them, nor can intensive applications/demanding workloads. In addition, E-cores are very weak in benchmarking - some benchmarks are actually better if you disable the E-cores altogether. This is VERY BAD. So you quickly find out there is really no need, no purpose and no data to show their usefulness. It all hinges upon some mysterious "scheduler" that is built into the new chips that we know nothing about. One easy way to disregard the economy cores is by simply examining older processors and their architecture. This is not a revolutionary design change and it's never been needed before. Why now? Why pair your most powerful die in history with something that belongs in a Chromebook? In a word: FLUFF

So in short, E-cores are nothing more than a bad sales gimmick. Intel simply does not want to look bad by bringing only 8 cores and 16 threads to the table when their competition, the 7950X, has double that. Sure, the 13900K even outperforms the 7950X in a number of benchmarks, and everyone knows this, but think about it from a sales perspective. AMD is bringing DOUBLE the core count and DOUBLE the thread count over Intel's flagship processor. Even if that is a better chip than what AMD has to offer, it still looks bad on Intel for having HALF the cores of its competition.

Enter the economy core. I can't believe I'm saying those words. The whole thing is preposterous beyond words.

*This is my personal theory and I could be wrong, it wouldn't be the first time LOL*
 
The reason I say scam is because gamers can't benefit from them, nor can intensive applications/demanding workloads. In addition, E-cores are very weak in benchmarking - some benchmarks are actually better if you disable the E-cores altogether.
And the EXACT same applies to AMD CPUs. Do you think games and apps benefit from two CCDs with 8 cores each? No, they don't. Games specifically seem to work with one CCD completely off, and lots of apps run faster, or exactly the same, on a 7700X vs a 7950X. So, what are you talking about???
In addition, E-cores are very weak in benchmarking - some benchmarks are actually better if you disable the E-cores altogether. This is VERY BAD. So you quickly find out there is really no need, no purpose and no data to show their usefulness.

And again, the exact same thing is true for the two CCDs on the 7950X. Some benchmarks work better with one CCD off.

So, what is your point exactly?
 
And the EXACT same applies to AMD CPUs. Do you think games and apps benefit from two CCDs with 8 cores each? No, they don't. Games specifically seem to work with one CCD completely off, and lots of apps run faster, or exactly the same, on a 7700X vs a 7950X. So, what are you talking about???

And again, the exact same thing is true for the two CCDs on the 7950X. Some benchmarks work better with one CCD off.

So, what is your point exactly?
I was making the pitch that economy cores are just an Intel marketing ploy to give the impression that Intel is still competing with AMD on total core count. That's my only point. They might as well be dummy cores; that's about how useful they are. And going forward, I will be calling them dummy cores. Your logic must be called into question if you are trying to conflate a single CCD as identical in form and function to the "economy cores" found on late-generation Intel chips.

One simple question: do the CCDs have identical cores in each "chiplet"? Do the CCDs contain any economy cores whatsoever? If you answered YES and NO respectively, you should see my point very clearly.

Truth be told, we know that E-cores are a sham simply by virtue of the fact that long-established processor architectures do NOTHING of the sort. Had this "hybrid" technology actually been in use in, say, Xeon chips or Threadripper PROs, I could see the idea might hold some weight. But as I said before, this is simple marketing done by Intel to bolster thread/core counts to comparable AMD levels.

I will say it one more time. Intel is using these dummy/economy cores as nothing but "fluff" to make the processor appear more desirable for purchase because it competes with AMD on core count.

So what are you talking about? CCDs are definitely NOT economy cores.
 
I was making the pitch that economy cores are just an Intel marketing ploy to give the impression that Intel is still competing with AMD on total core count. That's my only point. They might as well be dummy cores; that's about how useful they are. And going forward, I will be calling them dummy cores. Your logic must be called into question if you are trying to conflate a single CCD as identical in form and function to the "economy cores" found on late-generation Intel chips.
You can call them whatever you like; no one is stopping you. I'm just saying those economy cores are as dummy as the extra cores on a 7950X. In each and every single workload in which the extra CCD is useful (meaning it produces better results), so are the economy cores. Period.
I will say it one more time. Intel is using these dummy/economy cores as nothing but "fluff" to make the processor appear more desirable for purchase because it competes with AMD on core count.
And someone could argue the same about the extra CCD on the 7950X: AMD is using that extra CCD as nothing but fluff to make the processor appear more desirable for purchase because it competes with Intel on multithreaded performance.
So what are you talking about? CCDs are definitely NOT economy cores.
I don't care what type of cores they are, what they are named or how useful you think they are. I care about performance. Is there a scenario where the extra cores on the 7950X make a difference - while the economy cores don't? NOPE. Not a single one. Therefore, both are equivalent in terms of usefulness. That's not even something that should be debatable, yet here you are debating it. A comparison between a 13600K and a 7600X clearly demonstrates that you are absolutely, horribly wrong. How is it possible, in your opinion, since both have 6 good cores, for the 13600K to absolutely scorch the 7600X in pretty much everything multithreaded? I'm sorry, but if dummy/economy/fluff cores make my CPU 50+% faster (that's the MT difference between those two CPUs), then by all means, fill my CPU with dummy/economy/fluff cores. ;)

I've posted a video of Spider-Man running on my 12900K; those E-cores seem to be fully utilised - I was hitting 80 to 90% utilisation constantly.

Here is the video

 