14th Gen "Raptor Lake Refresh"

Well, TBH I think right now Intel is the best choice for a *new* DDR4-based setup, while on DDR5 it's IMHO a tie between 13th gen and 7kX3D when you focus on performance/$.

If I had unlimited money I'd probably go for a 7950X3D but if I have to buy right now on a reasonable budget I'll go for 13700k and DDR4.
 
The CPU area is quickly becoming a cesspit (I guess it already is). Worse than the GPU section.
Both sides are as bad as each other. It's never enough to accept that the CPU space is pretty awesome at the moment. We always have to say one is crushing/destroying/owning/embarrassing the other.
Can you guys just calm down and use the piece of silicon you've bought, rather than grabbing a box of tissues and lube, and getting into arguments? :D
 
Just wait until a new player like Zhaoxin starts making decent CPUs, or Microsoft finds a way to make ARM viable for gaming; it will be like silicon wrestling!
If we're lucky the late 2020s will be like the late 1990s, with multiple players in both CPUs and GPUs... I miss that old experimental age...
 
That's what you do all the time; Dave2150 has exposed you multiple times.
On what? Nonsense.
Do you really think anyone believes your story about switching from a 13900K to a 12900K? xD
I don't really care whether anyone believes me or not. I have the receipts if you want them.

I also have benchmarks on my YouTube channel.

But as I've said, facts don't matter to you :D
 

Right now, in all honesty, if I was building a new system I would seriously look at a 13700K; they are just a little expensive. If the 13600K had 8 real cores I would buy that at £300, and it would be a DDR5 system.

The E-Cores are meaningless to me; in fact I don't like them. Take the 8 E-Cores off and give me 8 real cores instead. The only real reason Intel are doing this stupid design is that it's the only way they can keep up with AMD in MT. Stupid, stupid bloody things.

Seriously, I like Intel's current CPUs; overall I think they are good, but those E-Cores are the dead mouldy fly in that delicious soup.

@Bencher Intel are in deep ####. AMD are just slapping them about like a kipper. That's simply a fact.

 
This is an old(ish) screenshot. But it does show the E-cores of my 12700k being utilised.

[Attached screenshot: 20211210100007-1.jpg]


Do you mean that you feel the E-Core "concept" is flawed, rather than that they aren't being used? I know you didn't mention that :)
 
IMHO E-cores have been a necessary development step; after all, you don't go from monolithic to quasi-SoC in one go.
We're only a few generations away from die shrinks becoming uneconomical for the consumer sector, and you can only optimise a general-purpose CPU so much, so it makes sense to find ways to move to a cluster of specialised chips that are more efficient for specific use cases and can be switched off to save power when not needed.
In a way it's nothing truly new; Intel has done something similar before: https://en.wikipedia.org/wiki/Intel_80387SX
 
I think they complicate matters, and I don't like complicated. I've been doing this for 30 years and have had some finicky components in my time: ATI GPUs, Bulldozer and Socket 478 Pentium CPUs...
I'm pushing 50 now; I'm done with all that.
I plugged my 5800X in about 3 years ago, spent a few weeks tuning the memory to within an inch of its life, from 3200MHz to 3800MHz, and then left it. I haven't been in the BIOS since, because it just works, flawlessly. And these days that's how I like it.

Those E-Cores require some work from Intel and/or the game developer for a game to work properly. Intel can't always be there, and I like to play games that are very indie, i.e. a million miles from EA and such publishers; some are obscure, still under development, or very old.

There are still problems with these CPUs in such games. I don't like those E-Cores; to me they feel like hackery.
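For what it's worth, the usual workaround when an older or indie game misbehaves on a hybrid CPU is to pin the game's process to the P-cores only. A minimal sketch of that idea, assuming a hypothetical topology where the OS enumerates the P-core threads first (e.g. a 12700K: 8 P-cores with Hyper-Threading as logical CPUs 0-15, then the E-cores):

```python
import os

def p_core_cpus(p_cores=8, smt=2):
    # Logical CPU indices assumed to belong to the P-cores, under the
    # assumption that P-core threads are enumerated before E-cores.
    return set(range(p_cores * smt))

def pin_to_p_cores(pid=0, p_cores=8, smt=2):
    # Restrict a process to the P-cores only (Linux; no-op elsewhere).
    wanted = p_core_cpus(p_cores, smt)
    if hasattr(os, "sched_setaffinity"):
        # Only pin to CPUs that actually exist on this machine.
        target = wanted & os.sched_getaffinity(pid)
        if target:
            os.sched_setaffinity(pid, target)
    return wanted
```

On Windows the equivalent is Task Manager's "Set affinity" or `SetProcessAffinityMask`; the core-numbering assumption above would need checking against the actual machine's topology.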
 
The "E-Cores are efficient" argument is just Intel contriving a reason for something that exists for a reason they don't want to admit to; it's typical PR.
It's like Intel making a big thing of being able to turn tiles off on their MCM CPUs, as if to say "look at our high-tech efficiency engineering". That was mentioned to an AMD guy, who just giggled and said "We do that even on our monolithic APUs, and have done for years". It's PR, hyping something everyone is doing as if it were something fantastic. Intel have always been shouting and screaming "we are awesome, look at me" like an attention-seeking toddler, more than anyone else, and often to their detriment, because AMD are always listening to what they are shouting about, with an eye on five years' time...
 
I have more than ten years on you, lol :)
I get simplicity; perhaps it also reflects my own lack of interest in "exotic" tweaks to optimise performance. Hence, to a certain point, a little hesitation in trying an AMD platform for a desktop build over an Intel one. Maybe that is influenced by familiarity.
I will be looking to build another desktop soon, and as much as I am tempted by an AMD build I might just end up going with an Intel one, seeing what their refresh will offer.

Testing out a Red Devil LTD edition 6800XT was nothing but issues for me: great with AAA titles, but much less so with lesser-known and indie games. That is why it was refunded and an RTX 3070 bought. Not my first choice, but it just worked.
 
What would you rather have

6 P-Cores + 8 E-Cores?
or
8 P-Cores?
I would go for 6+8, as it would probably perform a lot better. Lots of threads/software don't need big cores, so it makes sense to have low-power cores. AMD will add some sort of E-Core too; probably low-power normal cores first, but at some point they will go full-on E-Cores.
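As a rough sanity check of the 6+8 vs 8 trade-off: if you assume an E-core delivers about half the multithreaded throughput of a P-core (a hypothetical scaling factor for illustration, not a measured number), the arithmetic does favour the hybrid layout:

```python
def mt_throughput(p_cores, e_cores, e_ratio=0.5):
    # Aggregate multithreaded throughput in "P-core units".
    # e_ratio is a hypothetical E-core vs P-core scaling factor.
    return p_cores + e_cores * e_ratio

hybrid  = mt_throughput(6, 8)   # 6 + 8 * 0.5 = 10.0
classic = mt_throughput(8, 0)   # 8.0
print(hybrid > classic)         # True at this assumed ratio
```

Of course this only models embarrassingly parallel work; for lightly threaded games the 8 P-core part would still have the edge, which is rather the point of the disagreement.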
 
Going slightly off topic, but yeah, I have been with Nvidia for 8 years now. I am willing to give AMD's GPUs a chance though when I upgrade my 2070S, which is long overdue; it's likely to be AMD this time, and maybe an Intel CPU too next year :D maybe...
 
If these E-Cores were really about efficiency, it would show in notebooks versus AMD; it doesn't....

AMD will be doing a similar thing with Zen #C; they say it's just to cluster more cores together with denser packaging, and that they are for hyperscalers, not for desktops or notebooks; they will not be retail at all.
 
Indeed it is somewhat off-topic; last point from me.....



[Attached image: AMD-for-Workers.jpg]


I'm not particularly looking to assign blame, either to AMD or to some of the indie developers. It is what it is. The developer of the above game has only recently acquired an AMD GPU to try to resolve some of the crashing and other texture anomalies. However, there were other indie titles I play where the performance was subpar, to say the least.
In the above game my 1080Ti performed better, and was totally stable, compared with the 6800XT.
It is what it is, annoyingly; not helped by my own limited skills and the games I play, but Nvidia just works for me. Bah, in some ways.

Good luck when you decide to change......
 
I think Microsoft's strategy is AMD and x86.
 