
Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?

  • Total voters: 191
  • Poll closed.
Only had the misfortune to suffer Intel's big.LITTLE copy on my work laptop, but there are plenty of things which run better with the E-cores turned off, and plenty which don't. With the E-cores off, one particularly complex SQL query took 35 minutes versus 15 minutes with them on (and 11 minutes on my old "ancient" Haswell laptop). Still, I have left the E-cores on as other tasks do use them.
Not gaming loads, but I am pretty disappointed that Intel does not "work without any of this".
For gaming it works. I've literally never had an issue with anything other than Star Citizen, and you can turn them off with Scroll Lock and it's sorted.

If this 7950X3D worked better with them just left on then that would be great, but it doesn't.

I’m not wanting to turn this into another Intel vs AMD argument. There are lots of negatives on the Intel side also.
 
Only had the misfortune to suffer Intel's big.LITTLE copy on my work laptop, but there are plenty of things which run better with the E-cores turned off, and plenty which don't. With the E-cores off, one particularly complex SQL query took 35 minutes versus 15 minutes with them on (and 11 minutes on my old "ancient" Haswell laptop). Still, I have left the E-cores on as other tasks do use them.
Not gaming loads, but I am pretty disappointed that Intel does not "work without any of this".


Yeah, I agree. The E-cores stink for gaming. Though AMD's dual-chiplet approach with 3D cache on only one CCD is also a mess, and sadly even worse and more trouble than Intel leaving E-cores on, even in Win10. I do not like either approach; both are problematic.

Why oh why, AMD, did you not just release the 7800X3D now? I may need to switch mobos in an attempt to combat RTX 4090 coil whine by switching up components yet again, and I have had nothing but DDR5 XMP trouble on the Intel side. So AMD is my last hope.

And the only way is to buy a grossly expensive 7950X3D and disable one CCD for the best 8-core gaming CPU in existence, though a well-tuned 13900K with fast DDR5 and E-cores off may beat it in many cases. But can I get reliable DDR5 XMP? Maybe I need to turn down the ring frequency, which I never thought would impact DDR5 stability on Asus boards?
 
If you add voltage using Curve Optimizer (+ instead of -), in theory SP might improve, but performance will almost certainly go down. The SP rating should be judged off BIOS default settings.

Changing stuff in both the BIOS and Ryzen Master is generally a bad idea. Pick one and stick with it; I prefer the BIOS, but Ryzen Master can be handy for monitoring.

Put the BIOS back to default settings, dial in your XMP or memory settings, and rerun the test. Assuming your memory is stable, y-cruncher should pass.

For diagnosing which core is which, use this table.

core 0 = logical cores 0/1
core 1 = logical cores 2/3
core 2 = logical cores 4/5
core 3 = logical cores 6/7
core 4 = logical cores 8/9
core 5 = logical cores 10/11
core 6 = logical cores 12/13
core 7 = logical cores 14/15
core 8 = logical cores 16/17
core 9 = logical cores 18/19
core 10 = logical cores 20/21
core 11 = logical cores 22/23
core 12 = logical cores 24/25
core 13 = logical cores 26/27
core 14 = logical cores 28/29
core 15 = logical cores 30/31
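
If you want to pin a stress test to a single physical core, that mapping translates directly into a Windows affinity mask. A minimal sketch in Python, assuming SMT is enabled and the core N = logical 2N/2N+1 layout above; the helper name is just for illustration:

# Build a Windows affinity mask covering both logical cores (SMT
# siblings) of one physical core, per the core N -> 2N/2N+1 layout.
def affinity_mask(physical_core: int) -> int:
    return (1 << (2 * physical_core)) | (1 << (2 * physical_core + 1))

# Print the mask for each of the 16 physical cores; the hex value can
# be passed to 'start /affinity <mask> <app>' to pin a test to that core.
for core in range(16):
    print(f"core {core} = logical {2 * core}/{2 * core + 1}, mask 0x{affinity_mask(core):X}")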
Thanks,

Ryzen Master uninstalled for now; just set everything up fresh in the BIOS. The issue, it seems, was just not appreciating the logical-to-physical core mapping, so I was adjusting the wrong one!

Clearing CMOS did reset SP back to 110, and all looking good!
 
Finally, the cheap boards Lisa promised are here, only 5 months later.

No surprise it's from ASRock. I'm actually surprised it doesn't have both DDR4 and DDR5 banks; they have a history of doing crazy stuff like that.
 
Yeah, I agree. The E-cores stink for gaming. Though AMD's dual-chiplet approach with 3D cache on only one CCD is also a mess, and sadly even worse and more trouble than Intel leaving E-cores on, even in Win10. I do not like either approach; both are problematic.

Why oh why, AMD, did you not just release the 7800X3D now? I may need to switch mobos in an attempt to combat RTX 4090 coil whine by switching up components yet again, and I have had nothing but DDR5 XMP trouble on the Intel side. So AMD is my last hope.

And the only way is to buy a grossly expensive 7950X3D and disable one CCD for the best 8-core gaming CPU in existence, though a well-tuned 13900K with fast DDR5 and E-cores off may beat it in many cases. But can I get reliable DDR5 XMP? Maybe I need to turn down the ring frequency, which I never thought would impact DDR5 stability on Asus boards?
How do the E-cores stink for gaming? What games are you playing?
 
For 1% lows it makes no difference, and if a thread gets caught on an E-core, ouch. It's like a thread getting stuck on the other AMD CCD. No games at all have any significant benefit from more than 8 cores/16 threads; very few benefit even from more than 6 cores/12 threads at 4K.


Those numbers already hammer AMD, and the E-cores are off, as GoodOldGamer always turns them off since they do nothing for gaming. And yes, they stink for gaming if game threads get stuck on them. If game threads avoid them they really do not matter, but ouch if game threads get caught on too many of them.

So AMD CPUs must really stink badly, as they get beaten badly by Intel CPUs with E-cores off lol.
 
No games at all have any significant benefit from more than 8 cores/16 threads
That applies to almost all games, but not all tbf. There are a few outliers that do benefit from more than 8 cores in specific scenarios. Think Spiderman and Cyberpunk, when running RT at maximum and having the GPU grunt (a 4090) to drive the frame rate over 100 FPS. It's niche, but it's still valid.
 
That applies to almost all games, but not all tbf. There are a few outliers that do benefit from more than 8 cores in specific scenarios. Think Spiderman and Cyberpunk, when running RT at maximum and having the GPU grunt (a 4090) to drive the frame rate over 100 FPS. It's niche, but it's still valid.

Could it happen though at 4K with an RTX 4090, or only at lower resolutions?

I mean, I doubt those games maxed out with RT on at 4K are going to push 100 FPS or more. Heck, most will struggle even to hit the 80s, and will probably be closer to 60 FPS or sometimes worse.
 
Could it happen though at 4K with an RTX 4090, or only at lower resolutions?

I mean, I doubt those games maxed out with RT on at 4K are going to push 100 FPS or more. Heck, most will struggle even to hit the 80s, and will probably be closer to 60 FPS or sometimes worse.
It's less common at 4K, unless you have a weak CPU that is not suited to being paired with a 4090.

It can happen at 4K if you use DLSS to push the frame rate high enough that the CPU becomes the limitation. It applies just to those games though, for now.

You could argue that's not running at 4K (using DLSS) and that's a perfectly legitimate argument tbh.
 
It's less common at 4K, unless you have a weak CPU that is not suited to being paired with a 4090.

It can happen at 4K if you use DLSS to push the frame rate high enough that the CPU becomes the limitation. It applies just to those games though, for now.

You could argue that's not running at 4K (using DLSS) and that's a perfectly legitimate argument tbh.

I see in your sig you have a 7950X3D. Do you have both CCDs enabled, or just the 3D one, which in effect makes it a 7800X3D?

And if both are enabled, are you able to get game threads only on the 3D cache CCD so it performs just as well as the upcoming 7800X3D?

I have heard it is a nightmare and the AMD driver does not work, though Process Lasso could be used. Even with non-X3D dual-CCD chips, I would still want game threads on one CCD due to the latency penalty of crossing CCDs.
 
I see in your sig you have a 7950X3D. Do you have both CCDs enabled, or just the 3D one, which in effect makes it a 7800X3D?

And if both are enabled, are you able to get game threads only on the 3D cache CCD so it performs just as well as the upcoming 7800X3D?

I have heard it is a nightmare and the AMD driver does not work, though Process Lasso could be used. Even with non-X3D dual-CCD chips, I would still want game threads on one CCD due to the latency penalty of crossing CCDs.
You don't need to do anything yourself (other than install the chipset drivers, update Game Bar, and not disable Game Bar/Game Mode); it's all handled automatically.

Most games will use the CCD with the cache on it, as it's faster; however, there are a few exceptions where certain games prefer higher clock frequency over cache. In those cases the game will use the second CCD for better performance.

If a game requests the use of more than 8 cores, see Spiderman as an example, it'll use both CCDs.

Whatever you heard is wrong, most likely parroted by someone that has never used the processor.
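
If you ever do want to force a game onto the cache CCD by hand (essentially what Process Lasso does), setting affinity is enough. A minimal sketch using Python and the third-party psutil package, assuming the V-cache CCD is CCD0 (logical cores 0-15 with SMT on) and a hypothetical game.exe process name:

import psutil

# Logical cores on CCD0, assumed here to be the V-cache CCD:
# physical cores 0-7 map to logical cores 0-15 with SMT enabled.
CACHE_CCD = list(range(16))

def pin_to_cache_ccd(process_name: str) -> None:
    # Restrict every matching process to the cache CCD's logical cores.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(CACHE_CCD)
            print(f"Pinned PID {proc.pid} to logical cores 0-15")

pin_to_cache_ccd("game.exe")  # hypothetical executable name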
 
You don't need to do anything yourself (other than install the chipset drivers, update Game Bar, and not disable Game Bar/Game Mode); it's all handled automatically.

Most games will use the CCD with the cache on it, as it's faster; however, there are a few exceptions where certain games prefer higher clock frequency over cache. In those cases the game will use the second CCD for better performance.

If a game requests the use of more than 8 cores, see Spiderman as an example, it'll use both CCDs.

Whatever you heard is wrong, most likely parroted by someone that has never used the processor.


Are they wrong there?
 
That applies to almost all games, but not all tbf. There are a few outliers that do benefit from more than 8 cores in specific scenarios. Think Spiderman and Cyberpunk, when running RT at maximum and having the GPU grunt (a 4090) to drive the frame rate over 100 FPS. It's niche, but it's still valid.
Even Warzone 2. I was ashamed to post the video of Warzone 2 with E-cores off; it dropped to 100 at times :cry:
 
It's less common at 4K, unless you have a weak CPU that is not suited to being paired with a 4090.

It can happen at 4K if you use DLSS to push the frame rate high enough that the CPU becomes the limitation. It applies just to those games though, for now.

You could argue that's not running at 4K (using DLSS) and that's a perfectly legitimate argument tbh.

Totally legitimate, as you say. The number of PCMR types who were banging on about PS5 checkerboarding not being true 4K, yet are running fake 4K themselves when they use DLSS, is amusing.
 