
Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?


Aside from some shader compilation as you move through the level, Star Wars Jedi Survivor runs fine on X3D.

Wanted to put maximum strain and power draw on the CPU, so I ran it at 1080p lowest settings. Sub-70W at all times and often closer to 60W under load, not bad at all tbf.
Star Wars Jedi: Survivor Benchmark | 7950X3D + 7900 XTX | 1080P Lowest Settings - YouTube

Will be swapping out the 7900 XTX and throwing a 4090 in there soon and will put up some 4K videos of the same sequence.
 

I just tested the same sequence with a 4090 + 7800X3D, same settings, and I get maybe 10-15% less fps comparing them side by side (I didn't check the average). What surprised me is that your 7950X3D runs around 10C cooler than my 7800X3D, and I'm on a custom waterloop.
 
I need to redo the test with my 4090 as I think the AMD GPU is a bit faster at 1080P but a bit slower at 4K in this game, that's Nvidia's driver overhead in action.
 
What surprised me is that your 7950X3D runs around 10C cooler than my 7800X3D and I'm on a custom waterloop.

AFAIK, in a gaming scenario the 7950X3D will be parking the non-V-Cache CCD and acting as an 8-core chip. I wouldn't be at all shocked if they binned the absolute best silicon for those, while the parts that run hotter ended up in the 7800X3Ds.

(That said I'm just assuming that 7950X3D will run lower vcore. If both are the same then it's curious indeed. Maybe heat being generated due to memory speed differences?)
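The CCD parking described above comes down to steering game threads onto the logical CPUs of the V-Cache CCD, which is what tools like Process Lasso let you do manually. A minimal sketch of the numbering involved; the function name and the adjacent-SMT-sibling layout (core 0 maps to logical CPUs 0 and 1, and so on, as Windows enumerates it) are my assumptions:

```python
def ccd_logical_cpus(ccd: int, cores_per_ccd: int = 8, smt: bool = True) -> list[int]:
    """Logical CPU ids belonging to one CCD, assuming SMT siblings
    are enumerated adjacently (core 0 -> CPUs 0 and 1, etc.)."""
    width = 2 if smt else 1  # two logical CPUs per core with SMT on
    start = ccd * cores_per_ccd * width
    return list(range(start, start + cores_per_ccd * width))

# On a 7950X3D the V-Cache CCD is CCD0:
vcache_cpus = ccd_logical_cpus(0)     # logical CPUs 0-15
frequency_cpus = ccd_logical_cpus(1)  # logical CPUs 16-31
```

An affinity mask built from `ccd_logical_cpus(0)` is effectively what the scheduler driver applies when it parks the second CCD for a game.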
 
Does Cyberpunk just hate AMD CPUs?
GPU-Z https://imgur.com/a/37zL4rQ
Am I doing something wrong here? Just upgraded from a 5950X a few days ago. Not very impressed with my performance in Cyberpunk using the same settings as a good few posts back (dropping into the mid-70s with 55% GPU load after the game loads in more NPCs); only 10-15 fps higher than my old 5950X + 4090 when CPU bound.
Performance in Spider-Man Remastered also seems lower than expected: 80-100 fps with low GPU load.

@LtMatt's performance seems to be miles above mine in these games.

AMD 7950X3D (Stock)
Corsair H115i ELITE CAPELLIX
Asus Strix X670E-E (BIOS 1401)
Corsair Vengeance RGB 32GB 6000MHz @30-36-36-76 1.4V (EXPO 2)
RTX 4090 FE - Driver 531.68
990 Pro 2TB NVMe
Corsair AX 1000W PSU

Fresh install of Windows 11 and updated.
Chipset drivers 5.02.19.2221
2x 3D cache processes running in task manager
Newest version of Game Bar, which is definitely working as games only use the 3D V-Cache CCD.
  • KGL Version Loaded: 2240
  • KGL Service Version : 2240
Steam Overlay - Disabled
Rebar - Enabled
Hardware-accelerated GPU scheduling - Enabled
iGPU disabled in BIOS

Tried Game Mode on and off; running the game on both CCDs results in lower fps.
Even tried wiping the config folder for Cyberpunk and made a fresh save file with the same results in that location.

Cinebench R23
35759 Multi core
I have big fps drops in Cyberpunk, usually when I come out of a menu, such as the inventory, and back into the game. System is 7800X3D, 4090, 32GB RAM 6000MHz running at 4800MHz (auto BIOS settings), 980 Pro SSD.

I have been reading on Reddit that it's an issue with DLSS, the AMD CPU and the engine; apparently the issue also exists in Witcher 3, but I don't notice the drops there.
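Worth noting that 6000MHz RAM falling back to 4800 on auto BIOS settings gives up both latency and bandwidth versus the rated EXPO profile. A rough back-of-envelope sketch, assuming a typical JEDEC CL40 profile at 4800 versus the CL30 EXPO kit mentioned earlier in the thread (helper names are mine):

```python
def first_word_latency_ns(cas_cycles: float, mt_s: float) -> float:
    """CAS latency in ns. For DDR the memory clock is MT/s / 2,
    so latency = cycles / clock."""
    clock_ghz = (mt_s / 2) / 1000
    return cas_cycles / clock_ghz

def peak_bandwidth_gb_s(mt_s: float, bus_bytes: int = 16) -> float:
    """Dual-channel DDR5: 2 x 64-bit channels = 16 bytes per transfer."""
    return mt_s * 1e6 * bus_bytes / 1e9

print(first_word_latency_ns(30, 6000))  # EXPO: 10.0 ns
print(first_word_latency_ns(40, 4800))  # JEDEC fallback: ~16.7 ns
```

So the auto-BIOS fallback costs roughly two thirds more CAS latency (10 vs ~16.7 ns) on top of the bandwidth drop from 96 to 76.8 GB/s, which matters in CPU-bound games.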
 
I need to redo the test with my 4090 as I think the AMD GPU is a bit faster at 1080P but a bit slower at 4K in this game, that's Nvidia's driver overhead in action.

Yep, from other videos it seems AMD GPUs run the game better compared to Nvidia, so that's not surprising.

How do you get the 7950X3D to run at such low temps? Do you have an open test bench, or does the CPU just run cooler than the 7800X3D?
I'm running the 7800X3D with PBO enabled and -30 CO on all cores. I'm finding that the 7800X3D runs hotter than expected. In games it's mostly over 60C, which is clearly still fine.
I know temps are influenced by many variables, but I didn't expect such a big difference when comparing the same game, same settings, etc.
 
Using a Corsair H150i Elite LCD, 100% fan speed and liquid metal on the CPU lid.

I have an O11 Dynamic with the side panels off as otherwise the GPU dumping heat inside the case increases CPU/GPU temps significantly.

Also the 7950X3D (and the other dual CCD AM5 parts) can sometimes run a little cooler as the heat load is a little more spread out rather than centralized on one CCD.
 

I too have an O11 Dynamic, but I'm running the waterloop with 3 fans + rad on the side for intake, and 3 fans + rad at the top for exhaust (no fans at the bottom).
I've rerun that section of the game without the side panel and with the fans at full speed, and temps went down by 3-4 degrees.
So it's probably a combination of all those things you mentioned, with the liquid metal also playing a big role.
 
I have big fps drops in Cyberpunk, usually when I come out of a menu, such as the inventory, and back into the game. System is 7800X3D, 4090, 32GB RAM 6000MHz running at 4800MHz (auto BIOS settings), 980 Pro SSD.

I have been reading on Reddit that it's an issue with DLSS, the AMD CPU and the engine; apparently the issue also exists in Witcher 3, but I don't notice the drops there.

I think that's a frame generation issue that appears to affect AMD CPUs only. I see it in Witcher 3 but not in Cyberpunk. Capping fps in W3 to 50 (100 with FG) seems to sort it out. I think @LtMatt has seen it in Cyberpunk.
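The 50-fps cap working out to 100 follows from frame generation presenting one interpolated frame per rendered frame, so the cap applies to the render rate, not the presented rate. A trivial sketch of that arithmetic (function names are mine):

```python
def presented_fps(render_cap: float, frame_gen: bool = False) -> float:
    """Frame generation inserts one generated frame per rendered frame,
    so presented fps is double the render cap when it is enabled."""
    return render_cap * 2 if frame_gen else render_cap

def frametime_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds."""
    return 1000.0 / fps

print(presented_fps(50, frame_gen=True))  # the W3 cap: 100 presented fps
print(frametime_ms(50))                   # 20.0 ms render budget per real frame
```

The cap helps because it keeps the CPU comfortably inside that 20 ms budget, so the generated in-between frames are paced evenly instead of stuttering after menu transitions.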


Somewhat similar experience here in those games, but it is a decent bit better than the 5800X/X3D was.

 
I have big fps drops in Cyberpunk, usually when I come out of a menu, such as the inventory, and back into the game. System is 7800X3D, 4090, 32GB RAM 6000MHz running at 4800MHz (auto BIOS settings), 980 Pro SSD.

I have been reading on Reddit that it's an issue with DLSS, the AMD CPU and the engine; apparently the issue also exists in Witcher 3, but I don't notice the drops there.
Yeah, that's an issue with Frame Generation where the game stutters coming out of the menu. I have that disabled here.
I just mean the actual CPU performance.
Take a look @LtMatt

His performance seems miles ahead of mine, with the same GPU and CPU.
I don't know if he has "Crowd density" on High like I do, which could explain the large performance difference. I presume he has HDD mode disabled though.

Now mine....

On a side note, it's crazy how much the 12900K and 13900K utterly demolish AMD CPUs in this game though.
I remember CD Projekt Red used an Intel compiler that massively reduced AMD chip performance at launch. Probably still the case.
13900K vs 7950X: almost a 60% difference at times.
 
Can I ask what chipset drivers people are using? I was told by one dude on Reddit that I needed to download the chipset drivers from ASUS Armoury Crate over the AMD drivers. Apparently the ones ASUS have are more current than AMD's?
 
Use the chipset drivers from AMD's website. Don't touch "Armoury Crate" unless you have to. It might as well be malware and is a complete pain to remove.
 