
Starfield CPU performance reviews

Just fitted the new 5800X3D with a Noctua U12A dual-fan heatsink and I'm not seeing an immediate improvement in Starfield. Temps are pretty good, not going over 50°C at the moment.

Just to check: whilst not gaming the CPU sits at 3600MHz, and it boosts to 4450MHz in Starfield at around 35% utilisation.

Also updated my Gigabyte X570 Elite BIOS to 38e, which adds support for the 5800X3D.
 

Starfield is a weird game performance-wise.

99% of other games will show you a performance uplift.

This game seems to scale with memory bandwidth instead.
 

Not surprising. It performs best on the latest Intel stuff and doesn't seem to care much about cores/threads. The 5800X3D's single-core performance is way lower than Intel 12th and 13th gen and the AMD 7000 series. It performs best on a 13900K, most likely because that is the fastest single-core CPU out there.

I mean, look at the rendering benchmarks here, and the single-core performance in Cinebench and POV-Ray:

The 5800X3D is massively slower single-core wise: the 13900K is 52% faster in Cinebench and 71% faster in POV-Ray on a single core.

I imagine those massive differences translate somewhat into Starfield's CPU performance, alongside a decent uplift from very fast RAM too.
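A quick sketch of what those percentages mean as ratios, and why only part of a single-core gap tends to show up as game FPS. The 60% "bottleneck fraction" below is a made-up illustrative number, not a measurement:

```python
# Turning "X% faster" single-core results into speed ratios, then a
# crude Amdahl-style guess at the in-game effect. The 0.60 bottleneck
# fraction is an assumption for illustration only.

def speed_ratio(percent_faster: float) -> float:
    """A chip that is X% faster does the same work in 1/(1 + X/100) of the time."""
    return 1.0 + percent_faster / 100.0

cinebench = speed_ratio(52)  # 13900K vs 5800X3D, single core -> 1.52x
povray = speed_ratio(71)     # -> 1.71x

# If only ~60% of frame time sits on the bottleneck thread, the FPS
# uplift from a 1.52x faster core is much smaller than 52%:
bound = 0.60
uplift = 1.0 / ((1.0 - bound) + bound / cinebench)
print(f"single-core ratio: {cinebench:.2f}x, estimated FPS uplift: {uplift:.2f}x")
```

With those assumed figures the estimated in-game uplift comes out around 1.26x, which is why a big Cinebench gap doesn't translate one-to-one into frame rates.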
 
Yeah I think I'll stick to original plan and go AM5 next year with some shiny new DDR5 etc. I don't think I'd notice enough of an upgrade to swap the CPU now for the sake of 6 months to a year. Of course it will be Zen 5 next year as well (I think?) so there will be more choice (and value if you go for a discounted 7000 chip).
 
I was going to upgrade myself and drop 2.5k on an AM5 system, but I've decided to delay it for a year and do a mini 3700K-to-5800X3D upgrade instead.
 
The problem I have with the V-cache CPUs is that they are limited to ~5GHz (less for the 5000 series). I hope they can avoid clock limitations for the Ryzen 8000 series. CPUs like the 7700X can be clocked quite high under air, and in many situations I think I might just rather have a higher-clocked CPU.
 
I'd say my 5800X3D and 7600 are more or less equivalent. You may as well wait and jump in at the 8000 series or a much reduced 7800X3D

According to this article, at 1080p, as of the 10th of March 2023 the 5800X3D was the 6th-fastest CPU in games (5th-fastest at 1440p).

Faster than a 7700X, 7900X and 7950X in games.

Basically, only the 13th-gen Intels and the AMD 7000-series X3Ds are faster.

Also, the 7600 is a 6-core. I'm pretty sure a few recent titles are starting to show a performance difference between 6 and 8 cores now.
 
Yes, there will be on a chart, but to be honest in normal use I don't notice it. As a stop-gap the 7600 is perfectly adequate.
 
At least I make sure both are tuned to perfection, rather than somehow making Nvidia faster in a game where every man and his dog knows that AMD GPUs are faster. Gotta get those YT clicks for money somehow, I guess.

Meanwhile, in reality. (1440P Lowest settings + spot cache Ultra)

Feel free to see if you can beat my 4090 result, @Grim5. I'll be surprised if you get within 20 FPS of it. If you do, I'll give it another run, as I can probably add a few more FPS onto the result. Regardless, it's not going to be faster than the AMD GPU - unless you are framechasers, that is.

Welcome. Robert's results I actually trust, regardless of what hardware he is using.
Those 4090 results are way over what the average user should expect to achieve. With tuned A-die memory, a Curve Optimiser-tuned 7950X3D, and an overclocked 4090, I get just over 500 in the CPU average. That's with Game Bar and Game Mode on, up-to-date chipset drivers, and a fresh install of Windows 11 with as much bloat removed as possible.

Your numbers are always way over what I get. Am I missing something settings-wise? I have friends with the exact same system, and some of their results are even worse with everything tuned.
 
Exactly the same with Oblivion, and probably Skyrim as well; the game engine just works that way.
 
I tested AMD Fluid Motion Frames (AFMF) in Starfield - almost double the FPS at 4K max settings.

I made an interesting discovery regarding AFMF, too. If the game supports FSR 3 natively, you can stack FSR 3 and AFMF together, which gives you triple the injected FPS. Seeing is believing! :cry:

As far as I can tell, there are no visual or performance issues from combining the generated frames of FSR 3 and AFMF. This will probably gain traction soon, so you heard it here first! :cool:
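The stacking claim is just multipliers. A toy sketch of the arithmetic, where each layer's factor is an assumption chosen to illustrate the "triple the FPS" figure, not a documented number:

```python
# Toy arithmetic for stacked frame generation. The 2.0x (FSR 3 FG) and
# 1.5x (AFMF on top) multipliers are assumptions for illustration;
# overhead means two "2x" layers don't actually compound to 4x.

def presented_fps(base: float, *multipliers: float) -> float:
    """Apply each frame-generation layer's multiplier in turn."""
    for m in multipliers:
        base *= m
    return base

native = 65
print(presented_fps(native, 2.0))        # FSR 3 FG alone -> 130.0
print(presented_fps(native, 2.0, 1.5))   # stacked with AFMF -> 195.0
```

With those assumed factors a 65 FPS base lands at 195 presented FPS, i.e. the roughly 3x result described above.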
 

The hurdle I can see with AFMF is needing a good frame rate to begin with. Doesn't AMD suggest having 70 FPS for 1440p? That's already playable, so why add fake frames?

And on a side note, is your GPU really pulling 460W at 3k fan speed?

The room you are in must be sweltering with 500-600W dumped into it. Must feel like having the heating on.

I can feel it after a good gaming session, and my PC kicks out about 350W when gaming.
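The heating point is easy to sanity-check: essentially all of a PC's electrical draw ends up as heat in the room. The 3-hour session length below is an assumed example figure:

```python
# Essentially all power a PC draws is dissipated as heat into the room,
# so heat output in kWh is just watts x hours / 1000. The 3-hour
# session is an assumed example.

def session_heat_kwh(watts: float, hours: float) -> float:
    """Heat dumped into the room over a session, in kWh."""
    return watts * hours / 1000.0

for watts in (350, 500, 600):
    print(f"{watts} W for 3 h -> {session_heat_kwh(watts, 3):.2f} kWh of heat")
# 350 W gives 1.05 kWh; 600 W gives 1.80 kWh - small-space-heater territory.
```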
 
That video is at 4K max settings.

It's the same with FG on Nvidia: both require a certain base frame rate, otherwise the added latency is not worth it. Once you hit a certain FPS, the latency hit is reduced and you move into a latency benefit, as the injected FPS are high enough to give you lower overall latency. The recommended minimum is 70 at 1440p, but I think 60+ works okay too.

The main benefit is that users who get this 70+ baseline FPS can now max out their high-refresh-rate monitors with higher fluid FPS. The higher the base FPS, the less latency AFMF introduces, and you can feel the extra smoothness versus native. In the video above, you can feel a noticeable difference between native 65 FPS and FSR3 + AFMF 195 FPS; the overall latency has dropped nicely due to the higher FPS, even accounting for the added latency from the tech. You can see the added FG latency in the video, as there's a special metric for it in AMD Software.
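The break-even argument can be sketched with a toy model: assume frame generation queues roughly one extra rendered frame, so its latency cost is about one base frame time and shrinks as the base rate rises. Both the model and the FPS values are illustrative assumptions, not AMD's figures:

```python
# Toy latency model: frame generation holds roughly one extra rendered
# frame, so its cost is about one base frame time. All numbers here are
# illustrative assumptions, not vendor figures.

def native_latency_ms(fps: float) -> float:
    """Input latency is roughly one frame time at the rendered rate."""
    return 1000.0 / fps

def fg_latency_ms(base_fps: float) -> float:
    """Render time plus ~one queued frame held for interpolation."""
    frame = 1000.0 / base_fps
    return frame + frame

for base in (40, 70, 120):
    print(f"base {base:3d} FPS: native ~{native_latency_ms(base):.1f} ms, "
          f"with FG ~{fg_latency_ms(base):.1f} ms")
# At 40 FPS the penalty is ~25 ms (very noticeable); at 120 FPS it is
# only ~8 ms - one reason a ~70 FPS floor is recommended before enabling FG.
```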

You can also enable Anti-Lag in AMD Software to further reduce latency. Forspoken and many other games also support Anti-Lag+, which makes even bigger reductions to latency and really helps when using this tech. It's similar to Nvidia Reflex.

Yeah, the GPU is maxed out, as it always is for YT videos. 467W is the max the GPU can draw. That's nothing compared to when I put my 4090 in the system - I can have that drawing over 600W in Cyberpunk at 4K.

I have wall mounted AC to remove the heat generated by the many computer/display and electrical components in my home office.
 
Any idea when it will be available?
 