5700X3D and AM4 pricing

That looks incredibly disappointing. I don't know if I'll even bother eventually chucking a 5800X3D in next year over my 5700X, as it's not going to be 'that much' better than this 5700X3D :(
I only planned on doing it so I could chuck the 5700X into my 2nd machine, an MITX build, as it'd run nice and cool/quiet versus a 5800X/X3D... But I don't 'need' to upgrade either tbf, I just like giving the 2nd rig free upgrades, and it makes wasting spending money easier :cry:

Hang on, the 5700X3D is 3GHz base/4.1GHz turbo vs the 5700X's 3.4GHz/4.6GHz :confused: Why do they do this? Bit scummy :(

Wonder if I'll bother with a 5800X3D now; the 4070 is always at 96-100%, so there doesn't seem to be a bottleneck, and everything I use it for is SP via tv/controller and easily plays anything at 4K DLSS, capped at 60 for the 4k60 tv.

Some takeaways that are pro X3D from my above benchmarks:
- Cyberpunk GPU utilisation went from 96% to 99% with a 5 FPS increase
- TLOU GPU utilisation went from 95% to 100% with an 8 FPS increase
- The Shadow of the Tomb Raider benchmark showed CPU frames rendered etc went up massively, even though the final result was -1 FPS (margin of error and GPU limitation)
- Rise of the Tomb Raider had a weird minimum in Syria of 13 fps on the 5700X, while the minimum was 90 fps on the X3D. I wonder if the benchmark is wrong, as the only noticeable drop is at the start of Geothermal Valley; possibly the drop is incorrectly attributed to the Syria part of the benchmark
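For anyone wanting to capture utilisation numbers like these themselves, this is roughly how I'd log them during a benchmark run. A minimal sketch assuming an Nvidia card with nvidia-smi on the PATH; the one-second poll and plain print are just placeholder choices, not how any particular overlay does it:

```python
import subprocess
import time

# Poll nvidia-smi once a second for GPU utilisation and board power.
# Run it alongside the in-game benchmark, then eyeball the output.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    util, power = out.split(", ")
    print(f"{time.strftime('%H:%M:%S')}  GPU {util}%  {power} W")
    time.sleep(1)
```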

The temperature and power results baffle me. They're so much lower in games compared with the 5700X, yet so much higher in Cinebench (power draw goes up to 102W with the 5700X3D, which is past the 88W PPT cap that 65W TDP AM4 chips get, so it really is a 105W TDP chip like the 5800X3D).
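For what it's worth, the arithmetic behind that: AMD's stock AM4 limits give 65W TDP chips an 88W package power (PPT) cap and 105W chips a 142W cap, so a 102W reading is only possible on the 105W class. A quick sketch (the 102W figure is just the peak from my run above):

```python
# AMD's stock AM4 package power (PPT) limits per TDP class.
PPT_LIMIT_W = {65: 88, 105: 142}

observed_w = 102  # peak package power seen in Cinebench

for tdp, ppt in PPT_LIMIT_W.items():
    verdict = "fits" if observed_w <= ppt else "exceeds the cap"
    print(f"{tdp}W TDP class (PPT {ppt}W): {observed_w}W {verdict}")
# 102W blows past the 88W cap of the 65W class, so the 5700X3D
# has to be configured as a 105W TDP part like the 5800X3D.
```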

I wouldn't be surprised if the biggest benefit from this upgrade is psychological. I was always wondering with my 5700X whether any drops I saw could have been avoided if I had an X3D chip. Plus I was also conscious of my sub-optimal RAM (3000 MHz CL15), which an X3D chip mitigates better than the non-X3D chips.

Time will tell if this upgrade is worth the £110+ cost (depending on the resale value of my 5700X). I'm guessing it wasn't in my case other than the psychological benefit of maxing out AM4 as mentioned above :cry:
 
Personally, 5-13 FPS isn't worth £230 to me, or the bit more (£269.99) for a 5800X3D.

I haven't had any issues with my 5700X in SOTTR personally with my 4070.

Yeah those temps suck, and we shouldn't have to 'tweak' them in the BIOS either. I don't mind doing it, but it's a bit half-arsed for 'out of the box' TBH, isn't it? I imagine it starts throttling around those 80C temps and above?

Yeah for my use, purely gaming, it seems like it's going to be a waste of money, but I'll still do it so I can throw the 5700X into my MITX rig that has a 3600 in it. That rig doesn't really 'need' it either, but if I do it, it means I can chuck a small dual-fan 6700 XT in and be able to play CS2 at the desk (as the main rig in my sig lives under the tv and is only used for SP).

FWIW, when the rig in my sig with the 5700X lived briefly at the desk, I had no issues getting (for example) 465FPS at 1440P native, with no bottlenecks/issues with 1% lows and all that jazz, so again I doubt an X3D would give me an edge there either...

I have 3600MHz CL16 LPX, so maybe a 5800X3D would enjoy that?
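Back-of-envelope, the first-word latency is CAS cycles divided by the memory clock (half the DDR transfer rate), which puts the two kits mentioned in this thread like so (a rough sketch, ignoring secondary timings):

```python
# First-word latency in ns: CAS cycles / memory clock,
# where the memory clock is half the DDR transfer rate.
def first_word_latency_ns(transfer_mt_s: int, cas: int) -> float:
    return cas * 2000 / transfer_mt_s

print(first_word_latency_ns(3000, 15))  # 10.0 ns (3000 CL15)
print(first_word_latency_ns(3600, 16))  # ~8.9 ns (3600 CL16)
# 3600 CL16 is a touch quicker, and DDR4-3600 also lets the
# Infinity Fabric run 1:1 at 1800MHz, the AM4 sweet spot.
```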
 
If you are playing at a capped 60 fps then there's likely no point in upgrading. In most games that shouldn't be too difficult a task for your current CPU. Sure, there might be the odd game that's helped by the cache, especially on the 1% lows, but for the most part it's unnecessary. Ofc if you switch displays to something with a much higher refresh rate and are then chasing a target of 120 fps or 144 or even higher, then it would make more sense. :)
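To put rough numbers on that: the CPU's per-frame time budget shrinks fast as the target climbs, which is where the extra cache starts to earn its keep. A quick sketch of the arithmetic, using the targets mentioned above:

```python
# Frame-time budget the CPU has to hit at each FPS target.
for target_fps in (60, 120, 144):
    budget_ms = 1000 / target_fps
    print(f"{target_fps} fps -> {budget_ms:.1f} ms per frame")
# 60 fps -> 16.7 ms, 120 -> 8.3 ms, 144 -> 6.9 ms: at 144Hz the
# CPU gets well under half the time per frame it gets at 60Hz.
```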
 
Yeah that's what I thought mate. This rig is purely used for 4k60 tv SP gaming with a controller :) So I just cap the fps to 60.

The 4070 reaches 96-100% in all games, at either native 4k for slightly older games or 4k dlss3.5/frame gen/reflex for anything new. I've yet to find anything where it doesn't hit 60fps+ even with the highest graphics settings :) So I'm pretty happy 'for what I want' from it.

This will always be used as a tv SP gaming rig, and I've no plans to change my tv :D
The tv also does 1440P, so at worst in the future I can just run 1440P with dlss/frame gen/reflex as required...
I reckon this rig will tide me over a good few years yet; in Sept it'll be a year since I finished building it, so 3-4 years should be satisfying enough before upgrading again/donating parts to my 2nd rig, which is an MITX AM4.
When I eventually 'need' a more powerful GPU, I'll chuck the 5800X3D in; if it's good enough for a 4090, I can't see any issue with anything newer next generation.
 
It's not just 5-13 FPS...

PUBG, as @ALXAndy shows, gains around 40.
Squad, for me: the only change was the CPU and I went from 80/90 to 125/140.

Yeah, massive difference even at 1440p when you want every frame you can get. I came from a 3000 series (and the fastest of them all on clock speed, the 3950X) and it was well worth doing. Whether I would do it if I were on a 5000? I doubt it.

It's not the clock speed that makes these fast gaming chips. That really has little to do with anything. It's the cache. Even the 5600X3D (if you can get it) shows enormous gains.
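A crude way to see why cache trumps clocks: identical work slows right down once the access pattern stops being cache-friendly. A rough NumPy sketch (array size and exact timings are arbitrary; the gap is the point):

```python
import time
import numpy as np

# Same number of reads, different order: sequential access streams
# through cache lines, random access misses the caches constantly.
n = 50_000_000
data = np.arange(n, dtype=np.int64)
orders = {"sequential": np.arange(n), "random": np.random.permutation(n)}

for name, idx in orders.items():
    t0 = time.perf_counter()
    data[idx].sum()
    print(f"{name}: {time.perf_counter() - t0:.2f} s")
# The random pass is typically several times slower: same work,
# worse locality. A bigger cache (the X3D's 96MB L3) pushes out
# the point where a working set stops fitting.
```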

Now obviously that doesn't hold true across all games, but it sure seems to with the UE game I play on it (PUBG). Everything else I play is balls old, like HL2 (completed it 12 times last year).

Also, it should be kept in mind that I do not use max settings in PUBG, as that just makes it harder. IE, tree density and eye candy make it far harder to see people, which makes the game ridiculous. So the work all comes down to the CPU, even at 1440p. It's not a tremendously hard game to run by any means, but keeping it well above 144 FPS at 1440p is no mean feat for a CPU. My GPUs do that easily with the settings I use.

I couldn't play it like Tiggleton does though. Jesus, he must go boss-eyed at that res and those settings. Just a blurred mess.
 
Thinking about picking one of these up for the winter. Not really playing anything too taxing at the moment, so my old trusty 3600X still maxes out everything I play at 144Hz/1440p. But I would like to do another playthrough of Cyberpunk and finish off The Last of Us, which both play at a locked 72fps, but Cyberpunk definitely gave the old 3600 a workout (with the odd setting tweak).

I'd pick one up now, but I know it will sit on the desk for months as I don't really play on the computer once spring/summer is here.
 
I can’t see a 5800X3D being fast enough for a 5080. You’ll definitely be dropping a lot of performance with an Nvidia card.
Yeah I agree. Tbh I was leaning more towards a 2nd-hand 4090 next year if I bother upgrading; old or not, they'll demolish 4K DLSS/FG/Reflex at 60fps 1:1 with my tv for many, many years. As I say though, the 4070 has zero issues playing anything at those settings currently :) So there is no need to change anything, as it's still achieving higher than 60fps doing the aforementioned, so there's a fair bit of headroom :)

The beauty of my situation is I can always upgrade without a real loss, due to having the 2nd AM4 rig that I can donate the main rig's parts to as and when I 'need' to upgrade :D
 
Now I've flogged my 5700X for £130, so it was 'only' a £100 loss going to the X3D, which helps ease my buyer's remorse.

Looking at my Cyberpunk and TLOU examples, I might have gone from slightly CPU limited to perhaps having way more untapped CPU potential.

TLOU: 5700X was 72% utilisation, dropped to 52% on the X3D (-18W), +8 FPS
Cyberpunk: 5700X was 75% utilisation, dropped to 58% on the X3D (-13W), +5 FPS

Not sure if I should read too much into that. With DX12 Shadow of the Tomb Raider, I saw my FX-8350 go to 100% utilisation across all cores and threads for the first time in a game, and it was still a 30 fps experience in crowded areas, so perhaps I shouldn't expect future games with higher CPU workloads to see FPS scale linearly with CPU utilisation.
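One more reason not to read too much into aggregate utilisation: a game pinned on its main thread can be fully CPU-limited while the overall number looks modest. A toy illustration with made-up per-thread loads:

```python
# Hypothetical per-thread loads on a 16-thread chip: one saturated
# main thread, a few busy workers, the rest mostly idle.
loads = [100, 60, 45, 30] + [5] * 12  # percent per hardware thread

aggregate = sum(loads) / len(loads)
print(f"aggregate utilisation: {aggregate:.0f}%")  # ~18%
# Task Manager would call that ~18% 'CPU usage', yet the pegged
# main thread is what actually caps the frame rate.
```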

Perhaps I also should have benchmarked at lower resolutions. Probably should have benchmarked games that are a dumpster fire when it comes to CPU optimisation, like Starfield and Hogwarts Legacy, but I don't own those and fortunately am not interested in them.
 
It’s pretty outdated now, although I do still have the motherboard.
Have you upgraded since? What are you running now? If not, do you find you can still get done what you need? I've seen people with that gen of Ryzens running 7900 XTs haha, mad when you think about it! Bottleneck MUCH!
But then tbf, it depends what you want to do :)
 
I use a few systems. That board has had a few chips in it since.
 