5700X3D and AM4 pricing

I felt ready to pull the trigger on this, but seeing it's only 12% cheaper than the best 5800X3D deal is throwing a spanner in the works.

I went for a 5700X last year back when the 5800X3D was double the cost of it. I felt I made the right choice at the time, especially with my relatively weak cooler. Surprised to see the 5800X3D so competitively priced now, to the point it's making the newly launched 5700X3D almost unjustifiable to purchase.

I'm wondering if AMD is absolutely awash with these AM4 X3D chips and whether the prices can only continue to fall, especially the 5700X3D's, which needs to put some distance between itself and the 5800X3D.

On the other hand, 5700X / 5800X pricing remains weirdly high.
 
I'm mostly playing at ultrawide 1440p @ 100 Hz. I've noticed dips in certain areas of The Last of Us Part 1 where my 7900 XT's usage can drop to about 80%.

I played the Tekken 8 demo at 4K60 and there were drops to around 45 fps during hectic cutscenes. I didn't catch whether my GPU was fully utilised, as the MSI Afterburner text was set too small for that screen lol.

Just off the top of my head, I recall hearing that UE5 games and even Cyberpunk don't benefit much from 3D V-Cache, so that's making me doubt whether to upgrade from the 5700X too. Still need to do a lot more research. Will keep an eye on the prices of the 5700X3D / 5800X3D to see if either becomes a no-brainer at some point, even if the benefit is uncertain.
 
I actually pulled the trigger on the 5700X3D, but it turned out OCUK's checkout was in chaos earlier this week and the purchase was unsuccessful :cry:

Then I found a new 5800X3D for a price just a bit higher, and thought I should take the gamble that I could undervolt and PBO offset it enough to make sure it runs cool and quiet with my 140 mm AIO. I decided to sleep on it, but the next morning the price had bounced back up.
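(A rough back-of-the-envelope reason why undervolting helps so much: dynamic CPU power scales roughly with V² × f, so even a modest negative offset cuts heat noticeably at the same clocks. The voltages in this Python sketch are hypothetical, not measured 5800X3D values.)

```python
# Rough sanity check: dynamic CPU power scales roughly with V^2 * f, so an
# undervolt at fixed clocks cuts dynamic power by the square of the voltage
# ratio. Voltages below are assumed for illustration, not measured values.

def dynamic_power_ratio(v_stock, v_tuned):
    """Ratio of dynamic power after/before an undervolt at fixed frequency."""
    return (v_tuned / v_stock) ** 2

stock_v = 1.25  # assumed stock core voltage (V)
tuned_v = 1.15  # assumed voltage after a -100 mV offset (V)

print(f"Dynamic power falls to ~{dynamic_power_ratio(stock_v, tuned_v):.0%} of stock")
```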

My trigger was playing The Last of Us. I walked from an area with no NPCs into a high-NPC area, and suddenly my 7900 XT was only running at 85-95% utilisation and dropping into the mid-60 fps range, whereas before it was in the 75-80s+. That was with my 7900 XT undervolted and max power boost at -10%.

Decided at that moment a CPU upgrade was required.
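For anyone wanting to spot this kind of CPU limit in their own logs, here's a minimal Python sketch that flags samples where the GPU is under-utilised. It assumes a plain CSV export with per-sample GPU utilisation and framerate columns; the column names, the file name and the 90% threshold are placeholders rather than Afterburner's actual log format.

```python
import csv

# Flag samples where the GPU sits well below full utilisation: a rough proxy
# for a CPU limit. "gpu_usage" and "fps" are placeholder column names; adjust
# them to whatever your monitoring tool actually exports.

GPU_UTIL_THRESHOLD = 90.0  # % below which the GPU is likely waiting on the CPU

def cpu_bound_samples(log_path):
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            gpu = float(row["gpu_usage"])
            fps = float(row["fps"])
            if gpu < GPU_UTIL_THRESHOLD:
                yield gpu, fps

for gpu, fps in cpu_bound_samples("tlou_log.csv"):  # hypothetical log file
    print(f"Possible CPU limit: GPU at {gpu:.0f}%, {fps:.0f} fps")
```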

Now I'm in no man's land, as both the 5700X3D and 5800X3D remain just a tad too pricey to pull the trigger on. I'm not even 100% sure the X3Ds will sort the drop I saw in TLOU. I looked for info online about the game and mostly see people talking about the issues from its train-wreck launch, when every hardware configuration was a stuttering mess.

Sadly, my 5800X3D decided to up and die on me out of the blue a few days ago, a mere three months after I bought it.

Damn, there's a good advertisement for making sure you only buy an X3D chip new and not second hand / B-grade for a 10% discount.

Hope AMD gets a replacement sorted for you with no fuss.
 
Pulled the trigger (again :cry:) on the 5700X3D. This swung it for me, particularly the temps and power consumption compared to its bigger brother :eek:

 
For anyone interested, I took a snapshot of my 5700X versus my 5700X3D in the games I had installed. This is extremely unscientific, at 3440x1440 @ around ultra settings (so generally expected to be GPU-limited). I don't even have MSI Afterburner set to show 1% / 0.1% lows, just averages.
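As a quick aside on what those lows are: 1% / 0.1% lows are usually derived by sorting the frame times and averaging the slowest 1% (or 0.1%) of frames, reported as an FPS figure. A minimal Python sketch, with made-up sample data:

```python
# Minimal sketch of how 1% / 0.1% lows are usually derived from frame times:
# sort the frame times, take the slowest fraction, and report the average of
# that tail as FPS. The frame-time sample below is made up for illustration.

def percentile_low(frame_times_ms, fraction):
    """Average FPS of the slowest `fraction` of frames (0.01 for the 1% low)."""
    worst = sorted(frame_times_ms, reverse=True)
    tail = worst[: max(1, int(len(worst) * fraction))]
    return 1000.0 / (sum(tail) / len(tail))

frame_times = [10.2, 9.8, 11.0, 10.5, 35.0, 10.1, 9.9, 10.4, 10.0, 10.3]
avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"Average: {avg_fps:.1f} fps, 1% low: {percentile_low(frame_times, 0.01):.1f} fps")
```

Anyway, here are the findings: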

Cyberpunk in a high population area:

[screenshot]


Cyberpunk in built benchmark:

[screenshot]


Rise of the Tomb Raider in built benchmark:

[screenshot]


Shadow of the Tomb Raider in built benchmark:

[screenshot]


The Last Of Us Part 1 - latest save where I was noticing an annoying FPS drop entering an NPC filled area:

[screenshot]


Cinebench after 5 mins:

[screenshot]


These are some extremely weird results. Largely they point to the synthetic benchmarks not doing the 5700X3D justice in this instance.

ROTTR was notable in that at the beginning of Geothermal Valley there was no judder / single-frame stutter, which has always happened on every processor I've previously tested it with (FX-8350, Ryzen 2700 and Ryzen 5700X).

Temperature results are also very strange. I set my fan curves the same for the 5700X and 5700X3D. I must admit I suspect I used different thermal paste, though (I think I used Arctic Silver for the 5700X, but a Thermalright paste for the 5700X3D).

The 5700X3D goes thermonuclear in Cinebench (hits 70°C, where all my fan curves are set to really ramp up) compared with the 5700X, yet it's the opposite in games, with the 5700X3D running cooler and drawing less wattage than the 5700X.

Weird findings. I'm sure I'll get a better feel for the difference as I go.
 
That looks incredibly disappointing. I don't know if I'll even bother eventually chucking a 5800X3D in next year over my 5700X, as it's not going to be 'that much' better than this 5700X3D :(
I only planned on doing it so I could chuck the 5700X into my 2nd machine, a mini-ITX build, as it'd run nice and cool/quiet versus a 5800X/X3D... But I don't 'need' to upgrade either tbf; I just like giving the 2nd rig free upgrades, and it makes wasting my spending money easier :cry:

Hang on, the 5700X3D is 3.0 GHz base / 4.1 GHz boost vs the 5700X's 3.4 GHz / 4.6 GHz :confused: Why do they do this? Bit scummy :(

Wonder if I'll bother with a 5800X3D now; the 4070 is always at 96-100%, so there doesn't seem to be a bottleneck, and everything I use it for is single-player via TV/controller, which easily plays anything at 4K with DLSS capped to 60 for the 4K60 TV.

Some takeaways that are pro X3D from my above benchmarks (with a rough consistency check after the list):
- Cyberpunk GPU utilisation went from 96% to 99% with a 5 fps increase
- TLOU GPU utilisation went from 95% to 100% with an 8 fps increase
- The Shadow of the Tomb Raider benchmark showed CPU frames rendered etc. went up massively, even though the final result was -1 fps (margin of error and GPU limitation)
- Rise of the Tomb Raider had a weird minimum in Syria of 13 fps on the 5700X, while the minimum was 90 fps on the X3D. I wonder if the benchmark is wrong, as the only noticeable drop is at the start of Geothermal Valley; possibly the drop is incorrectly attributed to the Syria part of the benchmark
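On the first two bullets: if a game were purely GPU-bound, fps should scale roughly with GPU utilisation, so you can sanity-check whether a gain is explained by utilisation alone. My screenshots only captured the deltas, so the baseline fps values in this Python sketch are assumptions:

```python
# If fps scales with GPU utilisation in a GPU-bound game, the expected gain
# from lifting utilisation is baseline_fps * (new/old - 1). Baseline fps
# values are assumed; only the utilisation and fps deltas are from my runs.

def expected_fps_gain(baseline_fps, old_util, new_util):
    return baseline_fps * (new_util / old_util - 1)

# Cyberpunk: 96% -> 99% utilisation, observed +5 fps (assumed ~80 fps baseline)
print(f"Cyberpunk expected: +{expected_fps_gain(80, 96, 99):.1f} fps vs +5 observed")
# TLOU: 95% -> 100% utilisation, observed +8 fps (assumed ~70 fps baseline)
print(f"TLOU expected: +{expected_fps_gain(70, 95, 100):.1f} fps vs +8 observed")
```

If those baselines are in the right ballpark, both observed gains are bigger than utilisation alone would explain, which hints the X3D is also getting more frames out of the GPU per unit of utilisation.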

The temperature and power utilisation results baffle me. They're so much lower in games compared with the 5700X, yet so much higher in Cinebench (you can see power draw go up to 102 W with the 5700X3D, proving it really is a 105 W TDP chip like the 5800X3D).

I wouldn't be surprised if the biggest benefit of this upgrade is psychological. I was always wondering with my 5700X whether any drops I saw could have been avoided with an X3D chip. Plus I was also conscious of my sub-optimal RAM (3000 MHz CL15), which an X3D chip mitigates better than the non-X3D chips do.
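To put a number on 'sub-optimal': true CAS latency in nanoseconds is the CL count divided by the memory clock (half the data rate). The 3600 CL16 figures below are just a common sweet-spot comparison point, not my hardware:

```python
# True CAS latency in nanoseconds: CL cycles divided by the memory clock
# (data rate / 2). 3600 CL16 is a typical AM4 sweet-spot kit, included
# purely as a comparison point against my 3000 CL15 RAM.

def cas_latency_ns(data_rate_mts, cl):
    return cl / (data_rate_mts / 2) * 1000

print(f"3000 CL15: {cas_latency_ns(3000, 15):.2f} ns")  # 10.00 ns
print(f"3600 CL16: {cas_latency_ns(3600, 16):.2f} ns")  # ~8.89 ns
```

The X3D's extra L3 means fewer trips out to RAM in the first place, which is presumably why it masks slower memory better.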

Time will tell if this upgrade is worth the £110+ cost (depending on the resale value of my 5700X). I'm guessing it wasn't in my case other than the psychological benefit of maxing out AM4 as mentioned above :cry:
 
I've now flogged my 5700X for £130, so 'only' £100 was lost going to the X3D, which helps ease my buyer's remorse.

Looking at my Cyberpunk and TLOU examples, I might have gone from slightly CPU limited to perhaps having way more untapped CPU potential.

TLOU: 5700X was 72% utilisation, dropped to 52% on the X3D (-18 W), +8 fps
Cyberpunk: 5700X was 75% utilisation, dropped to 58% on the X3D (-13 W), +5 fps

Not sure if I should read too much into that. With DX12 Shadow of the Tomb Raider, I saw my FX-8350 hit 100% utilisation across all cores and threads for the first time in a game, and it was still a 30 fps experience in crowded areas, so perhaps I shouldn't expect future games with higher CPU workloads to see fps scale linearly with CPU utilisation.

Perhaps I should also have benchmarked at lower resolutions. I probably should have benchmarked games that are a dumpster fire when it comes to CPU optimisation, like Starfield and Hogwarts Legacy, but I don't own those and fortunately am not interested in them.
 