• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD Ryzen 7 9800X3D

I like the look of these, but I don't do any productivity tasks. Are these boards okay for a gaming build?
I think for a single-GPU setup the ASRock X870E Nova is the most recommended.

I use two GPUs, so I needed a board that supports that, and 10GbE Ethernet is a plus. The downside to the ProArt is that it has no debug digit display.

Then again, if you are on X670 I don't think you need to upgrade, unless you are specifically missing something (e.g. you need a board with extra PCIe slots or more PCIe 5.0 M.2 slots).
 
Disqualify?

There are disingenuous ways of presenting gains and defending a testing method.

Using games at deliberately low GPU-load settings as a benchmarking tool is not a full picture of gaming performance. It is a benchmarking competition first and foremost.

Quoting results from a low GPU-load setting outside that scenario is a lie, whether out of ignorance or deception. To paint with a broad brush, a lot of people do just that.

Thankfully there are reviewers who are happy to give the broader picture rather than just the best case scenario.
How is it a lie?

When I was weighing up a CPU in 2020 my choices were the 10900K vs the 5800X. Most reviewers had them on par in games; TPU actually had the 10900K marginally ahead, topping the charts, while AnandTech used low-resolution testing, which put the 5800X consistently ahead, sometimes by more than 20%. The fastest GPU at the time was the 2080 Ti. I'm still using that CPU to this day, now with a GPU that's equivalent to a 3090, or even a 3090 Ti the way I'm running it.
If you look at benchmarks since then, testing CPUs with a 3090 Ti or better, you will see that even at 1440p the 5800X is consistently faster than the 10900K (no surprise to me). Had I trusted the TPU results, with the 10900K topping the charts simply by virtue of 2080 Ti run-to-run margins of error, I could very easily have paid £50 more for a slower CPU.
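To put rough numbers on that trade-off, here is a minimal sketch; only the £50 price gap comes from the post, the prices and frame rates are made up purely for illustration:

```python
# Hypothetical prices and frame rates to illustrate the trade-off described
# above; only the £50 price gap is taken from the post.
price_10900k, price_5800x = 500.0, 450.0   # assumed prices, £50 apart
fps_10900k, fps_5800x = 135.0, 142.0       # assumed avg fps with a fast GPU

extra_cost = price_10900k - price_5800x
perf_gap_pct = (fps_5800x - fps_10900k) / fps_10900k * 100

print(f"£{extra_cost:.0f} extra for a CPU that is {perf_gap_pct:.1f}% slower")
```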

A CPU winning at low resolution isn't going to flip and lose when you shift the load from the CPU to the GPU; the CPUs are all just going to look the same. That's a review of the GPU, not the CPU.
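The argument above can be sketched with a toy model: the delivered frame rate is capped by whichever of the CPU or GPU is slower, so a CPU gap that is visible when the GPU cap is high disappears once the GPU becomes the limit. All numbers below are made up for illustration:

```python
# Toy bottleneck model: the delivered frame rate is the minimum of what the
# CPU can prepare and what the GPU can render. All numbers are illustrative.
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

cpu_caps = {"faster CPU": 200.0, "slower CPU": 160.0}  # frames prepared per second

# At 720p low the GPU is barely loaded, so the CPU gap is fully visible.
# At 1440p ultra the GPU is the limit, so both CPUs deliver the same fps.
for setting, gpu_cap in [("720p low", 400.0), ("1440p ultra", 140.0)]:
    for cpu, cpu_cap in cpu_caps.items():
        print(f"{setting}: {cpu} -> {delivered_fps(cpu_cap, gpu_cap):.0f} fps")
```

At "720p low" the two CPUs print 200 and 160 fps (a 25% gap); at "1440p ultra" both print 140 fps, which is the GPU being reviewed, not the CPU.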

This is exactly what I mean: waving massive gains at 720p low ("sometimes by more than 20%") around as supposed evidence of more than what they are.

How much value did you get out of it in reality over four years with the graphics card you have? Let me illustrate what your performance is.

Here's TechPowerUp and their big chart of games, from when they tested the 7700 and a ton of other CPUs from 720p to 4K at ultra settings with a 3080, which is similar to your 7800 XT: https://www.techpowerup.com/review/amd-ryzen-7-7700-non-x/

I suppose for reference we should say such cards are about 1440p grade.

gainz.png


So you saved £50 by buying the cheaper of two CPUs that were similar in performance back then and are still similar today.

Your 720p low performance stayed at 720p low even years later. It is a lie to say it's representative of anything else.

A person looking to buy for games today and the near future at respectable settings can save hundreds by not spending on performance that will never be realised.

People should copy the reality of what you did by buying the cheaper of similar-performing parts at realistic settings, and they can get that information from gaming benchmarks run at realistic settings.
 
I would be sceptical of any site that only lists averages and not individual games. I'd also say any site testing with several-year-old games is in no way useful information about performance in any upcoming game. If you want some reliable data on how the CPU will perform in the future, look for sites testing Black Myth: Wukong, since it's an Unreal Engine 5 game and that is probably going to be the most used game engine going forward (for example, The Witcher 4 will use it). If you are excited for Monster Hunter Wilds, look for Dragon's Dogma 2 benchmarks, since it uses the same game engine.

If you want to see 1440p/4K results in recent games, Digital Foundry has done some pretty good testing.

The link is here: https://www.eurogamer.net/digitalfoundry-2024-amd-ryzen-7-9800x3d-review?page=2
Look for the resolution tab on each chart; below each chart you can choose frame rate or percentage difference.
To me it looks like even at 1440p/4K the 7800X3D/9800X3D are substantial improvements over any non-X3D CPU, depending on the game.
So there is a difference at 4K. Nice.
 
This 1 min video has some comparisons at 4K:

Not much difference in some games, but some improvement in others. There are no 1% lows shown, though obviously those should be better on the 9800X3D too.
Looking at those stats, it's almost not worth upgrading from a 5800X3D to a 9800X3D for 1440p. Sure, if you run 1080p, but there are only small gains at 1440p, and 4K is even worse.
 
This is exactly what I mean. Waving massive gains at 720p low "sometimes more than 20%" as supposed evidence of more than what it is.

With the RTX 4090.

7m7wlxQ.jpeg

CMsQtoS.jpeg


4ZmL2uL.png

I got ridiculed for posting these results as my justification for getting the 5800X.


The counterargument being this.


Again, what the hell is with the 8700K through 10900K all landing within a 5% margin?

Now you can't hide this performance difference AnandTech found; that's an 8% difference even at 1440p. And somehow the 14900K now tops the chart above the 7800X3D. What? TPU CPU reviews...

 
Is there no 16-core 9950X3D CPU to replace the 7950X3D?

Edit: Googled it, and people think it's due to be released sometime in 2025.
 
With the RTX 4090.


TechPowerUp always produces ridiculous numbers, especially in CPU and GPU reviews, and they also have some wild reviews of other products. They have a real hard-on for CPU coolers, where they'll claim something is a new holy grail when it can cool 1 watt more.

Anyway, I typically only check TPU for news these days and have stopped looking at their reviews for the most part. I used to think their game benchmark articles were OK, but in recent times I'm unable to replicate their results and often get better performance on my system than they claim. They also do a poor job with their image quality comparison articles.
 
TechPowerUp always produces ridiculous numbers, especially in CPU and GPU reviews, and they also have some wild reviews of other products. They have a real hard-on for CPU coolers, where they'll claim something is a new holy grail when it beats another cooler by 0.1°C.

Anyway, I typically only check TPU for news these days and have stopped looking at their reviews for the most part.

Yeah... I don't use TPU for CPU reviews at all. GPU reviews are fine, but when it comes to CPUs they are complete garbage.
 
With the RTX 4090.

After four years of saying 720p low performance mattered for the long run, you quote an £1800 graphics card as the requirement for a 5800X to be +8% over a 10900K at 1440p.

It's been +/- a few % all that time with affordable graphics cards at realistic settings.

You don't play at 720p low and you don't have a 4090, so that's four years of scamming yourself about the performance. But you did buy the cheaper CPU, so that was £50 saved.
 

I'll make the point again: had I trusted TPU, I would have spent £50 more on a slower CPU. It's only due to proper testing, not done by TPU, that I became informed enough not to make that bad choice.
 
I assume there's been no word yet if the CPUs have cleared customs etc?
My mobo is just waiting to have something brainy inserted in her. Last piece of my build.
(And a fairly important piece at that. Lol)
 
I'll make the point again: had I trusted TPU, I would have spent £50 more on a slower CPU. It's only due to proper testing, not done by TPU, that I became informed enough not to make that bad choice.

An easy decision when two CPUs are neck and neck at realistic settings and one costs £50 less.

It's one I would recommend people make, while pouring a huge amount of salt on performance gains that only appear at substandard graphics settings.

I take your example as supporting my point: testing at low graphical settings cannot be trusted as a measure of gaming performance outside that scenario.
 
Going from a 2600X, I imagine I am going to see more of a performance difference than most on here.

Also bought a GPU on Overclockers from the Black Friday section... hopefully the deals later in the month don't end up being too much better than the early ones.
 
Looking at other benchmarks, including some using a 4080 Super, it seems that unless you are using a 4090 or a 50-series equivalent there is actually no point getting this CPU over, say, a 9700X or 7800X3D, due to GPU bottlenecks. You could argue for future-proofing, but then you could just wait and get the next generation of CPUs.
With how things looking for the 50 series and AMD's next generation, anything below the 5090/5080 will probably bottleneck a 9800X3D, so if you plan to upgrade to a new mainstream/lower-end GPU it may be best to wait in case there is more aggressive discounting of the latest CPUs early next year.
 
I've had my 9800X3D for a few days. It crashed on Cinebench 2024 very quickly with a -40 curve optimizer offset and a +200 MHz boost. Then I tried -35: it completed a test successfully, but after some further tests it was also unstable. Then I tried -33; after several tests that was unstable too.

Now I'm at -30 and seemingly haven't had a single crash yet. I've been gaming full-on for the last two days.
Seems pretty sweet.
I am not sure if I have clock stretching though; it's kind of annoying to figure out.
Also, I set scalar to Auto and not to 10x, as some reviewers would have you believe you should.

TUF Gaming X670E-Plus + Kingston FURY 32 GB DDR5-6400 (KF564C32RSAK2-32) with DOCP 1 at 6000 MHz.
I upgraded from a 7700X.
I have a 4090 with a 10% overclock.
Use case is mainly a 7680x2160 monitor, sometimes a 4K TV, and my Apple Vision Pro for PCVR, which I believe (if I've got it correct) uses a render resolution of 4468x3916.

Will upgrade to a 5090 ASAP and hopefully overclock it for at least 10% too.
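The trial-and-error above (-40, then -35, -33, -30) is essentially walking the offset toward zero until a stress run passes. A minimal sketch of that loop, with a stand-in `is_stable` check in place of real Cinebench runs (a fixed step is assumed, unlike the uneven steps in the post):

```python
# Sketch of the curve-optimizer search described above: start aggressive and
# back the offset toward 0 until a stress test passes. `is_stable` is a
# stand-in for real stress runs (e.g. repeated Cinebench 2024 passes).
def find_stable_offset(is_stable, start=-40, step=5):
    offset = start
    while offset < 0 and not is_stable(offset):
        offset = min(offset + step, 0)  # never go past 0 (stock)
    return offset

# Mock: pretend this chip is only stable at -30 or milder, as in the post.
mock_is_stable = lambda offset: offset >= -30
print(find_stable_offset(mock_is_stable))  # tries -40, -35, then settles at -30
```

In reality each `is_stable` check means a BIOS change and hours of stress testing, which is why people stop at the first offset that survives a couple of days of use.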
 
I got my 9800x3d for a few days.
I upgraded from a 7700x.
Any noticeable difference in games at 4k?
 