Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?


  • Total voters: 191
  • Poll closed.
It would get slated in reviews though, as performance would be identical. They have to make it at least 5% faster, and the only way to do that is a 6.2GHz clock, which I don't think they can get out of 10nm. We'll see, it's a mystery at this point!
Are they definitely releasing a 13900KS, or will they just go with the Q3 refresh, which I believe is a refinement rather than a new architecture?

Unless AMD's performance gains here are more than they're letting on, I'm not sure there's much point in a KS product this gen, unless it's something they make more profit margin on, of course.
 
I know it's ComputerBase, but they have a lot of articles. Ah, they gimped the 7000 series by running the slowest possible memory, which explains the performance differences. It will be interesting to see if the X3D can still perform well running the slowest possible memory.

EDIT - They tested Warzone, not MW2, unfortunately.
Well, since you have a 7950X with good RAM, let's test it. Can you run CP2077 at 1080p with RT on? I have a 12900K and a 13900K with 7600C34 RAM; let's see what's actually going on.
 
Are they definitely releasing a 13900KS, or will they just go with the Q3 refresh, which I believe is a refinement rather than a new architecture?

Unless AMD's performance gains here are more than they're letting on, I'm not sure there's much point in a KS product this gen, unless it's something they make more profit margin on, of course.

The 13900KS has already been released in some parts of the world; it's 100% a thing. Not a good thing, but a thing nonetheless.
 
Are they definitely releasing a 13900KS, or will they just go with the Q3 refresh, which I believe is a refinement rather than a new architecture?

Unless AMD's performance gains here are more than they're letting on, I'm not sure there's much point in a KS product this gen, unless it's something they make more profit margin on, of course.

The Core i9-13900KS is set to launch officially on January 12th.
 
This... it seems a little 'weird' to have two different CCDs, but I suppose it's no worse than Intel's performance/efficiency cores... assuming Windows etc. can detect them and schedule work in the best way (probably meaning Windows 11 only).

I'd have also thought there's potentially missing performance from not having two 3D cache CCDs...

What's gonna be really funny is if, at some point in the future, there are games that like having more than 8 cores, because if Windows is forced to split a game across both CCDs it's gonna introduce so much latency. We're still some way from that though; let's hope games running on both CCDs are "rare", as AMD claims.
 
What's gonna be really funny is if, at some point in the future, there are games that like having more than 8 cores, because if Windows is forced to split a game across both CCDs it's gonna introduce so much latency. We're still some way from that though; let's hope games running on both CCDs are "rare", as AMD claims.
Not that I'd trust more than a handful of developers with this, but there's nothing to say a game engine couldn't be made so that the various threads are mostly independent. Or at least that it has some threads which need frequency and some which need cache, but those two sets of threads rarely have to talk (and when things are buzzing along at 5GHz+, "rarely" could even be once a millisecond or so, which is still around five million cycles between hand-offs).

Inter-CCD latency is only a problem if threads need to transfer lots of data. If processes running on the other CCD only have to do some computationally heavy work and then store the results in memory, then as long as it can all be synchronised (hence my "handful of engine developers" comment), it shouldn't matter. Eventually. There's a rough sketch of what per-thread pinning could look like at the end of this post.

However, Intel's hybrid architecture does have some gotchas in certain workloads. My work laptop is now Alder Lake, and certain SQL queries are over twice as slow as on my old Haswell laptop; it improves if I disable the E-cores, but it's still worse than Haswell, which is crazy.
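To make the "some threads need cache, some need frequency" idea concrete, here's a minimal sketch of per-thread pinning on Windows. It's purely illustrative, not how any shipping engine or AMD's driver actually does it, and the 0xFFFF mask assumes the V-Cache CCD is enumerated as logical processors 0-15 (typical for a 7950X3D, but worth verifying with GetLogicalProcessorInformationEx on your own machine):

```cpp
// Minimal sketch: pin a cache-sensitive thread to the (assumed) V-Cache CCD.
#include <windows.h>
#include <cstdio>
#include <thread>

// Logical CPUs 0-15 = CCD0's 8 cores plus their SMT siblings (an
// assumption; verify the topology before relying on this).
static bool pin_current_thread_to_cache_ccd()
{
    const DWORD_PTR cacheCcdMask = 0xFFFF;
    return SetThreadAffinityMask(GetCurrentThread(), cacheCcdMask) != 0;
}

int main()
{
    // Hypothetical split: the simulation thread gets the big cache, while
    // everything else is left to the OS scheduler and can land on the
    // higher-clocked CCD.
    std::thread simThread([] {
        if (!pin_current_thread_to_cache_ccd())
            std::fprintf(stderr, "failed to set thread affinity\n");
        // ... cache-heavy simulation work would run here ...
    });
    simThread.join();
    return 0;
}
```

The catch, as above, is synchronisation: this only pays off if the pinned threads rarely have to exchange data with threads on the other CCD.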
 
What's gonna be really funny is if, at some point in the future, there are games that like having more than 8 cores, because if Windows is forced to split a game across both CCDs it's gonna introduce so much latency. We're still some way from that though; let's hope games running on both CCDs are "rare", as AMD claims.

All modern games still prefer 8 cores; the 12900K, 13900K etc. don't see any limitation from having only the 8 P-cores. It'll be the same for the Ryzen 7000X3D: the 8 cores without the extra cache will just be used for background applications etc.

Perhaps it's more of an issue for those keeping the CPU until the next console launch. I assume the PS6 etc. will have more than 8 cores, and then we'll see a sudden influx of games needing more than 8 'real' cores for gaming. Though the PS6 is probably coming in 2026 at the earliest.
 
I was originally going to get a 7950X3D, but I'm not sure any more. For gaming at 4K, there isn't going to be a huge uplift over a 7950X, right?
Although if streaming/recording from one machine, would it help?
 
I think you'll be better off waiting for reviews to drop from the decent sources, who'll benchmark the 3D versions against the non-3D models across different resolutions.

Link below to an article comparing the 5800X v 5800X3D with the 4090:


Generally there are big increases across the board at 1080p, but less so at 4K as you become GPU-bound, although there are still some big gains in certain games. I suspect that at 4K, with modern games running at max settings, you're going to be loading the 4090 fully, so there might not be a big difference switching from your current CPU, in my opinion (wait for the reviews).

What that review doesn't report is the minimum FPS each benchmark hit. The biggest difference I noticed with the 5800X3D is how much smoother gameplay is, due to fewer frame drops in busy scenes.
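For anyone wondering how those minimum/"1% low" figures are usually derived, here's a minimal sketch, assuming you've captured per-frame times in milliseconds (e.g. with PresentMon or CapFrameX; the sample numbers below are made up, and exact methodology varies between reviewers):

```cpp
// Minimal sketch: average FPS and "1% low" FPS from frame times (ms).
#include <algorithm>
#include <cstdio>
#include <functional>
#include <numeric>
#include <vector>

int main()
{
    // Hypothetical capture: mostly ~7 ms frames with two stutters.
    std::vector<double> frameTimesMs =
        { 6.9, 7.1, 7.0, 6.8, 14.2, 7.2, 6.9, 13.8, 7.0, 7.1 };

    const double totalMs = std::accumulate(frameTimesMs.begin(),
                                           frameTimesMs.end(), 0.0);
    const double avgFps = 1000.0 * frameTimesMs.size() / totalMs;

    // 1% low = average FPS over the slowest 1% of frames.
    std::sort(frameTimesMs.begin(), frameTimesMs.end(),
              std::greater<double>());
    const size_t worstCount = std::max<size_t>(1, frameTimesMs.size() / 100);
    const double worstMs = std::accumulate(frameTimesMs.begin(),
                                           frameTimesMs.begin() + worstCount,
                                           0.0);
    const double onePercentLowFps = 1000.0 * worstCount / worstMs;

    std::printf("avg: %.1f fps, 1%% low: %.1f fps\n",
                avgFps, onePercentLowFps);
    return 0;
}
```

Two runs can have the same average while one has far worse lows, which is exactly the "smoothness" difference that averages-only charts hide.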
 
I'd have no hesitation jumping on a 7800X3D at launch, but if I were thinking about getting a 7900X3D or a 7950X3D at launch, I'd personally wait before dipping my toe in, just to see how task scheduling works across the two CCDs: whether the workload stays on the correct CCD, whether it works as expected, any reported issues, etc.
 
Perhaps it's more of an issue for those keeping the CPU until the next console launch. I assume the PS6 etc. will have more than 8 cores, and then we'll see a sudden influx of games needing more than 8 'real' cores for gaming. Though the PS6 is probably coming in 2026 at the earliest.
If they maintain the x86/x64 architecture, the odds are they'll end up using AMD again; Intel just isn't that great at custom integrated chips. So in theory it could end up being a 3D cache chip inside the PS6 (and whatever the next Xbox is), which could potentially benefit AMD long term...

Mind you, it all depends on how well the devs utilise multiple cores and threads...
 
I think you'll be better off waiting for reviews to drop from the decent sources, who'll benchmark the 3D versions against the non-3D models across different resolutions.

Link below to an article comparing the 5800X v 5800X3D with the 4090:


Generally there are big increases across the board at 1080p, but less so at 4K as you become GPU-bound, although there are still some big gains in certain games. I suspect that at 4K, with modern games running at max settings, you're going to be loading the 4090 fully, so there might not be a big difference switching from your current CPU, in my opinion (wait for the reviews).

What that review doesn't report is the minimum FPS each benchmark hit. The biggest difference I noticed with the 5800X3D is how much smoother gameplay is, due to fewer frame drops in busy scenes.
And while I am sure TPU and W1zzard do collect min FPS data, they never publish it, and that really limits TPU's reviews IMO.

They do have nice big image charts though. W1zzard also doesn't want to join the other German reviewers in doing dynamic charts, which is disappointing too.
 
Here's another one then (using a 3090 Ti though, as opposed to the 4090, which would be better):


Unfortunately, the min FPS data isn't in the summary charts but you can see it per game benchmark.

Clearly this isn't going to show what kind of uplift you'll get from these new 3D cache CPUs compared against the non-3D cache parts, but it does show the benefit from the previous generation's 5800X3D v 5800X.
 
Any ideas on prices yet? I look forward to the official reviews. If the 7900X3D is a worthy upgrade from the 7900X, I might be tempted. :D

These new 3D cache CPUs will be great for those on AM4 who were in two minds about switching platforms to AM5.

If you're already running a Ryzen above the 7600X, then I personally don't think the outlay will be worth it.
 
Hi there

I shall add my thoughts whilst I can, as AMD have given me zero information yet; as such, I'm not breaking any NDA with my thoughts and educated guesswork. I'd hazard the 7800X3D is 10-15% faster than the 5800X3D, with the 7900/7950 faster again in games/apps which can gain from the core count and high boost clocks.

I'd guess launch pricing as follows:

7950X3D @ $899
7900X3D @ $699
7800X3D @ $499

Best case $100 below the above, worst case $100 above. Expect $100-200 discounts later in the year, potentially around Black Friday.


P.S. The 13900KS we have in stock; you can order it from 12th January!
 
It'll blow the 13900KS away and make it look a total joke. That 120W TDP is amazing too; coupled with my 4090 at 70% PL (350W), it'll mean quite an efficient system, considering the performance!

7950X3D for me too, where's that pre-order button, @Gibbo? :D

The Asus ROG Crosshair X670E Hero is on offer at £629 as well; I think I'll pick up the motherboard now, as they're likely to go out of stock closer to release :p
I'd hardly say "blow it away". Let's face it, at 4K there will be nothing in it.

I'm getting a 7950X3D also, but to say it will make 13th gen "bin tier" is just silly.
 