
Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?

  • Total voters: 191
  • Poll closed.
It'd be nice if we had a CPU gaming bench thread that was gatekept to keep the noise makers out. Not sure if the forum has such a feature.
I've tried creating bench threads here before; it's just a massive ball ache that I don't need in my life again anytime soon.

I'd rather just bench games and upload the results as videos to YouTube these days, away from forum benchmark wars. :cry:
 

Says there's almost no difference in FPS between a 13900K and a 5800X3D in Cyberpunk.
Considering that (AMD claims) the 7000 X3D will be 10-25% faster than the 5800X3D, I think the new X3Ds will outperform the 13900K, even in Cyberpunk.
It's true: unless you use unrealistic settings, for the most part there is barely any difference between all the high-end CPUs when paired with a high-end GPU at 1440P and above using the higher quality settings. Be wary of anyone here who says otherwise. ;)

Those results you linked show a 0.6% performance difference in Cyberpunk with a 4090 between a 5800X3D and a 13900K at 1080P and 4K. As they used a 4090, they would likely have used maximum settings (though maybe not with RT, as it doesn't specify), and according to the bench notes they used 'custom' test scenes rather than built-in benchmarks.

Who knows what area they test; they never reveal which scenes they use. That's one thing I wish there was more transparency on, so the viewer could reproduce the test scene and measure their own hardware's performance.
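
For anyone wondering where that 0.6% comes from, it's just the relative difference between the two averages. A quick sketch (the FPS values below are made up for illustration, not TPU's actual data):

```python
# Relative difference between two benchmark averages.
# These FPS values are placeholders, not TPU's actual numbers.
fps_13900k = 168.3   # hypothetical Cyberpunk average FPS, 13900K + 4090
fps_5800x3d = 167.3  # hypothetical Cyberpunk average FPS, 5800X3D + 4090

diff_pct = (fps_13900k - fps_5800x3d) / fps_5800x3d * 100
print(f"13900K leads by {diff_pct:.1f}%")  # ~0.6%
```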
 
The ideal test scene is something which accurately represents the average FPS of the game. Picking a worst-case or best-case scenario is never ideal, IMO. However, TPU (and all tech sites) should include information about what their custom test scene consists of for each game. I've seen some of the ComputerBase test scenes, and they are 25-second snippets of gameplay, which again is a bit suspect. But I guess it's up for debate how long a test scene should be, etc.
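
To make that concrete, here's roughly how average FPS and 1% lows fall out of a frame-time capture. The one-number-per-line file format is an assumption for the sketch, not any specific capture tool's output:

```python
# Derive average FPS and 1% lows from a frame-time log.
# Assumes a text file with one frame time in milliseconds per line
# (an assumed format, not any particular capture tool's output).

def summarise(frametimes_ms: list[float]) -> tuple[float, float]:
    total_seconds = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_seconds
    # "1% low" here means the average FPS over the slowest 1% of frames.
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_fps

with open("frametimes.txt") as f:  # hypothetical capture file
    times = [float(line) for line in f if line.strip()]

avg, low = summarise(times)
print(f"Average: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```

A 25-second run at ~100 FPS only gives the 1% low a couple of dozen frames to work with, which is partly why such short scenes feel suspect.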
 
It'd be nice if we had a CPU gaming bench thread that was gatekept to keep the noise makers out. Not sure if the forum has such a feature.
Yes, I agree. Keep this childish Intel vs AMD ping-pong match somewhere more relevant. Every thread gets bombarded by the same few. It's pathetic.
 
The ideal test scene is something which accurately represents the average FPS of the game. Picking a worst-case or best-case scenario is never ideal, IMO. However, TPU (and all tech sites) should include information about what their custom test scene consists of for each game. I've seen some of the ComputerBase test scenes, and they are 25-second snippets of gameplay, which again is a bit suspect. But I guess it's up for debate how long a test scene should be, etc.

Agree. It wouldn't be too hard for game developers to create a "test profile" that would walk/run/drive a player around a map/city in the actual game. I'm not talking about an in-game benchmark, as those can easily be optimised for, but an actual test route through the game.

As you say, we could then simply load said profile and test against our own hardware/settings.
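
Conceptually it wouldn't take much. A rough sketch of what such a profile could look like; every engine method here (teleport, move_to, is_moving, render_frame) is a hypothetical placeholder, not any real game's API:

```python
# Sketch of a deterministic "test profile": replay a fixed route through
# the game world while logging frame times. All engine methods below are
# hypothetical placeholders, not a real game's API.
import json
import time

class TestProfile:
    def __init__(self, route_file: str):
        with open(route_file) as f:
            self.waypoints = json.load(f)  # list of [x, y, z] positions

    def run(self, engine) -> list[float]:
        frametimes_ms = []
        engine.teleport(*self.waypoints[0])      # start from a fixed point
        for point in self.waypoints[1:]:
            engine.move_to(*point)               # scripted walk/drive
            while engine.is_moving():
                start = time.perf_counter()
                engine.render_frame()
                frametimes_ms.append((time.perf_counter() - start) * 1000)
        return frametimes_ms
```

Because the route and starting point are fixed, two people on the same settings would be measuring the same workload, which is the whole point.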
 
@LtMatt how do your results compare to Jansn Benchmarks? He is using 6000 CL3 but with a 4090, so probably not an ideal comparison.
Sorry, I know you're probably fed up with testing by now lol
I've always found his results to be pretty poor, regardless of which GPU he is using (4090 or 7900 XTX) with the 7950X.

As an example, from his video below, where we used the same game, settings and hardware:
[screenshot: 9yyGeO4.png]

Compared to the performance I see with the same (albeit tuned) hardware.
[screenshot: 2LUFh3E.png]
 
It's true: unless you use unrealistic settings, for the most part there is barely any difference between all the high-end CPUs when paired with a high-end GPU at 1440P and above using the higher quality settings. Be wary of anyone here who says otherwise. ;)

Those results you linked show a 0.6% performance difference in Cyberpunk with a 4090 between a 5800X3D and a 13900K at 1080P and 4K. As they used a 4090, they would likely have used maximum settings (though maybe not with RT, as it doesn't specify), and according to the bench notes they used 'custom' test scenes rather than built-in benchmarks.

Who knows what area they test; they never reveal which scenes they use. That's one thing I wish there was more transparency on, so the viewer could reproduce the test scene and measure their own hardware's performance.
With a 4090 at 1440p, and at 1080p with DLSS Q + RT, you are surely CPU bound, even with a 13900K, so TPU's numbers don't make much sense. I'll post some videos in a bit at 1440p + DLSS Q maxed out and you'll see.
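
For what it's worth, the usual sanity check for a claim like this is to drop the resolution and see whether FPS actually moves. A tiny sketch (the 5% threshold and FPS numbers are placeholders, not measured data):

```python
# Rough CPU-bound check: if FPS barely improves when you lower the
# resolution (less GPU load), the CPU was the limit all along.
# The 5% threshold and FPS values are placeholders, not measured data.

def looks_cpu_bound(fps_high_res: float, fps_low_res: float,
                    threshold_pct: float = 5.0) -> bool:
    gain_pct = (fps_low_res - fps_high_res) / fps_high_res * 100
    return gain_pct < threshold_pct

# e.g. 1440p -> 1080p only gaining ~2.5% suggests a CPU limit
print(looks_cpu_bound(fps_high_res=118.0, fps_low_res=121.0))  # True
```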
 
With a 4090 at 1440p, and at 1080p with DLSS Q + RT, you are surely CPU bound, even with a 13900K, so TPU's numbers don't make much sense. I'll post some videos in a bit at 1440p + DLSS Q maxed out and you'll see.
That would be pointless though, since you are disputing the TPU results while using completely different settings. TPU uses maximum settings with RT off in Cyberpunk at 1080P/1440P/2160P, so that's what you would need to compare against to prove their results wrong.
[screenshot: azRsvLq.png]
 
That would be pointless though, since you are disputing the TPU results while using completely different settings. TPU uses maximum settings with RT off in Cyberpunk at 1080P/1440P/2160P, so that's what you would need to compare against to prove their results wrong.
[screenshot: azRsvLq.png]
Uh, I didn't see he has RT off. Then it's completely pointless; with RT off the game is incredibly light on the CPU, and even my 3700X hits 100 FPS.
 
That would be pointless though, since you are disputing the TPU results while using completely different settings. TPU uses maximum settings with RT off in Cyberpunk at 1080P/1440P/2160P, so that's what you would need to compare against to prove their results wrong.
[screenshot: azRsvLq.png]
TPU's 4090 numbers come from testing with a 5800X, but this was changed to a 13900K for the AMD GPUs, which is why the 4090 numbers look lower.


 
  • Ryzen 7800X3D $449 - launches April 6th
  • Ryzen 7900X3D $599 - launches Feb 28th
  • Ryzen 7950X3D $699 - launches Feb 28th
Definitely a marketing strategy to force the impulse buy of the 12/16-core products. Either that, or they have stock shortages of the 7800X3D and don't want back orders.

@Gibbo, might as well be the first to ask. Can you speculate on rough UK pricing based on current market conditions, and if/when preorders will go live?
 
Definitely a marketing strategy to force the impulse buy of the 12/16-core products. Either that, or they have stock shortages of the 7800X3D and don't want back orders.

@Gibbo, might as well be the first to ask. Can you speculate on rough UK pricing based on current market conditions, and if/when preorders will go live?
I suspect it's a month later due to them expecting the 7800X3D to be the more popular product and wanting a bigger inventory. But then again, it could be a combination of both?

I would also suspect that, with the exchange rate as it is at the moment, you can just swap the $ for a £.

Just wanna see the reviews now! Hopefully we'll get at least the 7950X3D and 7900X3D reviews the day before.
 