
Poll: Ryzen 7950X3D, 7900X3D, 7800X3D

Will you be purchasing the 7800X3D on the 6th?

(Poll closed; 191 total voters)
And remember that Linus showed the results to AMD who said they looked about right.
That's just rhetoric though; according to Linus, they finally admitted the CPU was borked and sent him a new one.

I'm just looking at other reviews and results, which vary, but nowhere near to the same extent as LTT's, and in such a weirdly detrimental way.
 
His 'beef' seemed to be with scaling.
And in the end (while raving about power efficiency) he pretty much explains how it happens: too aggressive downclocking at low loads. The CPU thinks it has nothing to do at 4K and reduces clock speed.
Smells like something that could be addressed in a future chipset driver update.
 
That's just rhetoric though; according to Linus, they finally admitted the CPU was borked and sent him a new one.

I'm just looking at other reviews and results, which vary, but nowhere near to the same extent as LTT's, and in such a weirdly detrimental way.
The new CPU mostly fixed stability rather than performance, though. Those were the performance numbers that AMD thought looked right (or close enough).

I'm not saying Linus' results were right, just that AMD didn't say they were wrong (when asked).

To be fair to Linus, he did say that even they were confused by their results, especially when comparing to other sites. It also shows that even experienced builders can run into issues and miss the expected performance, so it could happen to us too.
 
Just interested whether anyone has something positive to say about the 7950X3D still being the best CPU for gaming, despite the initial simulated results showing the 7800X3D outperforming the 7950X3D, and some sites telling gamers to hold off on buying the 7950X3D and wait for the 7800X3D instead?

I'm just curious whether I'm reading it incorrectly, or do you think it's just media spin, as with most pre-launch coverage?

I know there's no real way to tell except to wait for the 7800X3D release to be sure about the pure FPS data.

Confused I am.
 
Just interested whether anyone has something positive to say about the 7950X3D still being the best CPU for gaming, despite the initial simulated results showing the 7800X3D outperforming the 7950X3D, and some sites telling gamers to hold off on buying the 7950X3D and wait for the 7800X3D instead?

I'm just curious whether I'm reading it incorrectly, or do you think it's just media spin, as with most pre-launch coverage?

I know there's no real way to tell except to wait for the 7800X3D release to be sure about the pure FPS data.

Confused I am.
Extremely positive. You will have to wait for the 7800X3D benchmark but it’s unlikely to be faster than a 7950X3D.
 
I tested F1 2022, Canada wet, 1 lap, and captured the benchmark run for YouTube at the same time. It's completely GPU bound for me with the GPU at stock. I'll rerun it on the 4090 soon, but again I suspect it'll still be completely GPU bound.
X890RFg.png

I'm not sure how he has the 7950X performing better tbh.
SUCnFyu.png
 
I tested F1 2022, Canada wet, 1 lap, and captured the benchmark run for YouTube at the same time. It's completely GPU bound for me with the GPU at stock. I'll rerun it on the 4090 soon, but again I suspect it'll still be completely GPU bound.
X890RFg.png

I'm not sure how he has the 7950X performing better tbh.
SUCnFyu.png
He has something wrong with his setup. You have quite a lot more performance than he got.
 
I tested F1 2022, Canada wet, 1 lap, and captured the benchmark run for YouTube at the same time. It's completely GPU bound for me with the GPU at stock. I'll rerun it on the 4090 soon, but again I suspect it'll still be completely GPU bound.
X890RFg.png

I'm not sure how he has the 7950X performing better tbh.
SUCnFyu.png

He has something wrong with his setup. You have quite a lot more performance than he got.
Video
 
That's really interesting to watch. Thanks for sharing bud. I'm impressed that the CPU power draw stays around 85w-87w.

Looking at all the YouTube gaming benchmark content you have posted, I haven't seen any game that has pushed the clock frequency higher than 5250MHz or the load above 49%. I think BF2042 has been the most demanding on the CPU so far.
 
That's really interesting to watch. Thanks for sharing bud. I'm impressed that the CPU power draw stays around 85w-87w.

Looking at all the YouTube gaming benchmark content you have posted, I haven't seen any game that has pushed the clock frequency higher than 5250MHz or the load above 49%. I think BF2042 has been the most demanding on the CPU so far.
5.25GHz is the max boost frequency. However, MSI AB, which is what I use for my overlay due to its low CPU footprint, shows the requested clock frequency, not the actual clock frequency. Currently only two apps, HWINFO64 and Ryzen Master Tool, show the actual (effective) clock speeds, which give a more accurate picture of the true clock speed. Effective clock speed in games is lower than the requested clock speed. See a video I recorded earlier on BF2042 showing the effective clock speed from Ryzen Master Tool.


MSI AB also cannot accurately poll a sleeping Ryzen core, so all it does is show the last requested clock state for a core, which is almost always inaccurate. This often makes it look like a game is using all the CPU cores due to what is shown on my MSI AB overlay. However, the video above shows that is not always the case, as you can see 6 of the 8 cores are actually asleep on the second CCD. When I ran BF2042 in that earlier video, you'd think all cores were awake and in use. So in short, don't get hung up on the reported clock speed too much based on the MSI AB overlay.
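To illustrate the requested-versus-effective distinction with a minimal sketch (the function name and the sampling model here are illustrative assumptions, not how MSI AB or Ryzen Master actually poll the hardware): an effective clock weights the requested clock by the fraction of the sampling window the core was actually awake, so a core that sleeps most of the time reads far below its boost clock.

```python
# Toy model of "effective" vs "requested" clock speed.
# Assumption: over a sampling window, a sleeping core contributes 0 MHz
# for the idle portion, so the effective clock is busy-time weighted.
def effective_clock_mhz(requested_mhz: float, busy_fraction: float) -> float:
    """Busy-time-weighted average clock over a sampling window."""
    return requested_mhz * busy_fraction

# A core requesting the 5250 MHz boost clock but awake only 49% of the
# window reads roughly half that as its effective clock (~2572 MHz).
print(effective_clock_mhz(5250, 0.49))
# A fully busy core reads its requested clock.
print(effective_clock_mhz(5250, 1.0))
```

This is why an overlay showing the last requested state can report 5.25GHz on a core that is mostly asleep, while the effective reading sits far lower.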
 
5.25GHz is the max boost frequency. However, MSI AB, which is what I use for my overlay due to its low CPU footprint, shows the requested clock frequency, not the actual clock frequency. Currently only two apps, HWINFO64 and Ryzen Master Tool, show the actual (effective) clock speeds, which give a more accurate picture of the true clock speed. Effective clock speed in games is lower than the requested clock speed. See a video I recorded earlier on BF2042 showing the effective clock speed from Ryzen Master Tool.


MSI AB also cannot accurately poll a sleeping Ryzen core, so all it does is show the last requested clock state for a core, which is almost always inaccurate. This often makes it look like a game is using all the CPU cores due to what is shown on my MSI AB overlay. However, the video above shows that is not always the case, as you can see 6 of the 8 cores are actually asleep on the second CCD. When I ran BF2042 in that earlier video, you'd think all cores were awake and in use. So in short, don't get hung up on the reported clock speed too much based on the MSI AB overlay.
Good to know. Thanks!
 
5.25GHz is the max boost frequency. However, MSI AB, which is what I use for my overlay due to its low CPU footprint, shows the requested clock frequency, not the actual clock frequency. Currently only two apps, HWINFO64 and Ryzen Master Tool, show the actual (effective) clock speeds, which give a more accurate picture of the true clock speed. Effective clock speed in games is lower than the requested clock speed. See a video I recorded earlier on BF2042 showing the effective clock speed from Ryzen Master Tool.


MSI AB also cannot accurately poll a sleeping Ryzen core, so all it does is show the last requested clock state for a core, which is almost always inaccurate. This often makes it look like a game is using all the CPU cores due to what is shown on my MSI AB overlay. However, the video above shows that is not always the case, as you can see 6 of the 8 cores are actually asleep on the second CCD. When I ran BF2042 in that earlier video, you'd think all cores were awake and in use. So in short, don't get hung up on the reported clock speed too much based on the MSI AB overlay.

What the hell happened to MSI and AMD CPUs? For the last couple of months it hasn't really been registering my CPU threads at all.


This is what it looks like now.


Compare that with how it should look.

 
Interesting. Were you able to reproduce this result multiple times?

I forgot what min fps you got with my timings, but I'd rather take a slightly worse min fps than lose so much fps.
Maybe you got this min FPS after the first benchmark, because that's what happens: the first run's min fps is lower, while on the second run the average is lower and the min is much higher. Not sure why.

I just saw your other result, and yes, the min fps is actually quite low for that CPU. Not sure why; maybe you need to make sure it's 100% stable.
I get higher min fps on a 7700X, so if I were you I'd make sure the RAM is 100% stable. That's why I said you might need around 1.47 VDD.

Yeah, my min FPS was very bad earlier; now it's very good.

These settings are good for me for now, because my screen is an LG 42" C2 120Hz OLED. So anything above 120 FPS is wasted, am I right?
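The arithmetic behind that question, as a toy sketch (assuming a fixed 120Hz refresh with no variable-refresh behaviour above the cap; the function name is illustrative):

```python
# Frame-time arithmetic for a 120 Hz display: frames rendered faster than
# the refresh interval cannot all be shown, so FPS above the refresh rate
# is largely wasted on a capped panel.
def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds at a given frame rate."""
    return 1000.0 / fps

REFRESH_HZ = 120
display_interval = frame_time_ms(REFRESH_HZ)  # one refresh every ~8.33 ms

for fps in (120, 160, 240):
    rendered = frame_time_ms(fps)
    # The panel cannot show frames more often than it refreshes.
    shown = max(rendered, display_interval)
    print(f"{fps} fps: rendered every {rendered:.2f} ms, displayed every {shown:.2f} ms")
```

Rendering above the cap can still reduce input latency slightly, but the extra frames themselves never reach a 120Hz screen.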
 
I tested F1 2022, Canada wet, 1 lap, and captured the benchmark run for YouTube at the same time. It's completely GPU bound for me with the GPU at stock. I'll rerun it on the 4090 soon, but again I suspect it'll still be completely GPU bound.
X890RFg.png

I'm not sure how he has the 7950X performing better tbh.
SUCnFyu.png

Brother, your results are always much higher than anyone else's. Are you also overclocking the GPU? I can't reach your FPS, even though I have an RTX 4090 and a 7950X3D...

Isn't the 4090 beating the 7900 XTX? Or was I wrong? :D Because your benchmark looks way faster than a 4090, I guess.
 
Brother, your results are always much higher than anyone else's. Are you also overclocking the GPU? I can't reach your FPS, even though I have an RTX 4090 and a 7950X3D...

Isn't the 4090 beating the 7900 XTX? Or was I wrong? :D Because your benchmark looks way faster than a 4090, I guess.
The 7900 XTX is running stock there. I need to test it on my 4090; I'm not sure if it will be faster. Can you share your score with me so I can compare? I'll then run it on the 4090/X3D.
 
Brother, your results are always much higher than anyone else's. Are you also overclocking the GPU? I can't reach your FPS, even though I have an RTX 4090 and a 7950X3D...

Isn't the 4090 beating the 7900 XTX? Or was I wrong? :D Because your benchmark looks way faster than a 4090, I guess.

Everything on his system is overclocked and tweaked (and those tweaks vary, because he likes to tinker all the time), which is why it is not a good one to benchmark your own against. Not a criticism, by the way, ltmatt.
 
shows the requested clock frequency not the actual clock frequency. Currently only two apps, HWINFO64 and Ryzen Master Tool show the actual (effective) clock speeds which show a more accurate true clock speed.
Just for discussion's sake, I don't know how much faith I would put in effective clocks either. HWiNFO shows impossible effective clocks on my stock 3800XT: with a 5.5-47.5 multiplier range times 100MHz FSB, it still reports 1.3MHz minimum clocks. Same on the upper side: I have a 34 Duo, not a great cooler but perfectly acceptable, and the maximum effective frequency is just under 4.3GHz, while the core clocks section reports up to 4.725GHz.
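As a sanity check on those numbers (plain arithmetic, nothing tool-specific): the multiplier range times the 100MHz reference clock bounds the plausible core clocks, which is why a 1.3MHz reading has to be a sampling artefact rather than a real frequency.

```python
# Core clock = multiplier x reference clock (FSB/BCLK), here 100 MHz.
# Any reading far outside the multiplier range is a sampling artefact.
FSB_MHZ = 100.0
MULTI_MIN, MULTI_MAX = 5.5, 47.5  # multiplier range quoted above

low = MULTI_MIN * FSB_MHZ    # lowest legitimate clock: 550 MHz
high = MULTI_MAX * FSB_MHZ   # highest legitimate clock: 4750 MHz

print(f"valid clock range: {low:.0f}-{high:.0f} MHz")
# A reported 1.3 MHz "effective" minimum sits hundreds of times below
# the real floor, so it reflects the measurement, not the silicon.
print(f"550 MHz / 1.3 MHz = {low / 1.3:.0f}x below the floor")
```

The same bound explains the upper-side oddity: 4.725GHz is within the 47.5x range, so the disagreement there is between two measurement methods, not an impossible value.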
 