
OcUK Ryzen 7 5800X3D review thread

How did Steve Walton get to +1% at 1080P for the 5800X3D? I was looking at it and thinking that can't be right; the 5800X3D seemed to be getting consistently bigger wins vs the 12900K, i.e. more than 1%, consistently.

So I did some maths.

The average frame rates at 1080P:

5800X3D:
668 + 322 + 172 + 255 + 55 + 204 + 266 + 161 + 577 + 314 + 411 + 185 + 145 + 265 + 240 = 4240

12900K:
582 + 291 + 201 + 206 + 58 + 214 + 255 + 150 + 558 + 304 + 398 + 189 + 144 + 232 + 212 = 3994

4240 / 3994 = 1.061. That's 6%, not 1%.

https://www.techspot.com/review/2450-ryzen-5800x3D-vs-core-i9-12900k/
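For anyone who wants to run it themselves, here's a rough sketch in Python using just the 15 results listed above (the review's +1% headline was taken across all 40 games on its chart, not these 15, so treat it as illustrative only):

Code:
# The 15 average frame rates listed above, in the same order for each CPU.
x3d    = [668, 322, 172, 255, 55, 204, 266, 161, 577, 314, 411, 185, 145, 265, 240]
i12900 = [582, 291, 201, 206, 58, 214, 255, 150, 558, 304, 398, 189, 144, 232, 212]

# Method 1: ratio of the summed frame rates (the calculation in this post).
total_ratio = sum(x3d) / sum(i12900)            # 4240 / 3994 ~= 1.062, i.e. about 6%

# Method 2: average of the per-game ratios, so every game carries equal weight.
per_game   = [a / b for a, b in zip(x3d, i12900)]
mean_ratio = sum(per_game) / len(per_game)      # ~= 1.048 for these 15 games

print(f"ratio of totals:     {total_ratio:.3f}")
print(f"mean per-game ratio: {mean_ratio:.3f}")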
 
If I'm running 3200MHz RAM, I assume it's worth overclocking to 3600MHz to maximise the performance gains on the 5800X3D?

Depends on your timings. If you have to loosen the timings a lot to get to 3600MT/s then no, it's better to keep the lower speed with tighter timings.

For example, if you're running CL14-14-14 @ 3200MT/s and need to loosen to CL18-18-18 then it's not worth it. If you can run CL16-16-16 that's OK, go for it; anything tighter than that would be better.
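A rough way to sanity-check the trade-off: first-word latency in nanoseconds is roughly CL x 2000 / transfer rate in MT/s. A quick sketch, purely illustrative:

Code:
# Rough first-word latency: latency_ns = CL * 2000 / (transfer rate in MT/s)
for cl, mts in [(14, 3200), (16, 3600), (18, 3600)]:
    print(f"CL{cl} @ {mts}MT/s -> {cl * 2000 / mts:.2f} ns")

# CL14 @ 3200 ~ 8.75 ns, CL16 @ 3600 ~ 8.89 ns, CL18 @ 3600 = 10.00 ns,
# so loosening to CL18 trades the extra bandwidth for noticeably worse latency.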
 
Depends on your timings. If you have to loosen the timings a lot to get to 3600MT/s then no, it's better to keep the lower speed with tighter timings.

For example, if you're running CL14-14-14 @ 3200MT/s and need to loosen to CL18-18-18 then it's not worth it. If you can run CL16-16-16 that's OK, go for it; anything tighter than that would be better.
Thank you! I'll give it a look later
 
Depends on your timings. If you have to loosen the timings a lot to get to 3600MT/s then no, it's better to keep the lower speed with tighter timings.

For example, if you're running CL14-14-14 @ 3200MT/s and need to loosen to CL18-18-18 then it's not worth it. If you can run CL16-16-16 that's OK, go for it; anything tighter than that would be better.

I'm running 4x8GB of these:

https://www.overclockers.co.uk/team...3600mhz-dual-channel-kit-black-my-001-8p.html

Interesting to see how they run with the 5800X3D.
 
That's some very nice RAM, but 2x 16GB would have been better; 4 RAM sticks put a higher load on the IMC than just 2, which will have an effect on overclocking.

Always just 2 sticks if you can :)

Yes, I would normally, but I remember watching a review where they actually got higher performance with 4 sticks. Here it is:
 
How did Steve Walton get to +1% at 1080P for the 5800X3D? I was looking at it and thinking that can't be right; the 5800X3D seemed to be getting consistently bigger wins vs the 12900K, i.e. more than 1%, consistently.

So I did some maths.

The average frame rates at 1080P:

5800X3D:
668 + 322 + 172 + 255 + 55 + 204 + 266 + 161 + 577 + 314 + 411 + 185 + 145 + 265 + 240 = 4240

12900K:
582 + 291 + 201 + 206 + 58 + 214 + 255 + 150 + 558 + 304 + 398 + 189 + 144 + 232 + 212 = 3994

4240 / 3994 = 1.061. That's 6%, not 1%.

https://www.techspot.com/review/2450-ryzen-5800x3D-vs-core-i9-12900k/

If he averaged up the percentages for the games it was ahead in, then took away the average of the games it lost in, that gives a little over 1% better overall.

You can't directly compare the FPS as they are different games, so you look at percentage change. TBF, whatever way you look at it, it is a flawed method of comparing the CPUs.
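Roughly, the method being described is this (a sketch with made-up percentages, not the chart's actual values):

Code:
# Hypothetical per-game percentage differences, NOT the real chart data.
gains  = [24, 15, 3]   # games where the 5800X3D was ahead, in %
losses = [17, 4]       # games where it was behind, in %

net = sum(gains) / len(gains) - sum(losses) / len(losses)
print(f"average gain - average loss = {net:+.1f}%")   # +3.5% with these made-up numbers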
 
If he averaged up the percentages for the games it was ahead in, then took away the average of the games it lost in, that gives a little over 1% better overall.

You can't directly compare the FPS as they are different games, so you look at percentage change. TBF, whatever way you look at it, it is a flawed method of comparing the CPUs.

If he averaged up the percentages for the games it was ahead in, then took away the average of the games it lost in, that gives a little over 1% better overall.

That's what I did; the result is 6%, not 1%. I even laid it out so you can run the maths for yourselves.

Each number I listed is the frame rate of each game, in corresponding order. You add them all up and then divide the totals; it's the same games from each chart.

For example:

Valorant.
5800X3D: 668 FPS
12900K: 582 FPS

Fortnite.
5800X3D: 322 FPS
12900K: 291 FPS

CoD Warzone.
5800X3D: 172 FPS
12900K: 201 FPS

And so on....

It's really quite basic: to get averages between A and B you add up all results from A and B and then divide A by B.
 
That's what I did; the result is 6%, not 1%. I even laid it out so you can run the maths for yourselves.

Each number I listed is the frame rate of each game, in corresponding order. You add them all up and then divide the totals; it's the same games from each chart.

For example:

Valorant.
5800X3D: 668 FPS
12900K: 582 FPS

Fortnite.
5800X3D: 322 FPS
12900K: 291 FPS

CoD Warzone.
5800X3D: 172 FPS
12900K: 201 FPS

And so on....

It's really quite basic: to get averages between A and B you add up all results from A and B and then divide A by B.

I see what Ross is saying. He's taking the average of the percentage difference for each game, rather than totalling up the FPS for each and then calculating. The logic being that 1 FPS in Valorant is not equal to 1 FPS in Fortnite, so they can't be compared directly.
 
That's what I did; the result is 6%, not 1%. I even laid it out so you can run the maths for yourselves.

Each number I listed is the frame rate of each game, in corresponding order. You add them all up and then divide the totals; it's the same games from each chart.

For example:

Valorant.
5800X3D: 668 FPS
12900K: 582 FPS

Fortnite.
5800X3D: 322 FPS
12900K: 291 FPS

CoD Warzone.
5800X3D: 172 FPS
12900K: 201 FPS

And so on....

It's really quite basic: to get averages between A and B you add up all results from A and B and then divide A by B.

That isn't what I did. Like I said, you can't compare the FPS straight. I averaged the percentages for the gains and did the same for the losses; take one away from the other and you get the 1% difference. You cannot compare FPS v FPS for different games.

In bold - It's so basic you got it wrong. You are trying to calculate percentage change, not averages. Percentage change is (difference / original x 100), while to work out an average you add up each number and then divide by the number of numbers.
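As a worked example of that formula, using the Valorant numbers quoted earlier in the thread:

Code:
# Percentage change = difference / original * 100
original = 582   # 12900K, Valorant
new      = 668   # 5800X3D, Valorant
print(f"{(new - original) / original * 100:+.1f}%")   # +14.8% in that one game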
 
That isn't what I did. Like I said, you can't compare the FPS straight. I averaged the percentages for the gains and did the same for the losses; take one away from the other and you get the 1% difference. You cannot compare FPS v FPS for different games.

In bold - It's so basic you got it wrong. You are trying to calculate percentage change, not averages. Percentage change is (difference / original x 100), while to work out an average you add up each number and then divide by the number of numbers.

Do you mean these values from the chart below? In which case you get 149 'above' and 92 'below', coming to 57, which over 40 games is 1.425%.

https://static.techspot.com/articles-info/2450/bench/1080p.png
 
Yes, I would normally, but I remember watching a review where they actually got higher performance with 4 sticks. Here it is:
4x8 works better than 2x8 because the former is dual rank. 4x8 is similar to 2x16, both being dual ranked. On AMD it shouldn't make a difference because the IF is your frequency cap; on Intel 2x16 is better because there's less load on the IMC and the motherboard, since all of them nowadays are daisy chain. T-Topology stopped being a thing with Z370, sadly.
 
That isn't what I did. Like I said, you can't compare the FPS straight. I averaged the percentages for the gains and did the same for the losses; take one away from the other and you get the 1% difference. You cannot compare FPS v FPS for different games.

In bold - It's so basic you got it wrong. You are trying to calculate percentage change, not averages. Percentage change is (difference / original x 100), while to work out an average you add up each number and then divide by the number of numbers.

What? Why on earth would you do that?

That's 140 (gains) / 94 (losses) = 1.489, i.e. 49%.

There is no way to get to 1%. Or perhaps you would like to write out the numbers as you calculated them? Because what you're explaining makes no sense.

You cannot compare FPS v FPS for different games

This also makes no sense; they are testing the CPUs in the same games. How are you arriving at these conclusions?
 
If this wasn't so late in the lifecycle of not only this gen of CPU but also the end of the socket, I would have considered a swap.

I've upgraded because it was the end. Should keep any GPU I may get supplied for the next 3 years. Obviously if you're a regular upgrader then AM5 will be faster.
 
Do you mean these values from the chart below? In which case you get 149 'above' and 92 'below', coming to 57, which over 40 games is 1.425%.

https://static.techspot.com/articles-info/2450/bench/1080p.png

Does this make any sense to you?

I averaged the percentages for the gains and did the same for the losses

Or is that just complete nonsense? I would love to see that equation written out as a proof, because, as you also pointed out, averaging the result percentages is so far off "1%" it's not even funny.
 
Does this make any sense to you?



Or is that just complete nonsense? I would love to see that equation written out as a proof, because, as you also pointed out, averaging the result percentages is so far off "1%" it's not even funny.
It's easy once you remember that +24 is actually 1.24, and +15 is actually 1.15, etc.
Once you average them all you get 1.01375.
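In other words (a sketch with placeholder values, not the full 40-game chart):

Code:
# Each game's % difference from the chart becomes a ratio: +24 -> 1.24, +15 -> 1.15, -3 -> 0.97, ...
diffs  = [24, 15, -3]                      # placeholders, not the actual chart entries
ratios = [1 + d / 100 for d in diffs]
print(sum(ratios) / len(ratios))           # averaging the 40 real values this way gives ~1.014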
 
Oh... I see what he's doing.

Subtract the negative 92 from the positive 149 = a difference of 57 across 40 games. Right, so divide that 57 difference over those 40 games = 1.425. Now that is NOT 1.425%; the decimal point is PAST a whole 1, so it's 0.425 past the whole, and THAT is 42.5%.

And to prove the equation, you add the 42.5% to the 40, which = 57, the number you started with. Equation correct, so like I said, 42.5% is nowhere near 1% :)
 