
These guys have no idea what they are talking about.

Sure you can, I'm just saying the results you get are flawed. Higher performance usually means higher power consumption, so the CPU that performs better will look less efficient when that isn't actually the case. That's why you need to normalize for something.
But if you are doing that, it is irrelevant, because it is not general use-case information. Intel and AMD designed their chips at their power ratings to give the performance they do, so what I want to know is: when I put my CPU into my motherboard and play a game, which is the most power-efficient CPU relative to the performance I am getting as a consumer? Not which is the most efficient at a specific setting, with a specific wattage, in a specific game.

It is only flawed if you are looking to set an efficiency figure while ignoring the general performance of the chip. And at no point, if you pulled the 12900K down to the 5800X's power level, would it suggest the 12900K would match that performance either. However, since you have such a chip, why don't you lock it to the 5800X's 65 W, run the games Igor'sLAB ran, and then compare the figures?

That would answer which is the most efficient for gaming whilst normalising the power consumption. I don't expect you will get the same result as the 5800X, tbh, but it would be good to see. I don't have a chip to compare results with, and since my 5950X uses less power than the 12900K in the table, I wouldn't be able to normalise the power consumption. I don't have all the games, but I could compare some of the ones listed, and if I get more FPS whilst using less wattage, then what?
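For what it's worth, the comparison being proposed here reduces to frames per watt at a matched power cap. A minimal sketch of that bookkeeping, with placeholder numbers since nothing has been measured yet:

```python
# Sketch of the proposed comparison: frames per watt at a matched power
# limit. All figures below are placeholders, not measured results.

def fps_per_watt(avg_fps: float, package_watts: float) -> float:
    """The efficiency metric being argued about in this thread."""
    return avg_fps / package_watts

# Hypothetical runs of the same game with both CPUs capped to 65 W:
runs = {
    "5800X @ 65 W": (140, 65),   # (average fps, measured package power)
    "12900K @ 65 W": (150, 65),  # placeholder values
}

for label, (fps, watts) in runs.items():
    print(f"{label}: {fps_per_watt(fps, watts):.2f} fps/W")
```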
 
This might be one of the most boring threads I've seen so far in the CPU section, and that's saying something. Carried by someone numerous people have called out as a troll, yet others continue to reply to his obvious bait.
 
Well, sure, but that doesn't change the fact that the 12900K should be way faster than the 12600K at the same wattage. Right?
Wouldn't it have to use more power to be faster? Either feeding more cores or running them at a higher frequency, unless you're saying the 12900K has more efficient cores than the 12600K, i.e. they can do more work with the same power?
 
I'm saying that in an MT workload, more cores at the same wattage should perform better. For example, the 5950X will vastly outperform the 5800X at the same wattage in Cinebench R23. The same should apply to the 12900K compared to the 12600K, yet TPU doesn't show that, which means their testing is flawed.
 
If I normalize for wattage, the 12900K will poop all over Zen 3 in gaming efficiency. I mean, I can test it if you want me to; I'm just saying it won't even be close.

At 65 W the 12900K will scorch the 5800X3D in any game, even if you set the 3D to 150 watts. You can pick a game and we can test it if you want.
 
OK, so for example 16 cores at 80 W total would be faster than 8 cores at 80 W total? Same core design, but obviously with the 16 cores running at a lower frequency. Are you saying that as the cores are pushed harder they become less efficient, i.e. power rises on a curve and not linearly? Hence the 16 cores could run at more than half the frequency of the 8 cores, thereby doing more work for the same power budget.
 
Yeah, exactly. Lower clock speeds = more efficient. Generally speaking, anything running above 4 GHz is way past the efficient part of the curve.
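To make the "curve, not linear" point concrete, here is a toy model. Dynamic power scales roughly with C·V²·f, and voltage has to rise with frequency, so per-core power grows much faster than linearly with clock speed; the cubic exponent and all numbers below are illustrative assumptions, not measurements of these chips:

```python
# Toy model: assume power per core ~ k * f^3 (voltage roughly
# proportional to frequency), and throughput ~ cores * frequency.
# Exponent and constants are assumptions for illustration only.

def freq_at_budget(budget_w: float, cores: int, k: float = 1.0) -> float:
    # Invert power = cores * k * f^3 to get the per-core frequency
    # sustainable within a total package power budget.
    return (budget_w / (cores * k)) ** (1 / 3)

budget = 80.0  # watts, total package, as in the example above
for cores in (8, 16):
    f = freq_at_budget(budget, cores)
    throughput = cores * f
    print(f"{cores} cores: {f:.2f} GHz each, relative throughput {throughput:.1f}")
```

Under this assumed cubic law, the 16 cores run at about 80% of the 8-core clock rather than 50%, so they deliver roughly 60% more throughput from the same 80 W, which is exactly the effect described above.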
 
Did you even bother to read the original link I used? It didn't show that at all. Not even close, with the 12900K being about 40% less efficient than the 5800X3D and still lower in average performance. The games are in Igor's data.

The point is that the 12900K, when normalised to the wattage Ryzen already runs at in gaming (65 W), isn't providing the same performance. That is literally shown in the link I provided, across multiple games.
 
Pick a game, I'll upload a run at 65 W, and we go from there. Deal?
 
I'm not going to run them all, man; that's why I asked you to pick one, so you couldn't complain I'm cherry-picking. Is Far Cry 6 okay with you?

You can pick what you want, tbh. Yeah, Ubisoft's engine significantly favours Intel, but it would still be interesting to see, and yes, the 3D beating the 12900K was pretty mental to see, because nobody thought AMD would manage that with such a chip. But you probably need to run half a dozen games to get an idea of the average; that is the point of averaging.

One game can always swing the other way. Isn't it F1 2022 that is mental in the AMD direction? I can't remember, but that would hit Intel really hard compared to, say, Far Cry. But yeah, the point is that the review uses multiple games because one game/bench doesn't give the total picture. An average is needed. Of course, the more the better, but that means more time.
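One way to do the averaging being argued for here is a geometric mean of per-game efficiency ratios, so a single outlier title can't dominate the result. A sketch with entirely invented numbers (the geometric-mean choice and the game figures are assumptions, not anything from the review):

```python
# Combine per-game efficiency ratios with a geometric mean so no single
# outlier game dominates. All ratios below are hypothetical placeholders.

from math import prod

# (12900K fps/W) / (5800X3D fps/W) per game -- invented numbers:
ratios = {
    "Far Cry 6": 1.15,  # an engine that favours Intel
    "F1 2022": 0.80,    # an engine that favours AMD
    "Game C": 0.95,
    "Game D": 1.02,
}

geomean = prod(ratios.values()) ** (1 / len(ratios))
print(f"Geometric mean efficiency ratio: {geomean:.2f}")
# A value below 1.0 would mean the 12900K is less efficient on average,
# even though it wins individual titles.
```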
 
Just ran the in-game FC6 benchmark at 25 W with Igor'sLAB's settings. Got an average of 114 and a minimum of 85 fps. That's 4.56 frames per watt, compared to the 2.23 that the 3D gets with the 3090 Ti.
 
Tested it again at 65 watts: 142 average and 105 minimum. That's about 2.2 fps per watt, essentially the same as the 3D gets.


Tested one last time with the E-cores on at 65 W; surprisingly, it does way better. Peak wattage was 61, and fps was 154 average and 116 minimum. I don't know why, but with the E-cores on it doesn't hit the power limit.
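For anyone checking the arithmetic, the efficiency figures in these runs are just average fps divided by package watts. Recomputing from the numbers posted above (using the 61 W peak for the E-cores run, since that cap wasn't reached):

```python
# Recompute the frames-per-watt figures from the posted numbers:
# average fps divided by package power.

results = {
    "12900K @ 25 W": (114, 25),
    "12900K @ 65 W": (142, 65),
    "12900K @ 65 W cap, E-cores on (peaked at 61 W)": (154, 61),
}

for label, (avg_fps, watts) in results.items():
    print(f"{label}: {avg_fps / watts:.2f} fps/W")
# 114 / 25 = 4.56, 142 / 65 ≈ 2.18, 154 / 61 ≈ 2.52 fps/W
```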
 
Star Citizen is the highest gaming load I have for my 5800X: about 65 watts, with clocks at 4.9 GHz about 90% of the time.

Most other games it's about 20 to 40 watts at 4.9 GHz.
Yes, my 3D gets run very hard by SC; it seems to use whatever you throw at it. Maybe it will improve with more optimisation? Have you noticed any difference with 3.17.2?
 