Alder Lake-S leaks

+15% for Zen 3D (in games and some other tasks only?). Zen 4 is rumoured to bring around a 20% IPC increase, so more performance plus higher clock speeds.

Not difficult to believe when Intel's chips will supposedly be 100% quicker than current stuff less than 4 years from now.

And 500 Watts continuous power?

Not difficult to believe when Intel's chips will supposedly be 100% quicker than current stuff less than 4 years from now.

Why would you believe that is a fact?

If every performance claim Intel made when launching a new CPU over the past decade-plus were true, their performance would already be 100% higher than it is now. Honestly, I find it astonishing that every time Intel makes a claim, or there is a rumour about Intel, people take it as given fact, and they do it over and over and over, never learning from the last six times it turned out to be utter male bovine manure. Intel are literally the only semiconductor chip designer who flat out lie about the performance gains of successive CPUs, and yet they also seem to be the only ones whose rhetoric people take without a modicum of scepticism. Why is that? It boggles the mind.

I'll give you an interesting take on all this: Intel have not stagnated due to AMD failing to provide competition. AMD made a massive mistake with Bulldozer that almost bankrupted them, and that's the only reason Intel are even still in this.
 
Really? That's insane. I'm still trying to figure out why Intel are calling it the "Intel 7" process node; I am guessing it's to hide the fact they are not on 7 nm yet.

No surprise.
It's called Intel 7 because it's being released to compete against 7 nm chips; no matter what process they are using, they are competing against the market leader.
 
I'll give you an interesting take on all this: Intel have not stagnated due to AMD failing to provide competition. AMD made a massive mistake with Bulldozer that almost bankrupted them, and that's the only reason Intel are even still in this.
You could argue the stagnation started before Bulldozer though. Sandy Bridge was a major smash in the face to AMD, but Phenom II did hold its own. Ivy Bridge made some significant improvements in power draw but was still a quad core. AMD had to move to 6 cores to keep some kind of performance parity but slipped behind against Haswell. And Haswell was still only 4 cores. Devil's Canyon was a tweaked Haswell, so once again 4 cores, but at this point AMD are now in the Bulldozer era and miles behind.

So yes, Broadwell and Skylake cemented Intel's quad core stagnation, but it was Haswell that truly started it. How would things have panned out if AMD stuck with Phenom III instead of Bulldozer?
 
Tested in AIDA64, the 12900K was pulling 250 watts continuous power, 400 watts peak.

This CPU is on Intel's new "Intel 7" process node; it seems they still haven't figured out how to make a power-efficient x86 architecture.

Yep, I believe I mentioned previously that Alder Lake is, at its building blocks, just a continuation of the Intel Core architecture introduced in 2006. While AMD is rocking a ground-up Zen architecture, Intel is still peddling an inefficient, deadweight architecture from 2006: https://en.m.wikipedia.org/wiki/Intel_Core


The "new from the ground up , efficient and powerful x86" replacement architecture is due for the 15th (thought it probably won't be called 15th gen and will get a totally new naming scheme)
 
You could argue the stagnation started before Bulldozer though. Sandy Bridge was a major smash in the face to AMD, but Phenom II did hold its own. Ivy Bridge made some significant improvements in power draw but was still a quad core. AMD had to move to 6 cores to keep some kind of performance parity but slipped behind against Haswell. And Haswell was still only 4 cores. Devil's Canyon was a tweaked Haswell, so once again 4 cores, but at this point AMD are now in the Bulldozer era and miles behind.

So yes, Broadwell and Skylake cemented Intel's quad core stagnation, but it was Haswell that truly started it. How would things have panned out if AMD stuck with Phenom III instead of Bulldozer?

That was the mistake: instead of changing the architecture so radically, they should have stuck with the more conventional design of Athlon and Phenom and just made it wider and kept improving it. Other than being MCM, Zen is conventional in the same way Athlon / Phenom were.
 
This is eye opening.

Even though Zen is significantly more efficient than what Intel has right now, it can't touch the M1. I think even Apple is surprised by how well they did moving to ARM.

No x86 chip can touch Apple's M1 in performance per watt. The M1 has triple the performance per watt of this Zen 3 5900HX.

Cinebench R23 multi for the 65 W 5900HX = 10.6k. Same benchmark for the 15 W M1 = 7.7k.

So the 5900HX puts up 163 points in R23 per watt, while the M1 puts up 513 points per watt.
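
For anyone who wants to sanity-check those per-watt figures, the arithmetic is just score divided by the quoted power draw. A minimal sketch in Python, using only the numbers quoted above (the 65 W and 15 W figures are the power draws claimed in the video, not independent measurements):

```python
# Points-per-watt from the figures quoted above.
# Power figures are the ones claimed in the video, not measured values.
def points_per_watt(r23_multi_score: float, watts: float) -> float:
    """Cinebench R23 multi-core score divided by power draw in watts."""
    return r23_multi_score / watts

print(f"5900HX: {points_per_watt(10_600, 65):.0f} pts/W")  # ~163
print(f"M1:     {points_per_watt(7_700, 15):.0f} pts/W")   # ~513
```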

 
This is eye opening.

Even though Zen is significantly more efficient than what Intel has right now, it can't touch the M1. I think even Apple is surprised by how well they did moving to ARM.

No x86 chip can touch Apple's M1 in performance per watt. The M1 has triple the performance per watt of this Zen 3 5900HX.

Cinebench R23 multi for the 65 W 5900HX = 10.6k. Same benchmark for the 15 W M1 = 7.7k.

So the 5900HX puts up 163 points in R23 per watt, while the M1 puts up 513 points per watt.


The M1's strong point is its graphics. The M1X is the one to look out for, though; that looks as if it's going to bring the pain to the high-end laptop market: 5900HS levels of CPU performance with RTX 3070 levels of graphics performance, all for ~40 watts. I think the 3070 pulls 130 watts on its own.

The MacBooks are going to blow the high-end gaming laptops out of the water.
 
This is eye opening.

Even though Zen is significantly more efficient than what Intel has right now, it can't touch the M1. I think even Apple is surprised by how well they did moving to ARM.

No x86 chip can touch Apple's M1 in performance per watt. The M1 has triple the performance per watt of this Zen 3 5900HX.

Cinebench R23 multi for the 65 W 5900HX = 10.6k. Same benchmark for the 15 W M1 = 7.7k.

So the 5900HX puts up 163 points in R23 per watt, while the M1 puts up 513 points per watt.


The 5900HX is a 45 W processor, so why didn't they compare against the 15 W 5800U? Everyone knows the harder you push a CPU, the less power efficient it gets.

This comparison doesn't show anything meaningful.
 
The 5900HX is a 45 W processor, so why didn't they compare against the 15 W 5800U? Everyone knows the harder you push a CPU, the less power efficient it gets.

This comparison doesn't show anything meaningful.
Yes. Not sure how accurate CPU Monkey is, but they have the 5900HX scoring 13,875 in Cinebench R23.
https://www.cpu-monkey.com/en/compare_cpu-amd_ryzen_9_5900hx-vs-apple_m1
[screenshot: CPU Monkey 5900HX vs M1 comparison]

They also have the 5800U:
[screenshot: CPU Monkey 5800U results]

And also a preview of the Apple M1X, which is rumoured to be a 35 W/45 W CPU (12 cores, though):
[screenshot: CPU Monkey M1X preview]

What is more impressive for the M1 is the single-core result vs the 5800U:
[screenshot: single-core comparison]


Far more impressive than the CPU scores are the M1's GPU scores.
In the AT review:
https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested/3
it made all other iGPUs look really poor:
[screenshot: M1 iGPU benchmark comparison]
 
The 5900HX is a 45 W processor, so why didn't they compare against the 15 W 5800U? Everyone knows the harder you push a CPU, the less power efficient it gets.

This comparison doesn't show anything meaningful.

Or 35 watts, depending on how it's configured.

I'll watch it later...
 
This is eye opening.

Even though Zen is significantly more efficient than what Intel has right now, it can't touch the M1. I think even Apple is surprised by how well they did moving to ARM.

No x86 chip can touch Apple's M1 in performance per watt. The M1 has triple the performance per watt of this Zen 3 5900HX.

Cinebench R23 multi for the 65 W 5900HX = 10.6k. Same benchmark for the 15 W M1 = 7.7k.

So the 5900HX puts up 163 points in R23 per watt, while the M1 puts up 513 points per watt.



Ok.

First, the 5900HX scored about 35% lower here than it scores in other reviewers' tests. In those tests it's running in laptops at 35 watts; it's usually configurable to 45 watts, but unless the cooling is really good you don't gain anything, or you lose performance due to heavy throttling.
The 5900HX is actually configurable all the way to 54 watts, but you would need desktop levels of cooling for that.

Gamers Nexus tested thermals on the unit they had and the temperature peaked at 77°C; this reviewer's unit immediately hit 91°C and began to throttle.

So, as he was very excited to show, he had his unit running at 65 watts, which would be total system power. It's using way more power than this CPU does in other tests and running way hotter than the same unit in other people's tests, while at the same time scoring about 35% lower.

Now, moving on to his testing of the Apple machine: notice how he was always showing the power consumption of the AMD machine but never showed you the power draw of the Apple machine?
What he did was take the actual power draw of the AMD machine, the total package power, which I have no doubt he configured far too high for that chassis and which was scoring way lower than it should, and compare it against Apple's claimed power consumption of the CPU alone. Not the actual power draw, but Apple's on-the-box claim for the CPU alone.

The Apple M1 is a very power-efficient CPU, but this idiot did everything he could to deliberately make the AMD system draw as much power as possible, about 70% more than it should have been, and score as low as possible, about 35% lower than it should be. I don't know why he did that, but the video right from the start is nothing but "look, amazeballs Apple mini PC crushing everything!" hyperbole.
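
To put rough numbers on that objection, here is a sketch re-running the same points-per-watt sum under this post's own assumptions: the 5900HX scoring about 35% higher (roughly the 13.9k CPU Monkey figure quoted earlier) at its usual ~35 W package power, instead of the 10.6k-at-65 W-system-power figures used in the video. These are the thread's figures, not measurements:

```python
# Points-per-watt re-run under this post's assumptions (not measured data):
# the 5900HX at ~35 W package power with a score ~35% higher than in the video.
def pts_per_watt(score: float, watts: float) -> float:
    return score / watts

video     = pts_per_watt(10_600, 65)         # ~163 pts/W, as computed in the video
corrected = pts_per_watt(10_600 * 1.35, 35)  # ~409 pts/W under the assumptions above
m1        = pts_per_watt(7_700, 15)          # ~513 pts/W, from Apple's on-box TDP

print(f"video: {video:.0f}  corrected: {corrected:.0f}  M1: {m1:.0f}  (pts/W)")
```

On those assumptions the M1 still comes out ahead, but nowhere near the 3x the video claims.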

It's the equivalent of putting a GPU in a sweatbox and screaming "look how hot this GPU is running".
If you want to know what a shill crank looks like, take a good look at this guy.
 
And I thought Austin Evans set the benchmark...

The world of console reviewers is like a literal war of shills, and Apple are absolutely involved in that. It's cut-throat and full of very obvious (to people who understand the hardware at least, i.e. us) male bovine manure coming out of very large and very well-paid channels.
The only reason it works in that world is that the people who buy consoles and under-the-TV mini PCs don't understand the hardware at all; the fact that we are enthusiasts of the hardware is why the people we watch don't do it.

Posting that #### in here is insulting.
 
I literally spat tea all over my MacBook Air when Mr. Evans proclaimed "the new PS5 has a hotter exhaust, so it's OBVIOUSLY worse than the launch model". But we're properly digressing now ;)
 
I literally spat tea all over my MacBook Air when Mr. Evans proclaimed "the new PS5 has a hotter exhaust, so it's OBVIOUSLY worse than the launch model". But we're properly digressing now ;)

No we aren't :) It just means the cooling is more efficient at removing the heat.
 
It just means the cooling is more efficient at removing the heat.

Nope, that is incorrect as well.

Watch the Gamers Nexus video: a hotter exhaust at the back of the machine doesn't mean the SoC is necessarily cooler. You cannot draw that conclusion; it's extremely inaccurate. These thermal cameras are so imprecise that the light coming through your window will throw off the readings, and even moving the console a few centimetres across your desk will give different readings, so the fact that he had the two PS5s on different spots on the desk automatically invalidates the readings, among the other hundred variables he did not account for or calibrate for.

Thermal cameras like FLIR should never be used to draw conclusions about the temperature of anything unless they are used in an industrial setting where every possible variable is controlled and calibrated; any reading that does not comply with these standards is rubbish.

This is just influencer tech kiddies like Evans buying cheap $200 cameras and thinking they can draw scientific conclusions with a device they have no idea how to use and have never been trained or certified to use.

 
Don't know if this was posted earlier but:

[leaked benchmark screenshot]

[leaked benchmark screenshot]


https://www.techpowerup.com/286638/first-tentative-alder-lake-ddr5-performance-figures-leak

785 points in CPU-Z. Apparently it's a 12600K. Latency is crazy high, but this is likely due to not being in Gear 1 mode.


The AIDA64 readings aren't impressive: double the bandwidth compared to DDR4, but also double the latency, so the net effect for gaming will likely be no change. Anything that doesn't care about latency will enjoy the doubled bandwidth.
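
A rough way to see why that trade-off washes out for latency-sensitive work but helps streaming workloads: model a memory access as latency plus transfer time. The DDR4 baseline numbers below are placeholder assumptions purely for illustration, with DDR5 modelled as exactly double both figures, per the leak:

```python
# Toy model of a memory access: total time = latency + transfer time.
# DDR4 baseline numbers are assumed placeholders; DDR5 is modelled as
# exactly double the bandwidth AND double the latency, per the leak.
def access_ns(size_bytes: float, latency_ns: float, bandwidth_gbs: float) -> float:
    # 1 GB/s == 1 byte/ns, so the transfer term is simply bytes / (GB/s)
    return latency_ns + size_bytes / bandwidth_gbs

ddr4 = dict(latency_ns=60.0, bandwidth_gbs=50.0)    # assumed baseline
ddr5 = dict(latency_ns=120.0, bandwidth_gbs=100.0)  # double both, per the leak

# One 64-byte cache-line miss (typical of game workloads): latency dominates,
# so the doubled bandwidth buys nothing here.
print(access_ns(64, **ddr4), access_ns(64, **ddr5))                    # ~61 ns vs ~121 ns

# Streaming 1 GiB: bandwidth dominates, so the time roughly halves.
print(access_ns(2**30, **ddr4) / 1e6, access_ns(2**30, **ddr5) / 1e6)  # ~21 ms vs ~11 ms
```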
 
Nope, that is incorrect as well.

Watch the Gamers Nexus video: a hotter exhaust at the back of the machine doesn't mean the SoC is necessarily cooler. You cannot draw that conclusion; it's extremely inaccurate. These thermal cameras are so imprecise that the light coming through your window will throw off the readings...

Thermal cameras like FLIR should never be used to draw conclusions about the temperature of anything unless they are used in an industrial setting where every possible variable is controlled and calibrated; any reading that does not comply with these standards is rubbish.

This is just influencer tech kiddies like Evans buying cheap $200 cameras and thinking they can draw scientific conclusions with a device they have no idea how to use and have never been trained or certified to use.


Oh please... this is just typical Steve Burke "please don't have a go at reviewers for doing the wrong thing, it's not nice" virtue-signalling crap.

People didn't have a go at Evans for his flawed testing methodology; they had a go at him for his ridiculous conclusions, which Steve himself said were incorrect, to put it in his polite way.

And frankly, yes, I agree with the crap that Evans got for it. If you pretend to be an expert, make factual conclusions based on that pretence and get it so wrong, expect to be ridiculed for it, because there is not much worse than incompetent idiots pretending to be know-it-alls and being so wrong. The internet is full of self-proclaimed experts who have no idea what they are talking about or what they are doing, but they do it in the hope that you are even more stupid than they are, or they assume it to start with, and people like that are the worst kind of shill.
 