
Alder Lake-S leaks

Caporegime
Joined
18 Oct 2002
Posts
29,866
The interesting point Intel and AMD have confirmed is that the core count needed to leverage the majority of today’s performance on a typical Windows build is 10-12 cores, with around 20-24 threads.

Damn my 5600x is feeling a little inferior all of a sudden!! :D

Seems like Win11 really is going to be the start of a multi-core push!
 
Soldato
Joined
19 Sep 2009
Posts
2,746
Location
Riedquat system
12600K looks pretty good, a little bit better than my 5800X but not much in it really. Cheaper too depending on mobo prices, if I was building a new PC today not sure what I would go with tbh :p
 
Caporegime
Joined
18 Oct 2002
Posts
29,866
12600K looks pretty good, a little bit better than my 5800X but not much in it really. Cheaper too depending on mobo prices, if I was building a new PC today not sure what I would go with tbh :p

Zen3 still very very capable, not worth the change for sure (unless you're a hardcore nutter ofc :D )

Looking forward to checking out the new V-Cache chips coming soon :)
 
Soldato
Joined
31 May 2009
Posts
21,257
12600K looks pretty good, a little bit better than my 5800X but not much in it really. Cheaper too depending on mobo prices, if I was building a new PC today not sure what I would go with tbh :p

I'd skip the 12600K, I don't see it as impressive enough. I'd aim slightly more expensive at the 12700K, which can match the 5900X and indeed almost match the 12900K in everything
 
Soldato
Joined
20 Aug 2019
Posts
3,031
Location
SW Florida
The windows 11 requirement seems like the biggest downside to me, but people who want the latest and greatest are probably more willing to deal with such things.
 
Associate
Joined
29 Jan 2004
Posts
183
Location
UK
It’s still all energy which needs to be dissipated in some form, and paid for in money. Your utility company is not going to reduce your bill to account for VRM loss, so it’s valid to talk about the draw at the wall.

Yes, you are right, but that's not what was being talked about here. Jay goes on about it going over the PL2 limit he has set, when in reality it hasn't; he just can't do the maths.
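To make the "draw at the wall" point concrete, here's a minimal sketch of the arithmetic, assuming illustrative (not measured) VRM and PSU efficiencies — it shows how a chip honouring its PL2 can still read well above that figure at the socket:

```python
# Rough wall-draw arithmetic: VRM and PSU conversion losses make "at the
# wall" readings exceed the CPU package power limit without the limit
# actually being broken. Efficiency figures are illustrative assumptions.

def wall_draw(package_w: float, rest_of_system_w: float,
              vrm_eff: float = 0.90, psu_eff: float = 0.88) -> float:
    """AC wall draw: CPU package power scaled up by VRM loss, plus the
    rest of the system, all scaled up again by PSU conversion loss."""
    dc_total = package_w / vrm_eff + rest_of_system_w
    return dc_total / psu_eff

# A chip respecting a 241 W PL2, in a system drawing 80 W elsewhere:
print(round(wall_draw(241.0, 80.0), 1))  # ~395.2 W at the wall
```

So a ~395 W wall reading on its own says nothing about whether PL2 was exceeded.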
 
Soldato
Joined
7 Dec 2010
Posts
8,251
Location
Leeds
Why are people arguing about Jay's power test video? He clearly didn't test things right and didn't even show the Windows power options he had set. His idle on Ryzen is way too high, as anyone with a Ryzen chip will tell you, and if you check the video comments people there have hammered him for it. There's also no telling whether the power options were the same on both systems — one maybe on Balanced and one on Performance — and the same goes for the Nvidia power settings, where Performance vs Balanced makes a huge difference to idle power use too. He also never showed whether the cores went into idle states with HWiNFO or some other tool. It's not a fair test for either system.

His whole test was wrong from the start. He didn't even know what settings he had in the motherboard BIOSes, as he embarrassingly showed in the video, and he didn't even have XMP enabled on the Ryzen system at first. The video is nothing but clickbait for ad revenue; he really is taking the mickey recently with these nonsense videos just to get clicks.

Seriously, don't argue about that test video; it is flawed in many ways. Let's hope someone does it right, or read some real power testing done by respected sites with proper equipment and a sound methodology.
 
Associate
Joined
24 May 2015
Posts
500
The windows 11 requirement seems like the biggest downside to me, but people who want the latest and greatest are probably more willing to deal with such things.
Windows 10 performance is great; 11 isn't a requirement. Some older productivity apps have problems though. I'm interested in this new 'turbo button', aka the 'Legacy Game Mode', aka the 'press Scroll Lock to turn off e-cores' feature. Here's my hot take: people will run Windows 10 and toggle e-cores on a game-by-game basis.
 
Caporegime
Joined
17 Mar 2012
Posts
47,657
Location
ARC-L1, Stanton System
Why are people arguing about Jay's power test video? He clearly didn't test things right and didn't even show the Windows power options he had set. His idle on Ryzen is way too high, as anyone with a Ryzen chip will tell you, and if you check the video comments people there have hammered him for it. There's also no telling whether the power options were the same on both systems — one maybe on Balanced and one on Performance — and the same goes for the Nvidia power settings, where Performance vs Balanced makes a huge difference to idle power use too. He also never showed whether the cores went into idle states with HWiNFO or some other tool. It's not a fair test for either system.

His whole test was wrong from the start. He didn't even know what settings he had in the motherboard BIOSes, as he embarrassingly showed in the video, and he didn't even have XMP enabled on the Ryzen system at first. The video is nothing but clickbait for ad revenue; he really is taking the mickey recently with these nonsense videos just to get clicks.

Seriously, don't argue about that test video; it is flawed in many ways. Let's hope someone does it right, or read some real power testing done by respected sites with proper equipment and a sound methodology.


It's an odd one; Jay is odd.

It's clear that when that video started he was expecting to disprove those "ADL double the power" rumours the video was based around; he looked genuinely surprised to see the power figures in Cinebench.

And the other odd thing is how at the start of it all he explained AMD's higher idle draw as being down to that system having a lot more fans and an open-loop water pump, which is fair enough, because you only need to measure and record the baseline to get to the load power draw. But at the end of the video he switched to citing how efficient Intel was at idle and low loads, using the AMD system with all that kit in it as the measurable example. WTF is wrong with him? :cry: It's like he expected the Intel system to draw about the same power as the AMD system under load, realised it does in fact draw twice as much, and thought "oh #### that didn't work, quick, find some other way of making Intel look good, so... that idle power draw on Intel, good isn't it" :cry:
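The baseline point above is simple subtraction: record each system's idle wall draw once, and the load-minus-idle delta cancels out the fans and pump. A quick sketch, with made-up illustrative wattages (not Jay's numbers):

```python
# Baseline subtraction: the constant overhead (fans, pump, drives) appears
# in both the idle and load readings, so the difference isolates the load.
# All wattages below are invented for illustration.

def load_delta(wall_idle_w: float, wall_load_w: float) -> float:
    """Extra wall power attributable to the workload itself."""
    return wall_load_w - wall_idle_w

# Two systems with very different idle baselines, compared fairly:
system_a = load_delta(95.0, 335.0)   # open-loop rig with many fans
system_b = load_delta(60.0, 180.0)   # simpler air-cooled rig
print(system_a, system_b)  # 240.0 120.0
```

Which is exactly why comparing raw idle figures between two differently-built systems tells you nothing about the CPUs.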
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
Why are people arguing about Jay's power test video? He clearly didn't test things right and didn't even show the Windows power options he had set. His idle on Ryzen is way too high, as anyone with a Ryzen chip will tell you, and if you check the video comments people there have hammered him for it. There's also no telling whether the power options were the same on both systems — one maybe on Balanced and one on Performance — and the same goes for the Nvidia power settings, where Performance vs Balanced makes a huge difference to idle power use too. He also never showed whether the cores went into idle states with HWiNFO or some other tool. It's not a fair test for either system.

His whole test was wrong from the start. He didn't even know what settings he had in the motherboard BIOSes, as he embarrassingly showed in the video, and he didn't even have XMP enabled on the Ryzen system at first. The video is nothing but clickbait for ad revenue; he really is taking the mickey recently with these nonsense videos just to get clicks.

Seriously, don't argue about that test video; it is flawed in many ways. Let's hope someone does it right, or read some real power testing done by respected sites with proper equipment and a sound methodology.

I thought he was the hot shot that everyone seems to portray him as...


Windows 10 performance is great; 11 isn't a requirement. Some older productivity apps have problems though. I'm interested in this new 'turbo button', aka the 'Legacy Game Mode', aka the 'press Scroll Lock to turn off e-cores' feature. Here's my hot take: people will run Windows 10 and toggle e-cores on a game-by-game basis.

Funny, I remember when gaming used to be so simple. Fire up a game and done, not closing an air intake to make it run leaner and faster each time going down the straight.
 
Last edited:
Associate
Joined
25 Apr 2017
Posts
1,122
That's the board I was looking at and had originally ordered. Looks like a really nice board but the extra features and aesthetic of the Gigabyte GamingX won it for me considering the exact same price!

Get whatever DDR5 you can get hold of right now; it's all gen-1 DDR5 anyway. Then in a year's time, when DDR5 isn't all high-latency stuff, upgrade.

Thanks, but I just went DDR4. Went for the Asus TUF Gaming Z690 WiFi D4 and will be pairing it with a 3600 CL16 DDR4 kit and a Noctua NH-U12A air cooler. I know it's a gamble, as I don't know whether future games will benefit more from DDR5 or not, but it's just way too overpriced when I can get top-class Samsung B-die kits in that ballpark.

Can't wait to get the 12700K and see how it performs in Cyberpunk 2077 and Far Cry 6. My 9900K gets hammered in those titles, and this should provide enough room for my 3080 Ti to stretch its legs in these games :D
 
Associate
Joined
24 May 2015
Posts
500
Fire up a game and done, not closing an air intake to make it run leaner and faster each time going down the straight.
It's technically not a performance thing; it's there to help games that don't work due to DRM tripping up on the hybrid cores. I can see people using it for performance tweaking though, and I imagine game benchmarks will test both options.
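For anyone on Windows 10, a similar effect can be approximated by pinning a game to the P-cores with a CPU affinity mask. The sketch below assumes a 12600K-style layout (6 P-cores with HT, whose 12 logical CPUs Windows enumerates before the e-cores) — that layout and the core counts are assumptions for illustration, not something confirmed in this thread:

```python
# Build an affinity bitmask covering only the P-core logical CPUs,
# assuming the P-cores (and their HT siblings) are enumerated first.

def p_core_mask(p_cores: int, smt: bool = True) -> int:
    """Bitmask with one set bit per P-core logical CPU."""
    logical = p_cores * (2 if smt else 1)
    return (1 << logical) - 1

# Usable with e.g. `start /affinity <hex mask> game.exe` from cmd:
print(hex(p_core_mask(6)))  # 0xfff  (12 logical P-core CPUs)
```

It's cruder than the BIOS toggle — the e-cores still exist, so DRM that probes topology may still object — but it keeps background tasks off the cores the game is using.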
 
Associate
Joined
28 Sep 2018
Posts
2,267
Why is that?

DDR4, specifically B-die, can run up to 4200+ in Gear 1. When tuned, that gives a lot of bandwidth plus low latency.

The current DDR5 kits you can (kinda) buy are the entry-level Micron kits, which are just plain slow, especially when you take into account the latency hit from running the controller at half the memory speed. You can see that in a lot of reviews where they show DDR4 beating DDR5 in gaming, and understandably so. Here's one such reviewer that I know personally:

[screenshot: DDR4 vs DDR5 gaming benchmark comparison]

So the fix for this is waiting for the high-end kits from SK Hynix and Samsung to show up. Then DDR5 gets to stretch its legs and ADL can pull away from DDR4 easily.

To show you an example of what slow vs mid-range DDR5 kit performance looks like, so you have an idea of what's coming:

This is an entry-level Micron kit manually tuned to the max:

DDR5, CL36-37-37-48, tweaked subs, 5.2 GHz all cores
[screenshot: tuned entry-level Micron DDR5 results]

Here's an SK Hynix kit: CL36-37-37-28, Command Rate 1, 5.2 GHz all-core:

[screenshot: SK Hynix DDR5 results]

And remember, this is NOT the *really* high-end stuff, but still quite good.
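The latency comparison above comes down to simple arithmetic: first-word latency is the CAS cycle count divided by the I/O clock, which is half the MT/s transfer rate. A quick sketch using the kits mentioned in this thread:

```python
# First-word latency in nanoseconds from transfer rate and CAS latency.
# DDR transfers twice per clock, so the I/O clock is half the MT/s figure.

def first_word_ns(mt_per_s: int, cas: int) -> float:
    io_clock_mhz = mt_per_s / 2          # e.g. DDR4-3600 -> 1800 MHz clock
    return cas / io_clock_mhz * 1000     # cycles / MHz -> nanoseconds

print(round(first_word_ns(3600, 16), 2))  # DDR4-3600 CL16 -> 8.89 ns
print(round(first_word_ns(5400, 36), 2))  # DDR5-5400 CL36 -> 13.33 ns
```

Which is why an entry-level DDR5-5400 CL36 kit loses to tuned DDR4 B-die on latency despite the bandwidth advantage — and why faster, tighter Hynix/Samsung kits change the picture.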
 
Last edited:
Soldato
Joined
28 May 2007
Posts
18,257
DDR4, specifically B-die, can run up to 4200+ in Gear 1. When tuned, that gives a lot of bandwidth plus low latency.

The current DDR5 kits you can (kinda) buy are the entry-level Micron kits, which are just plain slow, especially when you take into account the latency hit from running the controller at half the memory speed. You can see that in a lot of reviews where they show DDR4 beating DDR5 in gaming, and understandably so. Here's one such reviewer that I know personally:

[screenshot: DDR4 vs DDR5 gaming benchmark comparison]

So the fix for this is waiting for the high-end kits from SK Hynix and Samsung to show up. Then DDR5 gets to stretch its legs and ADL can pull away from DDR4 easily.

To show you an example of what slow vs mid-range DDR5 kit performance looks like, so you have an idea of what's coming:

This is an entry-level Micron kit manually tuned to the max:

DDR5, CL36-37-37-48, tweaked subs, 5.2 GHz all cores
[screenshot: tuned entry-level Micron DDR5 results]

Here's an SK Hynix kit: CL36-37-37-28, Command Rate 1, 5.2 GHz all-core:

[screenshot: SK Hynix DDR5 results]

And remember, this is NOT the *really* high-end stuff, but still quite good.

Alder Lake is very complicated when it comes to memory speed. On one half of the chip you have the Skylake clusters with a low-latency ring bus, and on the other the Atom cores in a high-latency mesh. I think Alder Lake will be a performance balancing act within a narrow margin when it comes to memory tuning.

I don't think the design will suddenly see double-digit performance gains from DDR5-7500 or from dropping down to CAS 20 1T.
 
Last edited: