Alder Lake-S leaks

Caporegime
Joined
17 Mar 2012
Posts
47,578
Location
ARC-L1, Stanton System
I ######### hate that reasoning and attitude.

Steam says 8 cores or less is what most people buy, so let's just keep making 8 cores or less; it's self-perpetuating.

It's why we needed AMD to torpedo Intel's deliberate stagnation, and we continue to need them to brute-force things forward.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
The 10400F is fine for anything up to a 3060 Ti, but for the 12400F it depends on the overall cost. I think it will go to £200, the motherboard another £200 and DDR5 another £200.

That's £600. Even with the 5600X being overpriced right now, you can still get it with a half-decent platform for about £450.
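The comparison above is simple addition over the poster's guessed UK prices (none of these are confirmed retail figures). A quick sketch of the totals being compared:

```python
# Rough platform cost comparison using the thread's estimated prices (GBP).
# All figures are the poster's guesses, not confirmed retail pricing.
platforms = {
    "12400F + DDR5": {"cpu": 200, "motherboard": 200, "ram": 200},
    "5600X + half-decent board": {"bundle": 450},
}

for name, parts in platforms.items():
    total = sum(parts.values())
    print(f"{name}: £{total}")
# → 12400F + DDR5: £600
# → 5600X + half-decent board: £450
```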

It could handle a 3090 too, tbh, but yeah, it's going to come down to total cost. For a while the 11400F was almost half the price of the 5600X - that's hard to beat!

https://www.capframex.com/tests/The true star of the 11th gen
 
Soldato
Joined
30 Jun 2019
Posts
7,875
Well, 10nm Sapphire Rapids will probably get up to 64 cores. Servers get the best tech, as always. If you were Intel, deciding which CPU lines should get more cores, what would you chaps pick?

There's always AMD for those who want that 1% performance increase (or for the thing known as work) with 16 cores or more :cry:
 
Soldato
Joined
28 May 2007
Posts
18,243
Rocket Lake was a new microarchitecture intended for 10nm but backported to 14nm.

Alder Lake is another new microarchitecture that was designed for Intel's 10nm node. Other 10nm CPU series on laptops, like Ice Lake and Tiger Lake, haven't had issues with core-to-core latency, so there's no reason why the 12th gen would.

Not really. Alder Lake is a new architecture in the sense that it's two old architectures in one.

In simple terms, Intel have taken a pair of 4c clusters and linked them together, similar to what AMD did with the 1800X. That chip is then linked to another cluster of 8 Atom cores.

The interconnects between the clusters will add latency. The mixed topology of two different bus types will also add latency. Essentially Intel have a three-chiplet design, compared to, say, the 5900X, which has two.

How the IO is connected could also add latency. One memory channel linked to each 4c chip, with 8 lanes of PCIe direct per chip. The 8 Atom cores could have the NVMe and SATA connected through a northbridge and linked to the 2x4c clusters through a crossbar, for example. So if core 0 (a big core) wants to access data on the NVMe, it would have to make its way around the three other cores, over the 4c interconnect, through a crossbar, through the mesh of Atom cores, over the northbridge and to the NVMe drive.
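To be clear, the topology described above is this poster's speculation, not Intel's published design. The point about latency stacking up hop by hop can be sketched like this, with every hop cost invented purely for illustration:

```python
# Toy model of the speculated path from a big core to the NVMe drive:
# own 4c cluster -> cluster interconnect -> crossbar -> Atom mesh -> northbridge.
# The topology is the thread's guess and all hop costs (ns) are made up.
hops = [
    ("traverse own 4c cluster", 10),
    ("4c-to-4c interconnect", 20),
    ("crossbar", 15),
    ("Atom core mesh", 25),
    ("northbridge to NVMe", 30),
]

total_ns = 0
for name, cost in hops:
    total_ns += cost
    print(f"after {name}: {total_ns} ns cumulative")
```

The sketch just shows why each extra bus type in the path compounds the round-trip cost, which is the worry being raised.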
 
Soldato
Joined
28 May 2007
Posts
18,243
I think you all need to look at this, to understand Alder Lake and what Intel is doing better:
https://store.steampowered.com/hwsurvey/cpus/

Over 97% of CPUs this year (at least for gamers) have 8 cores or fewer.

Current gen consoles have 8 CPU cores.

The magic number is 8, it is written.

1.07% have 12 core CPUs.

It could be that 8 small cores will have roughly equivalent performance to 4 large cores, for example - if things go well, that is. That may explain the 'logic' of smaller cores. That, and they are scalable. And marketing reasons, e.g. 'more core make CPU go brum'.
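The rough equivalence guessed at above - 8 small cores matching 4 large ones - amounts to assuming a small core delivers about half the throughput of a big core. With that assumed ratio (a guess, not a measured figure), the trade looks like:

```python
# Hypothetical per-core throughput units: these ratios are guesses used
# only to illustrate the '8 small ~ 4 big' equivalence speculated above.
BIG, SMALL = 1.0, 0.5

small_cluster = 8 * SMALL  # 8 efficiency cores
big_cluster = 4 * BIG      # 4 performance cores
print(small_cluster, big_cluster)  # → 4.0 4.0
```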

97% of Chinese gamers that take the steam survey. They also game at 1280x720.
 
Soldato
Joined
28 May 2007
Posts
18,243
We'll see won't we. How does the core config of Alder Lake compare to 8 / 12 core Comet Lake CPUs? These chips have decent latency.

They are a little different. Think of it like this. 6700K + latency + 6700K + latency + Atom. With maybe a crossbar mixed in.
 
Caporegime
Joined
17 Mar 2012
Posts
47,578
Location
ARC-L1, Stanton System
It could handle a 3090 too, tbh, but yeah, it's going to come down to total cost. For a while the 11400F was almost half the price of the 5600X - that's hard to beat!

https://www.capframex.com/tests/The true star of the 11th gen

At 4K, sure, a potato could drive a 3090, but what if a 4090 is actually 80% faster? Then what? Just spend the £600 on the 5800X platform then, after you've spent £400 on the 11400F platform?

And again, are we just going to ignore eSports gamers? Or anyone who wants high refresh rates?

Back in the Bulldozer days all reviewers benchmarked them at low resolution and low settings, arguing that this shows the real difference between CPUs, to properly inform people.

All the AMD fanboys hated it, arguing "but no one plays games at those resolutions, and it's fine at resolutions people do play games at", which at the time was true.

These days people are listening to those arguments, only this time it's the Intel lot who are happy to see CPU performance in reviews be strangled by the GPU being the bottleneck. Which is what the Bulldozer AMD fanboys wanted.

It's like I'm trying to get people to understand the method behind the madness all over again, only this time it's because the current method is hiding something rather than showing it for what it is.
 
Caporegime
Joined
17 Mar 2012
Posts
47,578
Location
ARC-L1, Stanton System
JayzTwoCents recently put out a video revisiting the Bulldozer CPUs, using a lower-end GPU, a 1660 I think it was, to see if it could still game as a low-end system.

It could. Using several popular games like Doom Eternal etc., the old Bulldozer CPU was still pulling 60 to 100 FPS no problem.

His conclusion was it's really not bad at all as a lower-end GPU driver, plenty capable.

I haven't been back to the video, but I bet it's full of people screaming "now do it with a higher-end GPU and see how bad it is".

Which, about a week later, is exactly what Hardware Unboxed did, like an absolute ####!
 
Caporegime
Joined
17 Mar 2012
Posts
47,578
Location
ARC-L1, Stanton System
I play competitive shooters at high refresh rates. Even with my 2070S and a 75Hz screen, I'll turn the IQ to low to get a steady 200+ FPS, 300+ FPS... as high as I can get them, and yes, I can feel the difference.

Let's not pretend I'm alone in that.
 
Soldato
Joined
28 May 2007
Posts
18,243
I’ve just got a high refresh rate screen without any quality control issues. Not an easy task! Just need to prioritise the internal network now and see what type of ping I can get. Some autumn and winter gaming might be on the cards.
 