
Raptor Lake Leaks + Intel 4 developments

Soldato
OP
Joined
30 Jun 2019
Posts
7,875
Intel said that they were gonna be in for a tough time vs AMD, due to the delays to their 7nm process (under the last CEO). They were originally planning for 7nm chips to be released in 2022, so the generation Intel is releasing in 2022 is the last gasp of their 10nm chips, and not a position they wanted to find themselves in.
 
Soldato
Joined
31 Oct 2002
Posts
9,867
Intel said that they were gonna be in for a tough time vs AMD, due to the delays to their 7nm process (under the last CEO). They were originally planning for 7nm chips to be released in 2022, so the generation Intel is releasing in 2022 is the last gasp of their 10nm chips, and not a position they wanted to find themselves in.

AMD best hope this is the case - as Intel's '7' process (10nm) already got them the performance crown in the vast majority of benchmarks since the 12900K's release back in late 2021, complete with a much more modern platform than AM4.

Raptor Lake leaks suggest it'll beat Zen 4 across the board as well, despite being behind AMD on process node.

Imagine if Intel actually achieved parity with TSMC's process, or simply started manufacturing the flagship CPUs on TSMC's latest node?
 
Caporegime
Joined
22 Nov 2005
Posts
45,304
Intel said that they were gonna be in for a tough time vs AMD, due to the delays to their 7nm process (under the last CEO). They were originally planning for 7nm chips to be released in 2022, so the generation Intel is releasing in 2022 is the last gasp of their 10nm chips, and not a position they wanted to find themselves in.
You know the chips aren't actually 10nm in size, right? And TSMC's aren't their quoted nm in size either? They're both marketing names only, not actual sizes.

Intel just renamed theirs to try to be more in line with TSMC's and comparable.

So Alder Lake and Raptor Lake are not 10nm.

No one has measured in actual nanometres since the 90s.

Skip to 13:03.

Why's Intel's 14nm almost the same size as AMD's 7nm? :p
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
That's irrelevant, it's the same process technology as was used for the 12th generation. No new architecture, or fabrication process, so the 13th gen is an optimization.

Intel's upcoming 7nm EUV process is targeted to have around twice the transistor density of their 10nm process (which they call 'Intel 7').
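
As a rough back-of-the-envelope illustration of what "around twice the density" implies (a minimal sketch; the ~100 MTr/mm² baseline for Intel 7 is an assumed, commonly quoted figure, not something from this thread):

```python
# Rough illustration of the "around twice the transistor density" claim above.
# The baseline figure is an assumed, commonly quoted value, not official data.
intel_7_density = 100.0   # MTr/mm^2, assumed baseline for Intel 7 (10nm ESF)
scaling_claim = 2.0       # "around twice the transistor density"

intel_4_density = intel_7_density * scaling_claim
area_shrink = 1 / scaling_claim  # same logic would occupy this fraction of the area

print(f"Implied Intel 4 density: ~{intel_4_density:.0f} MTr/mm^2")
print(f"Same design would occupy roughly {area_shrink:.0%} of its Intel 7 area")
```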
 
Caporegime
Joined
22 Nov 2005
Posts
45,304
On what metric?
Gate size. Watch the vid from the timestamp I gave; it explains why measuring in nanometres is dumb.
That's irrelevant, it's the same process technology as was used for the 12th generation. No new architecture, or fabrication process, so the 13th gen is an optimization.
It's not irrelevant when you're claiming Intel 7 is 10nm, and I'm assuming you're trying to imply Intel only called it Intel 7 so they don't seem stuck on old nodes.
Which is untrue.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
According to AnandTech, Intel 7 used to be known as 10nm Enhanced SuperFin (i.e. 10ESF). So, it was an optimization of their previous 10nm fabrication technologies. Link here:


The main thing to pay attention to is transistor density (and maybe power/voltage optimizations too); the names aren't really important.
 
Last edited:
Caporegime
Joined
22 Nov 2005
Posts
45,304
Yeah, but it was never 10nm. Intel just rebranded it because TSMC were taking the pee with their naming scheme and getting totally misleading with it.
 
Man of Honour
Joined
30 Oct 2003
Posts
13,260
Location
Essex
There are some absolutely comical power draw figures being thrown about here for consumer-grade chips. I've got 64-core EPYC chips with smaller TDP requirements. If this is the future, I do not like it.

Pushing the power envelope rather than efficiency to "win", given the current climate, seems mad and in the end costs us all.
 
Caporegime
Joined
22 Nov 2005
Posts
45,304
There are some absolutely comical power draw figures being thrown about here for consumer-grade chips. I've got 64-core EPYC chips with smaller TDP requirements. If this is the future, I do not like it.

Pushing the power envelope rather than efficiency to "win", given the current climate, seems mad and in the end costs us all.
Most people would rather have more fps than more efficient fps. Obviously there's a balance somewhere, but you can likely turn off turbo and have that.
 
Man of Honour
Joined
30 Oct 2003
Posts
13,260
Location
Essex
Most people would rather have more fps than more efficient fps. Obviously there's a balance somewhere, but you can likely turn off turbo and have that.

There is a limit though. 400W CPUs just shouldn't be a thing imo. I thought my 3960X at 280W was silly...
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
There is a limit though. 400W CPUs just shouldn't be a thing imo. I thought my 3960X at 280W was silly...
It's not desirable for gamers, especially if it means they have to upgrade their power supply (which is perhaps £100-£160 for a good one). I've got a Seasonic Prime 650W, which seems to be enough for an RTX 3080 + 8-core CPU; total system power draw is probably 500W or more.
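
For a rough sense of the headroom, here's a minimal power-budget sketch; all of the component wattages are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope PSU headroom check for the system described above.
# All wattages are illustrative assumptions, not measured figures.
components_w = {
    "RTX 3080 (board power)": 320,          # assumed typical board power
    "8-core CPU under gaming load": 150,    # assumed
    "motherboard / RAM / fans / SSDs": 60,  # assumed rest-of-system overhead
}
psu_rating_w = 650  # the Seasonic Prime 650W mentioned above

total_w = sum(components_w.values())
headroom_w = psu_rating_w - total_w
print(f"Estimated sustained draw: ~{total_w}W")
print(f"Headroom on a {psu_rating_w}W unit: ~{headroom_w}W "
      f"({headroom_w / psu_rating_w:.0%} of the rating)")
```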
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,705
Location
Uk
It's not desirable for gamers, especially if it means they have to upgrade their power supply (which is perhaps £100-£160 for a good one). I've got a Seasonic Prime 650W, which seems to be enough for an RTX 3080 + 8-core CPU; total power draw is probably 500W or more.
CPUs pull very little power in gaming though, especially at 1440p/4K where the higher-end users generally are. GPUs, though, are another story.
 
Associate
Joined
31 Dec 2010
Posts
2,448
Location
Sussex
It's not desirable for gamers, especially if it means they have to upgrade their power supply (which is perhaps £100-£160 for a good one). I've got a Seasonic Prime 650W, which seems to be enough for an RTX 3080 + 8-core CPU; total power draw is probably 500W or more.
I suspect ATX 3.0 power supplies are going to be a fair bit more than that. They're able to withstand spikes up to double their rating, with that 1300W model being able to cope with up to 2600W.

Those are going to be expensive. Still, it allows spiky things like GPUs to save on circuit boards with lots of expensive capacitors.
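
A minimal sketch of that excursion allowance, using the double-the-rating figure quoted above (the real spec also limits spike duration, which isn't modelled here):

```python
# Simple check against the "spikes up to double the rating" behaviour described above.
def handles_spike(rated_w: float, spike_w: float, excursion_factor: float = 2.0) -> bool:
    """True if a transient spike stays within the allowed power excursion."""
    return spike_w <= rated_w * excursion_factor

print(handles_spike(1300, 2600))  # True  - the 1300W example above
print(handles_spike(650, 1500))   # False - a spike well beyond 2x a 650W unit
```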

CPUs pull very little power in gaming though, especially at 1440p/4K where the higher-end users generally are. GPUs, though, are another story.
"Very little" is of course not a constant. That new Spider-Man game has reports of really high CPU power draw: a 5800X3D at 100W while gaming and a 12900K at around 200W.

Sure, Cinebench probably still draws more, but that game is pretty heavy.
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,705
Location
Uk
"Very little" is of course not a constant. That new Spider-Man game has reports of really high CPU power draw: a 5800X3D at 100W while gaming and a 12900K at around 200W.

Sure, Cinebench probably still draws more, but that game is pretty heavy.
Most games at 4K pull no more than 50W, so I'd be surprised if Spider-Man is drawing that much unless it's bug-related.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
What difference does it make if it's 4K or not? A CPU-intensive game will still be CPU-intensive regardless of the display resolution. It is true that limiting the framerate can help to prevent spikes in CPU utilization, which may cause drops in FPS.

It's easy to max out the cores in strategy games, just by forcing the game to handle more units in a battle, or larger unit sizes (e.g. Total War games).

I suppose the reason games don't generally consume as much power as benchmarks do is mostly down to the CPU instructions used (not much AVX/AVX2), lower overall multithreaded utilization, and lower cache utilization in most cases.
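
On the resolution point above, here's a toy frame-time model, assuming frame time is roughly gated by whichever of the CPU or GPU stage is slower per frame (all the per-frame costs are illustrative assumptions):

```python
# Toy model: frame time is roughly gated by whichever of the CPU or GPU
# takes longer per frame. All per-frame costs here are illustrative assumptions.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0  # assumed CPU cost per frame; it doesn't change with resolution

for label, gpu_ms in [("1080p", 5.0), ("1440p", 8.0), ("4K", 16.0)]:
    bound = "CPU-bound" if cpu_ms >= gpu_ms else "GPU-bound"
    print(f"{label}: ~{fps(cpu_ms, gpu_ms):.0f} fps ({bound})")
# The CPU's per-frame work stays the same; only the limiting stage changes.
```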
 
Last edited:
Caporegime
Joined
22 Nov 2005
Posts
45,304
It's easy to max out the cores in strategy games, just by forcing the game to handle more units in a battle, or larger unit sizes (e.g. Total War games).
It doesn't work like that; they don't just scale onto more cores infinitely.

Some stuff can go on extra cores, sure, but they can't split everything equally; one part of the process will always limit the speed of the rest.
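
That's essentially Amdahl's law: the serial part caps the speedup no matter how many cores you throw at it. A minimal sketch, with the 70% parallel fraction being an illustrative assumption:

```python
# Amdahl's law: the serial portion of the work caps the overall speedup,
# no matter how many cores the parallel portion is spread across.
def speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

p = 0.70  # assumed: 70% of the per-frame work can be split across cores
for cores in (2, 4, 8, 16, 64):
    print(f"{cores:>2} cores: {speedup(p, cores):.2f}x")
# With unlimited cores the speedup still tops out at 1 / (1 - p) ≈ 3.3x here.
```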
 
Associate
Joined
31 Dec 2010
Posts
2,448
Location
Sussex
Well, it looks like Spider-Man is keeping the E-cores busy compiling shaders in real time, and this is why the power usage is so high.

The console version had precompiled shaders, but for some reason they decided to change that for the PC.

Obviously a game which actually keeps most cores utilised is going to use more power.

Or, put another way: saying Alder Lake's potentially high power usage is not a problem for games is only true while games are lightly threaded.
 
Last edited: