
Intel Core Ultra 9 285k 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

OK, I'm not talking about AMD here, it's about Intel... I asked specifically about the 14700K, or maybe the 265K in the future if something gets fixed, whichever will be good in gaming going forward :)

The 14700K is very good but needs a bit of tuning. In gaming, a tuned 14700K + DDR5 is 15-20% faster than stock + XMP, while running cooler and drawing less power in CPU-limited scenarios.

YMMV depending on your tuning knowledge.
 
The 7950X3D cost me more than a new Z890 and a 14700K.

Wait..... you're getting a Z890 motherboard and a 14700K for less than the price of a 7950X3D and you're worried about your existing B650 board not having the VRMs to support a 9900X?

What is this Z890 made out of for you to price it up this way, Lego? Let me see it...
 
OK, I'm not talking about AMD here, it's about Intel... I asked specifically about the 14700K, or maybe the 265K in the future if something gets fixed, whichever will be good in gaming going forward :)
I would go 265K; I accept the risk just to play with something new and hated. Also, the E-cores are faster, which should be better for the stuff I do, and the 14xx chips may have degradation issues.
 
So I was just looking at the specs, and Intel has removed Hyper-Threading in favour of E-cores?
Or are we expecting more CPUs with HT later?

I mean, £290 for a 6-core CPU...
I used to have a 12400 and I swapped it out for a 13500 for the E-cores, and they made zero difference for gaming.
 
So I was just looking at the specs, and Intel has removed Hyper-Threading in favour of E-cores?
Or are we expecting more CPUs with HT later?
Afaik the explanation from Intel is that removing HT from the P-Cores in a hybrid config allows them to optimise the CPU design and scheduler better (to get more performance and power efficiency from the P-Cores), while the E-Cores do the work in heavy multithreading that the extra threads would have done (in the older designs that used HT).
 
The 14700K is very good but needs a bit of tuning. In gaming, a tuned 14700K + DDR5 is 15-20% faster than stock + XMP, while running cooler and drawing less power in CPU-limited scenarios.

YMMV depending on your tuning knowledge.

There is actually a fair bit to get out of a 14700K with tuning, but it is also a ton of faff :s I think how generally meh 14th gen is, and the reviews setting the tone, have overshadowed the 14700K, especially as it is fairly well priced currently. Real-world failure rates so far seem to be around 1%, not the 20%, 50%, 100% or whatever some YT channels are peddling. Whether there is a longer-term sting in the tail regarding the Intel issues remains to be seen, but they certainly aren't dropping like flies in reality. From everything I've seen so far, it seems to be a small number of CPUs displaying aberrant voltage behaviour (if you know what to look for) that are likely the ones which will develop issues.
 
The issue is, if you see FPS well above 100 in CP2077, you know the scene selection is complete crap. You can't trust a thing about the review, because they're clearly clueless.

See here in real time the fps in a real scene with proper settings: https://www.eurogamer.net/digitalfoundry-2024-intel-core-ultra-9-285k-ultra-5-245k-review?page=4

Even the mighty 7800X3D can't do it.
Interesting. How do they whizz around like that and keep the benchmark consistent in CP2077? Good that they show the sequence, though.
 
I don't know why they included an NPU; they should have skipped that. Overall, I think it's not as bad as people are making it sound. Some of the biggest problems are obvious scheduling issues because of the new core layout; this should never have made it past QA. After the scheduling issues are fixed and the price drops, it should be a good option. The 265K and 245K could offer very good price/performance if priced right, as AMD's low/midrange is not great.

I think this release has raised many questions, especially for someone like me who doesn't normally follow CPU development. I would just buy the latest, because it was clearly better than the last generation.

With this generation, though, one of my first questions is: what are the real-world numbers here? I mean, most of these tests are at 1080p, but that is completely unrealistic. Who plays at 1080p these days? Running tests at 1080p may highlight the CPU performance, but what's the point if no one plays at that resolution? That margin closes when we get nearer to resolutions that people actually play at. There are some results that put it well behind the last generation, but until we know WHY, it's difficult to judge.

Yep, it seems to me there is a real danger that these CPUs will be totally written off when in fact they might not be as bad as they first appear to be. I dunno, maybe that's just me desperately trying to save a sinking ship!?

We also have to look at gaming CPU design in general. Have we just reached an apex with gaming CPUs? Can we just not expect year-on-year gains now? Is gaming important to Intel? I mean, the CPU is more powerful than the last generation in business applications, so does gaming matter? Does Intel need gaming-centric CPUs?

So many questions.
 
I think this release has raised many questions, especially for someone like me who doesn't normally follow CPU development. I would just buy the latest, because it was clearly better than the last generation.

With this generation, though, one of my first questions is: what are the real-world numbers here? I mean, most of these tests are at 1080p, but that is completely unrealistic. Who plays at 1080p these days? Running tests at 1080p may highlight the CPU performance, but what's the point if no one plays at that resolution? That margin closes when we get nearer to resolutions that people actually play at. There are some results that put it well behind the last generation, but until we know WHY, it's difficult to judge.

Yep, it seems to me there is a real danger that these CPUs will be totally written off when in fact they might not be as bad as they first appear to be. I dunno, maybe that's just me desperately trying to save a sinking ship!?

We also have to look at gaming CPU design in general. Have we just reached an apex with gaming CPUs? Can we just not expect year-on-year gains now? Is gaming important to Intel? I mean, the CPU is more powerful than the last generation in business applications, so does gaming matter? Does Intel need gaming-centric CPUs?

So many questions.
1080p is used to remove GPU bottlenecks and show the real performance difference between CPUs without the GPU being the limiting factor. Think of it like this: at 1080p, CPU A shows as faster and CPU B as slower, while 4K doesn't show much of a difference because both are limited by the GPU. Later on, a new GPU comes out that is not limited at 4K in that game. Now CPU A pulls massively ahead at 4K and CPU B falls behind, matching the 1080p results.
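That logic fits a toy model (all FPS numbers below are made-up for illustration): the delivered frame rate is capped by whichever of the CPU or GPU is slower, so only a low resolution exposes the CPU gap:

```python
# Toy model: delivered FPS = min(what the CPU can prepare, what the GPU can render).
# The cap values are hypothetical, chosen only to illustrate the bottleneck effect.
cpu_caps = {"CPU A": 180, "CPU B": 120}   # frames per second the CPU can feed
gpu_caps = {"1080p": 200, "4K": 90}       # frames per second the GPU can draw

for res, gpu_fps in gpu_caps.items():
    for cpu, cpu_fps in cpu_caps.items():
        fps = min(cpu_fps, gpu_fps)
        print(f"{res}: {cpu} -> {fps} fps")
# At 1080p the GPU cap (200) is above both CPUs, so the 180-vs-120 gap shows.
# At 4K the GPU cap (90) is below both CPUs, so both deliver an identical 90 fps.
```

Swap in your own estimates; the point is only that the min() hides the CPU difference whenever the GPU cap sits below both CPUs.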

It's early days, but I suspect Arrow Lake might not be as bad as it appears. Once the software bugs are fixed it might be good: roughly a 25% performance boost from changing the Windows default power profile to High performance, and a 20% FPS boost from changing Cyberpunk's core usage priority to P-cores over E-cores, and now you're potentially at or above 14900K speeds at better power efficiency. Like I said, early days, but it's starting to look like the problem is not the CPU but power and core priority being incorrectly assigned via software, and that's very fixable.
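For reference, the power-profile change mentioned above can be done from an elevated Windows prompt; the GUID below is the built-in High performance scheme on stock installs, but verify it on your own machine with `powercfg /list` first:

```shell
# List the available power schemes and note which one is currently active
powercfg /list

# Switch to the built-in High performance scheme
# (8c5e7fda-... is its well-known GUID on stock Windows installs)
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c
```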
 
I think this release has raised many questions, especially for someone like me who doesn't normally follow CPU development. I would just buy the latest, because it was clearly better than the last generation.

With this generation, though, one of my first questions is: what are the real-world numbers here? I mean, most of these tests are at 1080p, but that is completely unrealistic. Who plays at 1080p these days? Running tests at 1080p may highlight the CPU performance, but what's the point if no one plays at that resolution? That margin closes when we get nearer to resolutions that people actually play at. There are some results that put it well behind the last generation, but until we know WHY, it's difficult to judge.

Yep, it seems to me there is a real danger that these CPUs will be totally written off when in fact they might not be as bad as they first appear to be. I dunno, maybe that's just me desperately trying to save a sinking ship!?

We also have to look at gaming CPU design in general. Have we just reached an apex with gaming CPUs? Can we just not expect year-on-year gains now? Is gaming important to Intel? I mean, the CPU is more powerful than the last generation in business applications, so does gaming matter? Does Intel need gaming-centric CPUs?

So many questions.

51% of Steam users are on 1080p screens, and 240Hz, even 360Hz, screens are readily available. There are a lot of people who run a 1080p high-refresh-rate screen, and they would be looking for the most performant CPU. Who plays at 1080p these days? E-sports players.

Outside of that, 1080p CPU testing is valid for people looking for future-proofed CPUs; they might go through one or even two GPU upgrade cycles for every one CPU. I've run 3 different GPUs on my 5800X, and for that I'm glad of 720p testing, because I knew at the time the CPU was capable of driving GPUs very much more powerful than what was available, even if I do game at 1440p. 720p on a 2080 Ti is like 1440p on a 3090.

These are the very same arguments AMD loyalists used to make when Bulldozer couldn't keep up with anything from Intel, exactly the same arguments. Those same people don't make those arguments now, and Intel loyalists didn't make them then.
 
Afaik the explanation from Intel is that removing HT from the P-Cores in a hybrid config allows them to optimise the CPU design and scheduler better (to get more performance and power efficiency from the P-Cores), while the E-Cores do the work in heavy multithreading that the extra threads would have done (in the older designs that used HT).
It's the power for 16 low-power cores that has to add up… I mean, what would 16 E-cores be equivalent to in normal cores?

Would a 10- or 12-core CPU not just have been better (for gaming)?

I personally think it's the massive number of E-cores Intel is using that is putting power usage through the roof.
The 285K in gaming mode pulls 300W. How can they say it's better than a 14900K when it pulls less power but does less work?

For me, Intel needs to abandon the E-cores idea; it obviously doesn't work.

EDIT: I just found this statement. It's from the 13th/14th gen era, but I assume the same holds for the new CPUs:
"E-cores are not used for gaming because they don't contain the necessary instruction sets. An E-core is not a CPU core, because it cannot run a computer on its own. An E-core is an accelerator."
So if this stands for the new CPUs, then the £290 245K is a 6-core CPU with 8 accelerators.
It's about 8% faster than a 14400F at about double the cost (in games).
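On the core-priority point raised earlier in the thread, the "keep the game on P-cores" idea can be sketched on Linux with the standard library alone. The core numbering below is an assumption for illustration (check `lscpu --extended`; real hybrid layouts vary), and `os.sched_setaffinity` is Linux-only, so on Windows you would use Task Manager's affinity setting or a third-party tool instead:

```python
import os

# Assumed topology for illustration only: logical CPUs 0-11 are P-core threads,
# the rest are E-cores. Verify the real layout with `lscpu --extended`.
ASSUMED_P_CORES = set(range(12))

def pin_to_p_cores(pid: int = 0) -> set:
    """Restrict a process (0 = the current one) to the assumed P-core set."""
    # Only request CPUs that actually exist, so the sketch runs on any machine;
    # fall back to the current mask if none of the assumed IDs are present.
    available = os.sched_getaffinity(pid)
    target = (ASSUMED_P_CORES & available) or available
    os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)

print(sorted(pin_to_p_cores()))
```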
 