Intel Core Ultra 9 285K 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

You can get the exact same indication from the 1080p benchmarks, just to a lesser degree the higher up the resolution scale you go. It's really not that difficult to understand. The advantage of 1080p benchmarks is that you test raw CPU performance, which is the entire, central focal point of a CPU review.

Just saying utilisation doesn't matter doesn't make it so.

You compare your current CPU (or equivalent) to a contemporary CPU. Is the new one faster in 1080p benchmarks? Are you GPU limited? If yes to the latter, don't upgrade. CPUs do not give you more frames, but they can restrict your GPU. This is why util stats help. Again, really not that difficult to understand.
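
That rule of thumb, written out as a quick sketch (the utilisation threshold and the percentages are just illustrative assumptions, not figures from any review):

```python
# Rough sketch of the decision rule above; all numbers are hypothetical.
new_cpu_gain_at_1080p = 0.25   # assumed: new CPU ~25% faster in 1080p benchmarks
gpu_utilisation = 0.97         # assumed: GPU sits near 100% in the games you play

if gpu_utilisation >= 0.95:
    print("GPU limited: a faster CPU can't add frames here, so skip the upgrade.")
elif new_cpu_gain_at_1080p >= 0.15:
    print("CPU limited and the new part is meaningfully faster: upgrade.")
else:
    print("CPU limited, but the uplift is small: probably not worth it.")
```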



Once more, say it with me: 1080p benchmarks test the raw CPU performance. Higher resolutions are influenced by the GPU, which makes them less informative unless you have the same card as the reviewer. With a 1080, an AL or Zen 5 CPU isn't going to do much, if anything, for you. I don't need a 4k benchmark to tell me that. If you can't work out what to buy then that is on you, not CPU testers.

Honestly, this is just willful misunderstanding at this point.
1080p is a major factor, but it's not the only useful focal point, which is what you seem to be missing. If I am not replacing the GPU and am only keeping the CPU a year or two, I don't care or need to know which CPU is better at 1080p. What matters is which CPU has the best cost-to-performance ratio at 1440p and 4k for my current GPU. Looking at the 1080p benchmarks only, or at GPU utilisation, doesn't tell me which is the best CPU to get for 1440p gaming in that situation.

At the moment I am working out whether I want to keep my current GPU and get a new CPU, or get a new CPU with a new GPU. I also need to decide whether to keep the CPU for two years and build a completely new system then, or buy a higher-end CPU and keep the system 5+ years with a GPU upgrade in between.
Looking only at GPU utilisation, or only at 1080p benchmarks, does not answer any part of this question. I need the 1440p and 4k benchmarks.

If we take theoretical CPU A and CPU B and say there is a 30% difference in the 1080p benchmarks but a 1% difference at 1440p and 4k, and there is a £300 price difference between the CPUs, then looking at the 1080p benchmarks as the entire focal point is not going to be helpful by itself. If I am going to keep the CPU long term with a later GPU upgrade, then the more expensive CPU that is faster at 1080p is the better choice. If I am going to change the CPU sooner and/or not upgrade the GPU this cycle, then the cheaper, slower CPU is the better choice. I cannot work that out without the 1080p benchmarks alongside the 1440p and 4k benchmarks showing that both CPUs are within 1% of each other. The 1080p data matters more, but 1080p is not enough by itself; I also need the extra 1440p and 4k benchmarks.
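
To put that scenario into a rough sketch (all numbers below are hypothetical, chosen only to mirror the 30% / 1% / £300 example above):

```python
# Hypothetical CPUs matching the example: £300 price gap, ~30% apart at 1080p
# (CPU-bound), ~1% apart at 4k (GPU-bound with today's card).
cpus = {
    "CPU A": {"price": 600, "fps_1080p": 195, "fps_4k": 101},
    "CPU B": {"price": 300, "fps_1080p": 150, "fps_4k": 100},
}

def pounds_per_fps(cpu, key):
    """Crude cost-to-performance ratio at a given resolution."""
    return cpu["price"] / cpu[key]

for name, cpu in cpus.items():
    print(f"{name}: £{pounds_per_fps(cpu, 'fps_1080p'):.2f}/fps at 1080p, "
          f"£{pounds_per_fps(cpu, 'fps_4k'):.2f}/fps at 4k with the current GPU")

# Keeping the CPU long term through a GPU upgrade: the 1080p column (the CPU's
# own ceiling) is what matters, so CPU A wins despite the price.
# Replacing the CPU again soon and keeping the GPU: the 4k column is what you
# actually get today, so CPU B is the better value.
```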


“Just saying utilisation doesn't matter doesn't make it so.”
I didn't say it doesn't matter at all; it very much does. What I said is that it doesn't matter in isolation. It's a far more useful measurement when combined with other data points; by itself it can be misleading. I cannot decide on an upgrade from utilisation alone, that doesn't work. I made that mistake before: I upgraded a GPU for better RT power expecting a different, non-RT game to gain far more FPS as a side effect, because the utilisation method suggested that is what would happen. Yet I gained 2 FPS, because the utilisation method can be misleading in isolation.


“You compare your current CPU (or equivalent) to a contemporary CPU. Is the new one faster in 1080p benchmarks? Are you GPU limited? If yes to the latter, don't upgrade. CPUs do not give you more frames, but they can restrict your GPU. This is why util stats help. Again, really not that difficult to understand.”
It's not that simple. Am I GPU limited? Yes. But upgrading my CPU will give a large speed increase in the main game I am playing right now, despite my being GPU limited in most of my games. In reality I need to upgrade my CPU, but you're saying I shouldn't. Stellaris is a good example: how fast that game plays has nothing to do with how GPU limited you are and comes down entirely to the CPU. You can very much be in a situation of "Are you GPU limited? If yes to the latter, don't upgrade" where you should upgrade anyway. In Stellaris, even when GPU limited, the speed the game plays and the rate time ticks over is entirely down to the CPU, so a CPU upgrade is always useful even when GPU limited.
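
A minimal sketch of why a game like Stellaris behaves this way, assuming (as a simplification) that the simulation tick runs on the CPU and rendering runs on the GPU, largely independently; the millisecond figures are made up:

```python
# Hypothetical per-tick / per-frame costs in milliseconds.
cpu_ms_per_game_tick = 40.0   # late-game simulation work, pure CPU
gpu_ms_per_frame = 25.0       # rendering cost at your resolution and settings

ticks_per_second = 1000.0 / cpu_ms_per_game_tick   # how fast in-game time passes
frames_per_second = 1000.0 / gpu_ms_per_frame      # how smooth it looks

print(f"Game speed: {ticks_per_second:.1f} ticks/s (set entirely by the CPU)")
print(f"Frame rate: {frames_per_second:.1f} fps (GPU limited in this example)")

# A faster GPU raises frames_per_second but leaves ticks_per_second untouched;
# only a faster CPU makes the late game actually play faster.
```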
 
If we take theoretical CPU A and CPU B and say there is a 30% difference in the 1080p benchmarks but a 1% difference at 1440p and 4k, and there is a £300 price difference between the CPUs, then looking at the 1080p benchmarks as the entire focal point is not going to be helpful by itself. If I am going to keep the CPU long term with a later GPU upgrade, then the more expensive CPU that is faster at 1080p is the better choice. If I am going to change the CPU sooner and/or not upgrade the GPU this cycle, then the cheaper, slower CPU is the better choice. I cannot work that out without the 1080p benchmarks alongside the 1440p and 4k benchmarks showing that both CPUs are within 1% of each other. The 1080p data matters more, but 1080p is not enough by itself; I also need the extra 1440p and 4k benchmarks.

You answered your own dilemma there though - if you need the faster CPU for the long term, you choose the better one by the 1080p benchmarks. If you're going to buy another one in a year, buy according to your budget using the same benchmarks. There is no instance where a £300 cheaper CPU is worse at 1080p and better than a £300 more expensive one is at 4k, which is the only way your dilemma would exist.

It's not that simple. Am I GPU limited? Yes. But upgrading my CPU will give a large speed increase in the main game I am playing right now, despite my being GPU limited in most of my games. In reality I need to upgrade my CPU, but you're saying I shouldn't. Stellaris is a good example: how fast that game plays has nothing to do with how GPU limited you are and comes down entirely to the CPU. You can very much be in a situation of "Are you GPU limited? If yes to the latter, don't upgrade" where you should upgrade anyway. In Stellaris, even when GPU limited, the speed the game plays and the rate time ticks over is entirely down to the CPU, so a CPU upgrade is always useful even when GPU limited.

Lol, if you're CPU limited in one game then you're CPU limited in one game. Don't misrepresent what I said. That nugget of info reinforces what I said - if you're GPU limited, don't bother. You're not GPU limited in this one mystery game. What game are you talking about? Is it Stellaris or a 4X/sim? Because if so, 4k benchmarks aren't any help there either. Again, if you're CPU bottlenecked, figure out your budget and buy the fastest one. 1080p benchmarks give you raw CPU performance.
 
How many people are still on DDR4 platforms despite upgrading their GPU once or twice? They probably saved enough money to buy a brand-new, higher-performance GPU by not having to upgrade to DDR5, because they were well enough informed about the true performance of their CPU when they bought it.

The best gaming CPUs are not the most expensive CPUs, so it's a good thing we can see that, except perhaps if you're Intel, hey? Wouldn't want to be fooled by marketing and a £600 price tag, right?
 
The reality is, though, that by the time the 1080p performance of these CPUs becomes relevant, the vast majority of people who are either using GPUs which won't benefit from that CPU performance or playing at resolutions where it is less important will be moving on anyhow. The fact is, many CPU reviews don't cover enough of the broad picture, while they'll happily benchmark 45 games at 1080p...

Also, pure gaming is one thing, but some of these X3D CPUs aren't ideal if you also do a bunch of related activities, like streaming/OBS, etc., at the same time as playing.
 
The reality is, though, that by the time the 1080p performance of these CPUs becomes relevant, the vast majority of people who are either using GPUs which won't benefit from that CPU performance or playing at resolutions where it is less important will be moving on anyhow. The fact is, many CPU reviews don't cover enough of the broad picture, while they'll happily benchmark 45 games at 1080p...

Also, pure gaming is one thing, but some of these X3D CPUs aren't ideal if you also do a bunch of related activities, like streaming/OBS, etc., at the same time as playing.

With this mentality I would now have a CPU more expensive than the one I bought 4 years ago, and slower, even at 1440P with the GPU I now own.

This is the point: all these companies market their products in a way that makes them look better than they actually are, and people, being what they are, automatically default to most expensive = best. For several generations now that has not been true, unless Intel are competing with Intel, which they aren't.

So with that said, this weird raging against 1080P CPU benchmarks is, in my humble opinion, nothing more than corporate boot licking. 10 years ago the corporate boot lickers came out in force for AMD; now it's Intel.

Yeah, Intel can't get away with charging £600 for CPUs slower than much cheaper ones, because these pesky reviewers do 1080P benchmarking, boo hoo... the only people hurting are Intel; every consumer is properly informed.
 
With this mentality I would now have a CPU more expensive than the one I bought 4 years ago, and slower, even at 1440P with the GPU I now own.

This is the point: all these companies market their products in a way that makes them look better than they actually are, and people, being what they are, automatically default to most expensive = best. For several generations now that has not been true, unless Intel are competing with Intel, which they aren't.

So with that said, this weird raging against 1080P CPU benchmarks is, in my humble opinion, nothing more than corporate boot licking. 10 years ago the corporate boot lickers came out in force for AMD; now it's Intel.

Yeah, Intel can't get away with charging £600 for CPUs slower than much cheaper ones, because these pesky reviewers do 1080P benchmarking, boo hoo... the only people hurting are Intel; every consumer is properly informed.

It would also be impossible to ‘benchmark’ any CPU; you'd instead be ‘benchmarking’ an individual complete system, which would make a CPU benchmark pretty pointless.
 
It would also be impossible to ‘benchmark’ any CPU; you'd instead be ‘benchmarking’ an individual complete system, which would make a CPU benchmark pretty pointless.

You don't have to get into the weeds, just give enough spread of information that people can fill in the gaps themselves and/or get something of a bigger picture. Some reviewers manage to do it; sometimes there is some quite interesting information being missed.
 
“You answered your own dilemma there though - if you need the faster CPU for the long term, you choose the better one by the 1080p benchmarks. If you're going to buy another one in a year, buy according to your budget using the same benchmarks.”
Buying according to my budget doesn't answer the question. The only way to answer the question is to have the 1440p and 4k benchmarks on top of the 1080p benchmarks.



“There is no instance where a £300 cheaper CPU is worse at 1080p and better than a £300 more expensive one is at 4k, which is the only way your dilemma would exist.”
Again you have misrepresented what I said. There are plenty of situations where different CPUs have a wide performance range at 1080p and a tiny performance gap at 4k, at which point you can buy a cheaper CPU and match it up to your target performance range for a better cost-to-performance ratio. But you cannot work out precisely which CPU works best in this situation unless you have the 1440p and 4k benchmarks.

The 1080p benchmarks alone don't tell you the info you need to work out which CPUs are going to give a similar performance level in game X at resolution X. That's why I prefer reviews to have both the 1080p and the 1440p/4k benchmarks. The 1440p/4k benchmarks as additional data are really useful and provide a more complete picture for working out which CPU/GPU match-up I might want to buy.

Telling me to pick a CPU in my budget doesn't answer the question. Looking at the 1440p/4k benchmarks I can mostly answer my question; looking at the 1080p benchmarks only, I have no accurate answer. The 1440p/4k benchmarks allow me to refine my options better than the 1080p benchmarks alone.
 
There are plenty of situations where different CPUs have a wide performance range at 1080p and a tiny performance gap at 4k.

Yes, that is precisely what I and others have been saying all along, and what the direct quote in your post says. There is consistently a tiny or no performance gap at 4k. At that point you are benchmarking a 4090, not a CPU.

You're flip-flopping around so much I don't even know what point you're trying to make (aside from this weird ideological crusade against accurate CPU testing). You're ignoring points where you have no answer and quoting stuff you've clearly misunderstood. If you want to buy Intel, buy Intel. Nobody on YouTube, a tech website or a forum is stopping you. You'll notice little to no difference at 4k in 2024.
 
<snip> But you cannot work out precisely which CPU works best in this situation unless you have the 1440p and 4k benchmarks. <snip>
Actually, you can.

If in the CPU testing, let's say, a 12600K can generate 150 FPS max in a scenario which is NOT GPU limited, e.g. 1080P testing, then that is the MAXIMUM that CPU can generate regardless of the resolution or GPU. Now compare these results with the GPU testing for the same games. If at 4K a given GPU can render 90 FPS, then that system, regardless of the CPU used, will AT BEST render 90 FPS. If it can render 120 FPS at 1440P, it will render 120 FPS AT BEST regardless of CPU, etc. You could put the fastest CPU in the world in that system and still only get 90 FPS at 4K, or 120 FPS at 1440P. In this scenario the limit is the GPU, not the CPU.

If the CPU cannot generate 150 FPS, then REGARDLESS of which GPU you pair it with, your maximum frame rate will be CPU limited.

High-resolution testing is almost always GPU limited, and thus doesn't show what the CPU COULD do. CPU testing is done at low resolution because it eliminates the GPU limit from the equation.
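
That reasoning can be written out as a quick sketch (the 150 FPS ceiling is the 12600K example above; the GPU ceilings are assumed figures, not real review data):

```python
# CPU ceiling comes from a non-GPU-limited (1080p) CPU benchmark; the GPU
# ceilings would come from GPU reviews of the same game at each resolution.
cpu_fps_ceiling = 150                                        # 12600K example above
gpu_fps_ceiling = {"1080p": 200, "1440p": 120, "4k": 90}     # assumed GPU figures

def expected_fps(resolution):
    """The system delivers whichever ceiling is lower at that resolution."""
    return min(cpu_fps_ceiling, gpu_fps_ceiling[resolution])

for res in ("1080p", "1440p", "4k"):
    limiter = "CPU" if cpu_fps_ceiling < gpu_fps_ceiling[res] else "GPU"
    print(f"{res}: ~{expected_fps(res)} fps ({limiter} limited)")
```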
 
The reality is, though, that by the time the 1080p performance of these CPUs becomes relevant, the vast majority of people who are either using GPUs which won't benefit from that CPU performance or playing at resolutions where it is less important will be moving on anyhow. The fact is, many CPU reviews don't cover enough of the broad picture, while they'll happily benchmark 45 games at 1080p...

Also, pure gaming is one thing, but some of these X3D CPUs aren't ideal if you also do a bunch of related activities, like streaming/OBS, etc., at the same time as playing.

It's why I'm still very hesitant about the 9800X3D. All benchmarks, whether GPU or CPU, are done on stripped-down, clean Windows 11 installs with nothing running in the background, and when I see games smash all 8 cores, like Stalker 2 does, I get antsy, as I have tons of background stuff and a cluttered OS.
 
New microcode is out, and an Asus BIOS that mentions improved performance:

 
Yes, that is precisely what I and others have been saying all along, and what the direct quote in your post says. There is consistently a tiny or no performance gap at 4k. At that point you are benchmarking a 4090, not a CPU.

You're flip-flopping around so much I don't even know what point you're trying to make (aside from this weird ideological crusade against accurate CPU testing). You're ignoring points where you have no answer and quoting stuff you've clearly misunderstood. If you want to buy Intel, buy Intel. Nobody on YouTube, a tech website or a forum is stopping you. You'll notice little to no difference at 4k in 2024.
There you go again, misrepresenting what I am saying yet again. I don't have a weird ideological crusade against accurate CPU testing; if that's what you have come away thinking, then clearly you're the one in the wrong here. I am all for complete CPU testing across the entire range of 1080p, 1440p and 4k, as it matters for a complete picture. The weird crusade is saying to look only at 1080p and nothing else. You seem to want to tell me not to look at the 1440p and 4k data despite it helping me make a more informed choice on the optimal CPU and GPU to go for. The 1080p benchmarks alone don't provide me with the information I require for my next full system build, which is the key point you keep skipping over. That's why I want to see 1440p and 4k benchmarks in CPU reviews. The 1080p benchmarks matter more, but there is still useful data to be had from the 1440p/4k benchmarks when placed alongside the 1080p benchmarks.

Since you don't get it, my point is that sometimes the 1440p and 4k data is useful as additional data alongside the primary 1080p data.
I am not arguing against 1080p benchmarks; I am saying it's sometimes useful to see the 1440p and 4k benchmarks as additional benchmarks.

It's not about buying Intel vs AMD. I haven't even decided which brand I am going with this winter for my system rebuild.
 
It's why I'm still very hesitant about the 9800X3D. All benchmarks, whether GPU or CPU, are done on stripped-down, clean Windows 11 installs with nothing running in the background, and when I see games smash all 8 cores, like Stalker 2 does, I get antsy, as I have tons of background stuff and a cluttered OS.
I think a cluttered Windows OS will impact CPU-limited gaming regardless of the CPU being used. A clean Windows 10, for example, gives me better CPU-limited Cyberpunk 2077 performance than this Win11 does, with both a 9800X3D and a 7950XD. And running Win10 using the built-in Administrator account results in yet more performance :mad:
 
It's why I'm still very hesitant about the 9800X3D. All benchmarks, whether GPU or CPU, are done on stripped-down, clean Windows 11 installs with nothing running in the background, and when I see games smash all 8 cores, like Stalker 2 does, I get antsy, as I have tons of background stuff and a cluttered OS.

I'm letting the higher temperatures settle on my brain!!

I've always been a temperature nut, so the idea of a system that hits 85+ degrees takes some getting used to. At least the Intel CPUs use significantly less power when they are idle and run ten degrees cooler, even if they are power-pigs when loaded.

But by all accounts the Intel is no better than my current CPU in games, and that's difficult to swallow.
 
It's why I'm still very hesitant about the 9800X3D. All benchmarks, whether GPU or CPU, are done on stripped-down, clean Windows 11 installs with nothing running in the background, and when I see games smash all 8 cores, like Stalker 2 does, I get antsy, as I have tons of background stuff and a cluttered OS.
Good post, as this is something to question and discuss; it's an area of CPU testing that may be missed, though I get why they do it on a sanitised system, as it's a scientific test and consistency is a key part of that.

Look at the Zen 5 launch review benchmarks: Linux and pre-patch Windows gave different results, and people using just the Windows numbers would have missed key information had others not highlighted that Linux gaming was better and that not all the information is there in a single CPU benchmark. Though weirdly, some people are trying to make out that any questioning, discussion or wanting of more information is somehow wrong, or an Intel vs AMD problem.
 
stripped-down, clean Windows 11 installs with nothing running in the background
In defense of testers, I would say very few do any cleaning except (sometimes) turning off Windows core isolation.
There have been attempts to simulate gaming with background CPU activity, but they mostly end up with either no effect on game performance at all (a standard set of background Steam, Discord, browsers, etc.), or a mixed and non-repeatable result depending on how Windows decides to schedule tasks in each case.
 
There you go again, misrepresenting what I am saying yet again. I don't have a weird ideological crusade against accurate CPU testing; if that's what you have come away thinking, then clearly you're the one in the wrong here. I am all for complete CPU testing across the entire range of 1080p, 1440p and 4k, as it matters for a complete picture. The weird crusade is saying to look only at 1080p and nothing else. You seem to want to tell me not to look at the 1440p and 4k data despite it helping me make a more informed choice on the optimal CPU and GPU to go for. The 1080p benchmarks alone don't provide me with the information I require for my next full system build, which is the key point you keep skipping over. That's why I want to see 1440p and 4k benchmarks in CPU reviews. The 1080p benchmarks matter more, but there is still useful data to be had from the 1440p/4k benchmarks when placed alongside the 1080p benchmarks.

Since you don't get it, my point is that sometimes the 1440p and 4k data is useful as additional data alongside the primary 1080p data.
I am not arguing against 1080p benchmarks; I am saying it's sometimes useful to see the 1440p and 4k benchmarks as additional benchmarks.

It's not about buying Intel vs AMD. I haven't even decided which brand I am going with this winter for my system rebuild.

Maybe this will explain some of the nuts and bolts of what folks are trying to get at here...

The game's engine, minus a few TIIIIIIIIIIINY details (maybe some physics calcs that matter for what the GPU is doing), is using... let's just boil it down to a silly metric and say "CPUNESS".

So the game engine runs at 500 CPUNESS.
The job of this engine is to run all the calculations/etc/etc needed in the game.
It's ALSO making draw calls to whatever API.
So "here, DirectX, I need <THIS> wireframe of a horse wrapped in <THIS> texture and rendered with current settings please".

From this point on, it's (fundamentally) _ALL_ GPU.

So:
CPUNESS @ 1080p: 500
CPUNESS @ 1440: 500
CPUNESS @ 4K: 500

DirectX looks at the resolution, lighting, texture quality, AA/AF in use, and whatever other stuff is needed to render the frame with the current graphics settings, takes the instructions from the engine's draw call as to what is where, and renders it. This is _ALL_ GPU. That's 1 frame of "FPS".

So whether it's a VGA card rendering at 640x400 or at 8k, the game engine, and thus the CPU, _DOES NOT CARE_. It only cares about making calls.

The GPU has to produce everything else.

If the draw call + DirectX (or other API) settings cause the GPU a lot of work, GPU usage goes up (until the GPU is maxed, then CPU usage will start to drop).
If the draw call + DirectX (or other API) settings cause the GPU a little work, GPU usage goes down (until the CPU is maxed, then GPU usage will stabilise).

At the extreme (not very extreme, really) ends: if the draw call results in little work for the GPU, CPU usage goes up/is maxed as the draw call interrupt is cleared and the next call is ready to be serviced (so... 1080p in this case); the CPU can't make enough calls to produce a high load on the GPU before it is maxed out itself. If the draw call results in a lot of work for the GPU, the CPU sits on its next draw call for a few microseconds longer before handing it over. Eventually the GPU is maxed, the CPU sits waiting for it to catch up and take the next call, and you see CPU usage drop.


Now, the point I think you're trying to get to is: "ok, what I want though is to know what CPU will be enough to saturate my GPU at X*Y res/details without overshooting?".

So if a... 3800X + 4090 @ 4K sees 100% CPU/97% GPU, what's the minimum CPU needed for the CPU to be at 95% and the GPU at 100%? Is that what you're driving at?


Outside of this specific need... there's (fundamentally) nothing the CPU is doing that affects 1440p or 4k performance. The metric of 1440p and 4K benchies in these cases for a CPU test would be ultimately pointless, as the extra load being generated isn't subject to the strength of the CPU. Unless every CPU test has a range of benchmarks with a variety of cards of different power showing them in combination, the 1440p or 4K score in these cases has NOTHING to do with the CPU, which is the part being tested.

The CPU runs the engine, maybe a bit of physics, and makes draw calls to whatever API. It's a completely flat, steady workload (relative to the resolution being run at).
The GPU load is the ONLY thing that changes when you change res/detail.

There's no CPU in existence that would make ANY difference to 1440p or 4K scores. It's COMPLETELY down to the GPU.
So the only score that matters is the one least polluted by heavier GPU settings (which is why 1080p is commonly used), i.e. settings that would have a chance to max out the GPU and leave the CPU waiting for it, under-utilised.
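
The same point as a rough sketch, assuming the CPU's per-frame cost is flat across resolutions while the GPU's cost scales with them (all millisecond figures are illustrative only):

```python
# Per-frame cost in milliseconds. The CPU's work (engine + draw calls) is
# roughly constant; the GPU's work grows with resolution and settings.
cpu_ms_per_frame = 5.0                                         # same at every resolution
gpu_ms_per_frame = {"1080p": 4.0, "1440p": 7.0, "4k": 12.0}    # assumed GPU costs

for res, gpu_ms in gpu_ms_per_frame.items():
    fps = 1000.0 / max(cpu_ms_per_frame, gpu_ms)   # the slower side sets the pace
    limiter = "CPU" if cpu_ms_per_frame > gpu_ms else "GPU"
    print(f"{res}: ~{fps:.0f} fps, {limiter} limited")

# Swap in a faster CPU (lower cpu_ms_per_frame) and only the 1080p row changes;
# at 1440p and 4k the GPU term dominates the max(), which is exactly why CPU
# tests are run at low resolution.
```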
 