Intel Core Ultra 9 285K 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

...Personally I have a lot of problems with CPU reviews not giving a wide enough spread of information and often being misleading, e.g. counter-intuitively, the power use of some CPUs in gaming rises significantly at 4K over 1080p, etc.

This could be correct, but do you have any examples of games where this is shown to be the case?
 
Higher power draw at 4K would only make sense if the CPU is working harder, and that's only possible if the game is doing something different based on the resolution - for example, a graphical effect that scales with resolution and uses the CPU to calculate it.
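Here's a contrived sketch of how that could happen (hypothetical, with invented numbers - not taken from any real engine): a CPU-side screen-space effect whose work scales with the pixel count.

```python
# Hypothetical illustration: a CPU-side effect whose cost scales with pixel count.
# The particles-per-megapixel figure is invented; real engines vary widely.

RESOLUTIONS = {"1080p": 1920 * 1080, "4K": 3840 * 2160}

def cpu_particles(pixels: int, particles_per_megapixel: int = 500) -> int:
    """A screen-space effect that spawns CPU-simulated particles per megapixel."""
    return pixels // 1_000_000 * particles_per_megapixel

for name, pixels in RESOLUTIONS.items():
    print(f"{name}: ~{cpu_particles(pixels)} CPU-simulated particles per frame")

# 4K has 4x the pixels of 1080p, so this pass does ~4x the CPU work per frame,
# which could push CPU package power up even while overall FPS drops.
```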
 
With the 7800X3D, some reviewers found power draw increased with resolution - I'm not sure why - and it has also been seen with some Intel 11th gen CPUs.

One would presume it's to do with core utilisation. Some graphics stuff is easier to parallelise than core gameplay?
 
One would presume it's to do with core utilisation. Some graphics stuff is easier to parallelise than core gameplay?

I don't know, and unfortunately not enough reviewers do the testing to tell whether it is actually a thing or a mistake on the part of those reviewers. For example:

When comparing the AMD Ryzen 7 7800X3D to the more expensive Ryzen 9 7950X3D, we noticed that the latter draws the most power when gaming at 1080p in F1 2022 with high settings. This is because 1080p is a resolution that relies more on CPU power than GPU power. During our testing, we observed power levels ranging between 71 and 76 watts on the 7950X3D at 1080p. However, when we switched to 4K, which is more GPU-dependent, the power draw on the 7950X3D ranged between 55 and 70 watts during our test run.


The behavior of the AMD Ryzen 7 7800X3D while gaming becomes even more intriguing and slightly strange, as the CPU draws more power at 4K than at 1080p. Although we cannot determine the exact reason for this, we do know that some CPU-driven visual effects in modern games are resolution sensitive, meaning there are edge cases where portions of the CPU workload can increase.


It isn't the only CPU which exhibits that behaviour though.
 
But viewership is how those videos are funded - through advertising share, through sponsorships, through merch sales, and through Patreon - which means consumers are customers.



The point of reviews is to educate consumers on their purchase choices, and part of how that is done is educating consumers on how to interpret the information they're receiving. The customer being wrong is an inherent part of any communication which is intended to inform - you want them to be less wrong after they've watched your video.



On the contrary, 1080p is the most relevant resolution to most users (since most users still use 1080p), and it's the one most capable of reflecting differences in CPU performance. A CPU purchased today doesn't just need to run today's games well; it also needs to run games 4, 5, or more years in the future well. A benchmark that is GPU limited can't tell you anything about that.
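One way to see why: to a first approximation, each frame takes as long as the slower of the CPU and GPU. A toy model (all millisecond figures invented for illustration) shows how a GPU-limited 4K benchmark hides a CPU gap that a faster future GPU would expose - and that a 1080p test reveals today:

```python
# Toy model: frame time ~= max(CPU frame time, GPU frame time).
# All millisecond figures below are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when the slower of CPU/GPU sets the frame time."""
    return 1000 / max(cpu_ms, gpu_ms)

cpu_a, cpu_b = 5.0, 8.0                   # CPU A is markedly faster than CPU B
gpu_4k_today, gpu_4k_future = 16.0, 6.0   # today's GPU at 4K vs a future upgrade

print(fps(cpu_a, gpu_4k_today), fps(cpu_b, gpu_4k_today))    # 62.5 vs 62.5 - gap hidden
print(fps(cpu_a, gpu_4k_future), fps(cpu_b, gpu_4k_future))  # ~166.7 vs 125.0 - gap exposed

# At 1080p the GPU time shrinks, so a 1080p review exposes today the same CPU
# gap that only shows up at 4K once a faster GPU arrives.
```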

And so do many other users, it seems - I constantly see them complaining that 1080p is not relevant to them.

For example, I was all hyped up by the 9800X3D, which showed massive performance gains over Intel in the reviews. But once I started to spend time digging under the hype, the truth is that I am unlikely to see a worthwhile improvement over my current CPU. I didn't get that from the popular reviews; I got it from comparisons by enthusiasts.

I don't care what their reasoning is, what I care about is that the reviews initially led me astray.

They need to change their tests to be more oriented towards realistic situations. How they do that - that's not my problem.

And I completely disagree with you about users - I don't think for an instant that people buying cutting-edge gear are playing at 1080p.

We clearly aren't going to agree, and it's obviously pointless restating the same arguments, so might as well agree to disagree. I shall remain silent!
 
And so do many other users, it seems - I constantly see them complaining that 1080p is not relevant to them.

For example, I was all hyped up by the 9800X3D, which showed massive performance gains over Intel in the reviews. But once I started to spend time digging under the hype, the truth is that I am unlikely to see a worthwhile improvement over my current CPU. I didn't get that from the popular reviews; I got it from comparisons by enthusiasts.

I don't care what their reasoning is, what I care about is that the reviews initially led me astray.

They need to change their tests to be more oriented towards realistic situations. How they do that - that's not my problem.

And I completely disagree with you about users - I don't think for an instant that people buying cutting-edge gear are playing at 1080p.

We clearly aren't going to agree, and it's obviously pointless restating the same arguments, so might as well agree to disagree. I shall remain silent!


Let's be real here. If the CPU that's being reviewed is not showing a performance gain at 4K, it's because the GPU is too slow at that resolution. So the real question is: why are you even looking at a potential new CPU when your GPU is too slow?

That's the catch-22 for all the people who say they want 4K benchmarks - why do you want 4K benchmarks when your GPU is too slow? Maybe you should wait until you get a better GPU. Or did you really think the framerate was going to go up when your GPU is at 100% load and you slap a new CPU in? How naive.

That's why Steve gave some great advice when he said: 1) look at your GPU utilisation at the resolution you play games at, and 2) decide what framerate you want from your game. If your GPU utilisation is 100%, you need a better GPU - stop looking at CPU reviews, because no CPU is going to give you more frames when your GPU is too slow, mate. People seem to have a misunderstanding of what a CPU does; it doesn't generate frames.
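For the curious, here's a rough sketch of how you could apply that advice on an NVIDIA card: sample GPU utilisation with nvidia-smi while the game runs. This is illustrative only - it assumes nvidia-smi is on your PATH, and the 95% cut-off is just the rule of thumb above, not a hard spec:

```python
# Sketch: sample GPU utilisation during gameplay to guess the bottleneck.
# Assumes an NVIDIA GPU with nvidia-smi available on PATH.
import subprocess
import time

def sample_gpu_util(samples: int = 60, interval_s: float = 1.0) -> list[int]:
    """Poll nvidia-smi once per interval and collect GPU utilisation percentages."""
    readings = []
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        readings.append(int(out.splitlines()[0]))  # first GPU only
        time.sleep(interval_s)
    return readings

if __name__ == "__main__":
    util = sample_gpu_util()
    gpu_bound_share = sum(u >= 95 for u in util) / len(util)
    print(f"GPU at >= 95% for {gpu_bound_share:.0%} of samples:",
          "likely GPU limited" if gpu_bound_share > 0.8 else "possibly CPU limited")
```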
 
For example, I was all hyped up by the 9800X3D, which showed massive performance gains over Intel in the reviews. But once I started to spend time digging under the hype, the truth is that I am unlikely to see a worthwhile improvement over my current CPU. I didn't get that from the popular reviews; I got it from comparisons by enthusiasts.

I don't care what their reasoning is, what I care about is that the reviews initially led me astray.
As someone that does a lot of specs for new builds/upgrades on the forum, I agree it's something buyers would benefit from reviewers addressing, even if it added a lot of testing time that most reviewers would consider pointless.

I see a lot of upgrades that I don't personally consider worthwhile, and I do believe the hype from reviews plays a large part in pushing buyers into the next cycle.

I try to provide links to videos and articles instead of just "Tetras said so", but every time I do that, I can't add a bunch of disclaimers about how to accurately interpret those results (especially within their personal context). Steve's data is useful for me, because I can do that, but can the average viewer? I suspect they can't. I guess that's why there are often concluding statements in GN's videos like: "if you have a previous gen CPU, this is not really meant for you".

Arrow Lake is something of a case in point there: nobody really asks about it (I've done maybe 2 or 3 specs since it launched a month ago), and I think that's partly because the hype from reviews is: 9800X3D great, Intel bad. I liked der8auer's review because I thought he gave me some good points to put the platform/launch in perspective, e.g. for some productivity use cases the 265K might actually make some sense, even for a person who games a lot.
 
As someone that does a lot of specs for new builds/upgrades on the forum, I agree it's something buyers would benefit from reviewers addressing, even if it added a lot of testing time that most reviewers would consider pointless.

I see a lot of upgrades that I don't personally consider worthwhile, and I do believe the hype from reviews plays a large part in pushing buyers into the next cycle.

I try to provide links to videos and articles instead of just "Tetras said so", but every time I do that, I can't add a bunch of disclaimers about how to accurately interpret those results (especially within their personal context). Steve's data is useful for me, because I can do that, but can the average viewer? I suspect they can't. I guess that's why there are often concluding statements in GN's videos like: "if you have a previous gen CPU, this is not really meant for you".

Arrow Lake is something of a case in point there: nobody really asks about it (I've done maybe 2 or 3 specs since it launched a month ago), and I think that's partly because the hype from reviews is: 9800X3D great, Intel bad. I liked der8auer's review because I thought he gave me some good points to put the platform/launch in perspective, e.g. for some productivity use cases the 265K might actually make some sense, even for a person who games a lot.

Yes, I mean I do get that "is it worth upgrading?" is a much more difficult question, but if it is a question people have, then it is a question the reviewers should be trying to answer.

My concern, I suppose, is that the reviewers are destroying or hyping components based on unrealistic tests. The majority of people upgrading are not going to see the major improvements the CPU tests suggest.

Even when comparing the Ultra 7 to the 14700K, if the CPU is not a bottleneck, why are they recommending it or destroying it based on the "assumption" that it is? Or will be in the future? They almost seem to be going out of their way to make dramatic news where there isn't any. They need more realistic tests, or to stop being so dramatic.

I suppose the good news, as a consumer, is they have destroyed the Intel chips now, and that may well force their prices low, even if Intel offers up fixes for some of the weird problems they have.
 
True, but people on 1080p are often buying towards the lower end of CPUs, and in 5 years' time, if CPU limited, you are often in a situation where the difference is reduced to something like 33 fps vs 35 fps, or a CPU which didn't do great originally is now the more useful one due to being able to use all its cores, etc.

Personally I have a lot of problems with CPU reviews not giving a wide enough spread of information and often being misleading, e.g. counter-intuitively, the power use of some CPUs in gaming rises significantly at 4K over 1080p, etc.

I recall one reviewer saying that if they didn't run the tests at 1080p, all the results would be the same. Well, DUH!
 
Let's be real here. If the CPU that's being reviewed is not showing a performance gain at 4K, it's because the GPU is too slow at that resolution. So the real question is: why are you even looking at a potential new CPU when your GPU is too slow?

That's the catch-22 for all the people who say they want 4K benchmarks - why do you want 4K benchmarks when your GPU is too slow? Maybe you should wait until you get a better GPU. Or did you really think the framerate was going to go up when your GPU is at 100% load and you slap a new CPU in? How naive.

That's why Steve gave some great advice when he said: 1) look at your GPU utilisation at the resolution you play games at, and 2) decide what framerate you want from your game. If your GPU utilisation is 100%, you need a better GPU - stop looking at CPU reviews, because no CPU is going to give you more frames when your GPU is too slow, mate. People seem to have a misunderstanding of what a CPU does; it doesn't generate frames.

I agree.

But the reviewers tend to hype things up way too much, given that most of the people rushing out to buy a new CPU won't be playing at 1080p and 12,000 FPS. They sell it as THE ONLY ONE FOR YOU... when in fact that's maybe not true. In the small print they mention something about bottlenecks. I mean, the excitement on the internet over the 9800X3D is insane. But for what? For a CPU that you can't actually use until you buy a 6090? Stop and think about that. How utterly useless is that information?

I think they could do a better job, and I really think they need to stop being so dramatic and hyping things up so much. In many cases the 9800X3D may be a total waste of money for people, yet that fact has hardly been mentioned by the media at all, and when it is mentioned, it's only because a few people have made a noise about it.
 
Let's be real here. If the CPU that's being reviewed is not showing a performance gain at 4K, it's because the GPU is too slow at that resolution. So the real question is: why are you even looking at a potential new CPU when your GPU is too slow?

That's the catch-22 for all the people who say they want 4K benchmarks - why do you want 4K benchmarks when your GPU is too slow? Maybe you should wait until you get a better GPU. Or did you really think the framerate was going to go up when your GPU is at 100% load and you slap a new CPU in? How naive.

That's why Steve gave some great advice when he said: 1) look at your GPU utilisation at the resolution you play games at, and 2) decide what framerate you want from your game. If your GPU utilisation is 100%, you need a better GPU - stop looking at CPU reviews, because no CPU is going to give you more frames when your GPU is too slow, mate. People seem to have a misunderstanding of what a CPU does; it doesn't generate frames.
We want the 4K benchmarks so we know if the limit is at the CPU or the GPU for game X with GPU Y. How would you know whether the CPU or the GPU is too slow without the 4K benchmarks? That's why we need the 1080p and 4K benchmarks in the same review for new CPUs and new GPUs.

It's also not correct that no CPU will give more frames because your GPU is too slow. It's far more complicated than that in many games. I have had plenty of games that were GPU limited but still gained an average FPS increase from a new CPU.
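That squares with how averages are computed: a "GPU-limited" run usually still contains CPU-limited moments (busy scenes, traversal hitches), and a faster CPU shortens those worst frames, lifting the average and especially the 1% lows. A toy illustration with invented frame times:

```python
# Toy illustration: a "GPU-limited" run still contains CPU-limited spikes,
# so a faster CPU can improve average FPS and especially the 1% lows.
# All frame times (ms) below are invented.

def stats(frametimes_ms: list[float]) -> tuple[float, float]:
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    one_pct_low_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, one_pct_low_fps

# 95 steady GPU-bound frames plus 5 CPU-bound hitches:
old_cpu = [12.0] * 95 + [40.0] * 5   # hitches take 40 ms on the old CPU
new_cpu = [12.0] * 95 + [16.0] * 5   # the same hitches take 16 ms on a faster CPU

print("old CPU: avg %.1f fps, 1%% low %.1f fps" % stats(old_cpu))
print("new CPU: avg %.1f fps, 1%% low %.1f fps" % stats(new_cpu))
# old CPU: avg 74.6 fps, 1% low 25.0 fps
# new CPU: avg 82.0 fps, 1% low 62.5 fps
```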
 
I agree.

But the reviewers tend to hype things up way too much, given that most of the people rushing out to buy a new CPU won't be playing at 1080p and 12,000 FPS. They sell it as THE ONLY ONE FOR YOU... when in fact that's maybe not true. In the small print they mention something about bottlenecks. I mean, the excitement on the internet over the 9800X3D is insane. But for what? For a CPU that you can't actually use until you buy a 6090? Stop and think about that. How utterly useless is that information?

I think they could do a better job, and I really think they need to stop being so dramatic and hyping things up so much. In many cases the 9800X3D may be a total waste of money for people, yet that fact has hardly been mentioned by the media at all, and when it is mentioned, it's only because a few people have made a noise about it.

I didn't really see that much hype from reviewers; they were simply reporting that the performance uplift in games over the previous gen (7800X3D) and over current-gen Intel was impressive. Which it is.

On the "CPU you can't actually use" comment (which is a total fallacy), I can give you my own personal experience as to why you'd buy a 9800X3D today.

Last summer I was looking to upgrade my aging 8700K. I was looking at CPUs and the consensus was that the 7800X3D was king. But for me, the price to upgrade to AM5, DDR5 memory and a £200+ motherboard was too much to bear. Looking around, the AM4 options were cheap, offered a big uplift from 8th gen Intel and apparently would offer as much benefit at 1440p as the 7800X3D. I decided to go for the 5700X as, again, it was cheaper than the 5800X3D and was nearly as good.

18 months down the line, I'm CPU bottlenecked and looking to upgrade again. Normally I upgrade every 5-ish years. The point is, I wish I'd bought the best CPU I could, rather than just going for the best price/performance at the time. If I'd looked at the 1080p benchmarks when buying the 5700X, I could've deduced its longevity against the X3D chips.

Like people have been saying all along, if you already have a top CPU from the last gen or so, you don't need to upgrade or watch CPU reviews. But buying a 9800X3D and keeping it for 5 years, rather than buying a £200 CPU every year, is absolutely not a waste of money. The fact some people need YouTubers to tell them this is worrying.

We want the 4K benchmarks so we know if the limit is at the CPU or the GPU for game X with GPU Y. How would you know whether the CPU or the GPU is too slow without the 4K benchmarks? That's why we need the 1080p and 4K benchmarks in the same review for new CPUs and new GPUs.

It's also not correct that no CPU will give more frames because your GPU is too slow. It's far more complicated than that in many games. I have had plenty of games that were GPU limited but still gained an average FPS increase from a new CPU.

But I play at 3440x1440, so then I start screaming about the lack of benchmarks for that. Then you need to add regular 1440p. You see where this leads? The point is, the 1080p benchmarks tell you everything: if CPU A is better than CPU B at 1080p, then it will be better to a lesser degree at higher resolutions, oftentimes equal at 4K.

You know if your GPU/CPU is too slow by looking at utilisation during your game of choice. If it never goes below 95%, you're GPU limited. If it frequently drops below that, you're CPU limited. If you're CPU limited, buy the best CPU you can afford or whichever one you feel comfortable buying, at which point you can use 1080p benchmarks to deduce if CPU A is better than CPU B.

The people quibbling over the lack of benchmarks at higher resolutions are often those who don't need to upgrade. If it's not clear whether you'll gain an extra 3 fps at 4K, then you don't need to upgrade, so why look at CPU reviews? People with 13/14900Ks and/or 7800X3Ds, gaming at 4K - what are you realistically expecting to gain from a new CPU?
 
1080p doesn't tell you everything, and there is nothing wrong with picking 3 points: 1080p, 1440p and 4K. From there you can get a good indication of what's going to happen at other resolution points.

"You know if your GPU/CPU is too slow by looking at utilisation during your game of choice"
I found utilisation by itself is not a good indication and doesn't always work. Utilisation doesn't work well as an isolated measurement; you really need the data points from 1080p, 1440p and 4K, across multiple games, to get an accurate picture. Utilisation does matter, just not as an isolated measurement.


"The people quibbling over the lack of benchmarks at higher resolutions are often those who don't need to upgrade."
It's the other way around: often those are the people that do need to upgrade, and I know because I am one of them. Put it this way: how am I meant to know if I am only going to gain an extra 3 fps or 4 fps unless those benchmarks are in the review? I need both the 1080p data and the higher resolution data to make a properly informed choice.
 
1080p doesn't tell you everything, and there is nothing wrong with picking 3 points: 1080p, 1440p and 4K. From there you can get a good indication of what's going to happen at other resolution points.


I found utilisation by itself is not a good indication and doesn't always work. Utilisation doesn't work well as an isolated measurement; you really need the data points from 1080p, 1440p and 4K, across multiple games, to get an accurate picture. Utilisation does matter, just not as an isolated measurement.



It's the other way around: often those are the people that do need to upgrade, and I know because I am one of them. Put it this way: how am I meant to know if I am only going to gain an extra 3 fps or 4 fps unless those benchmarks are in the review? I need both the 1080p data and the higher resolution data to make a properly informed choice.

You can get the exact same indication from the 1080p benchmarks, just to a lesser degree the higher up the resolution scale you go. It's really not that difficult to understand. The advantage of 1080p benchmarks is that you test raw CPU performance, which is the entire, central, focal point of a CPU review.

Just saying utilisation doesn't matter doesn't make it so.

You compare your current CPU (or equivalent) to a contemporary CPU. Is the new one faster in 1080p benchmarks? Are you GPU limited? If yes to the latter, don't upgrade. CPUs do not give you more frames, but they can restrict your GPU. This is why util stats help. Again, really not that difficult to understand.

I'm on a 6700K + GTX 1080 at 1440p 144 Hz and I can't figure out what to buy, because benchmarks at 1080p aren't reality.

Once more, say it with me: 1080p benchmarks test the raw CPU performance. Higher resolutions are influenced by the GPU, which is less informative unless you have the same card as the reviewer. With a GTX 1080, an Arrow Lake or Zen 5 CPU isn't going to do much, if anything, for you. I don't need a 4K benchmark to tell me that. If you can't work out what to buy, then that is on you, not the CPU testers.

Honestly, this is just willful misunderstanding at this point.
 
Once more, say it with me: 1080p benchmarks test the raw CPU performance. Higher resolutions are influenced by the GPU, which is less informative unless you have the same card as the reviewer.
[...]
Honestly, this is just willful misunderstanding at this point.
Yeah, by you - doesn't matter how many times you say it.
 
4K testing can make a 13600K look as good as a 9800X3D; while that might quiet the minds of 13600K owners, it's completely bloody useless to those looking for the best thing money can buy.

You don't see people complaining about high-resolution testing in GPU reviews, because people instinctively understand reviewers are trying to eliminate the CPU as the bottleneck to give a true measure of the performance difference between GPUs. That is what people are looking for, so these arguments about low-resolution CPU testing are just weird.
 
They almost seem to be going out of their way to make dramatic news where there isn't any. They need more realistic tests, or to stop being so dramatic.
I'd say it is just a consequence of the clicks thing. When we had those old magazines it was kind of like a subscription - if you were an enthusiast you'd buy it every week - but these YouTube reviews are all competing for attention, and "9800X3D is kind of good, but maybe not for you" is not going to get many clicks. Whereas "AMD DESTROYS INTEL 1!!!111!Eleven!" or "BEST GAMING CPU EVER!!!" gets attention.

I know reviewers often get annoyed when people just click through to the graphs or the conclusion and ignore the rest of the review, but that's the reality: a lot of buying decisions are made without the full context.

From a certain perspective, I'm not against it, because the 9800X3D is a really good CPU for a gamer, so is driving sales toward that CPU so bad? Not really. I mean, Intel admitted themselves that they screwed up gaming on Arrow Lake at launch. But does the hype drive unnecessary sales? Yeah, it does - I can't believe the demand for these CPUs is all justified by need, though I know there is pent-up demand due to the 9000 series and Arrow Lake being disappointing.
 