Intel Core Ultra 9 285k 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

I still dunno what the point of 1080p ultra benchmarks is - either test at 1080p low to bench the CPU, or at higher res ultra to benchmark the GPU and/or show how much the CPU matters at the kind of resolutions and settings people will actually be playing at... I'd imagine most people aren't buying a 285K to pair with an RTX 4060...

1080p is still the primary resolution at 57% according to Steam, but that's not really the point. When you're testing how good the CPU is at driving the GPU in games, you want to set things up so that the CPU, not the GPU, becomes the bottleneck - otherwise it's not a review of the CPU's gaming performance, it's a GPU review.
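
To put that in concrete terms, here's a toy frame-time model - the numbers are completely made up, it's just to show which side you're actually measuring:

```python
# A toy frame-time model (all numbers invented): each frame waits for
# whichever of the CPU or GPU is slower to finish its share of the work.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0                                     # hypothetical CPU cost per frame
gpu_ms = {"4K": 12.0, "1440p": 7.0, "1080p": 4.5, "720p": 2.5}  # shrinks as resolution drops

for res, g in gpu_ms.items():
    bound = "GPU" if g > cpu_ms else "CPU"
    print(f"{res}: {fps(cpu_ms, g):.0f} fps ({bound}-bound)")
# 4K and 1440p results only tell you about the GPU; 1080p and below finally measure the CPU.
```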

I have had my 5800X for 4 years, and the GPU I have it paired with now is faster than the fastest GPU available when that CPU was reviewed - WAY faster. Anandtech actually went the extra mile and tested at 720p or even lower, which showed the 5800X to be 10 to 20% faster than the 10900K I was also considering at the time, the two being similarly priced.
Other reviewers tested at 1080p or higher, and while they also showed the 5800X to be the best gaming CPU, it was by a much smaller margin.
Today, when you see these CPUs on the same slide with modern GPUs, you see that >20% difference between the 5800X and 10900K on all of them.
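
Rough illustration of why that gap comes back once the GPU gets faster (again invented numbers, not real 5800X/10900K figures):

```python
# Same toy model as above, now with two hypothetical CPUs ~20% apart in
# per-frame cost (made-up numbers, purely illustrative).
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"CPU A": 5.0, "CPU B": 6.0}              # CPU A is ~20% faster per frame
gpus = {"GPU at launch": 10.0, "modern GPU": 4.0}

for gname, g in gpus.items():
    a, b = fps(cpus["CPU A"], g), fps(cpus["CPU B"], g)
    print(f"{gname}: CPU A {a:.0f} fps vs CPU B {b:.0f} fps ({a / b - 1:+.0%})")
# GPU at launch: both ~100 fps (+0%) - the GPU hides the CPU gap at high settings.
# modern GPU: 200 fps vs 167 fps (+20%) - the gap you only saw at low res reappears.
```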

I knew that when I bought it, because I read the Anandtech review, and as a result my modern GPU has its best chance with this now 4-5 year old CPU.

I'm a huge advocate of low-resolution testing, because if you're not doing that you're not testing the CPU properly.

RIP Anandtech.
 
Precisely. I still remember when AMD fans were ****** off when Intel had a significant advantage against Bulldozer, saying nobody gamed at 720p. It's a great indicator of CPU longevity, although now even an Intel/Ryzen 3 gives you 100+ fps in most games even at lower res. (Wallet breathes sighs of relief)
 
You test ultra because higher settings put a higher load on the CPU, so if you want to know which CPU could give you the FPS you want at the settings you'd actually play at (and in single-player that's by definition max settings, or near it), then you have to max out those settings (including RT). You only test low settings for competitive titles, because there you want the highest fps and nothing else (with exceptions depending on whether the visual changes give an advantage or not).

That's why, if you want to examine CPU performance, the #1 thing you change is resolution - and if you see people testing at 1080p low in such titles, that's a dead giveaway that they're clueless when it comes to testing.

That only makes sense if you selectively set things like crowd density to ultra; a lot of visual settings on ultra will massively shift load to the GPU even at lower resolutions.

EDIT: Though I agree with the overall point - personally I only bother with the results for the settings I'm going to play at, and lower resolution with those settings can give an indication of future GPU performance, but ultimately that is why you need tests at more than just one resolution/settings profile.
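
To make that concrete, a rough (entirely made-up) split of where different 'ultra' options tend to land:

```python
# Entirely illustrative numbers: some "ultra" options mostly add CPU work
# (simulation, draw calls), others mostly add GPU work (shading, RT).
def frame_ms(base_cpu, base_gpu, options):
    cpu = base_cpu + sum(cost_cpu for _, cost_cpu, _ in options)
    gpu = base_gpu + sum(cost_gpu for _, _, cost_gpu in options)
    return cpu, gpu

ultra_options = [
    # (name, added CPU ms, added GPU ms) - hypothetical values
    ("crowd density / draw distance", 3.0, 0.5),
    ("ray tracing",                   1.0, 6.0),
    ("shadows / textures",            0.2, 3.0),
]

cpu, gpu = frame_ms(base_cpu=4.0, base_gpu=3.0, options=ultra_options)  # base_gpu ~ 1080p
print(f"CPU {cpu:.1f} ms vs GPU {gpu:.1f} ms -> {'CPU' if cpu > gpu else 'GPU'}-limited")
# CPU 8.2 ms vs GPU 12.5 ms -> GPU-limited: cranking the GPU-heavy options can
# keep you GPU-bound even at 1080p, unless you pick the CPU-heavy ones selectively.
```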
 
People want to know the performance at the settings they actually use, though.
It's kinda redundant to be testing games at low settings.

With a 4090 at 1080p max settings, the bottleneck probably isn't going to be the 4090?
 
I'm running my RAM at 7200 now; latency is somewhat better than before.

[screenshot of memory latency results]
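
For anyone wanting to sanity-check what 7200 buys you on paper, some napkin maths - the CL values below are just example timings, not necessarily what's in the screenshot:

```python
# Back-of-the-envelope DDR5 numbers. The kits below are assumed examples,
# not the actual timings from the post.
def cas_ns(mt_s, cl):
    # CAS latency is counted in memory-clock cycles; memory clock = MT/s / 2.
    return cl * 2000.0 / mt_s

def peak_bandwidth_gbs(mt_s, channels=2, bytes_per_transfer=8):
    return mt_s * bytes_per_transfer * channels / 1000.0

for mt_s, cl in [(6000, 30), (7200, 34)]:        # hypothetical before/after kits
    print(f"DDR5-{mt_s} CL{cl}: ~{cas_ns(mt_s, cl):.1f} ns CAS, "
          f"~{peak_bandwidth_gbs(mt_s):.0f} GB/s peak (dual channel)")
# DDR5-6000 CL30: ~10.0 ns CAS, ~96 GB/s peak (dual channel)
# DDR5-7200 CL34: ~9.4 ns CAS, ~115 GB/s peak (dual channel)
```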
 
@Robert896r1 is any of your tuning dependent on silicon quality and the silicon lottery?

I mean the silicon lottery on the Intel side, ignoring any well-binned memory chips. I guess without a wide sample it's a hard question to answer what role, if any, the CPU's memory controller etc. plays.
 
People want to know the performance at the settings they actually use, though.
It's kinda redundant to be testing games at low settings.

With a 4090 at 1080p max settings, the bottleneck probably isn't going to be the 4090?

They do that when they review GPUs, paired with the fastest CPU.
 