
Intel Core i9-11900KF 3.5GHz (Rocket Lake) Socket LGA1200 Processor - Retail

Status
Not open for further replies.
If I cared that much about cost/performance I would get another Xbox Series X. Cost-effectiveness isn't the aim here.

Fair enough; as I said, it's your money to burn. Also, an Xbox isn't the same as a PC: I can't edit photos and video on the Xbox, to name one thing.

The attitude you are displaying here isn't going to win you anybody's agreement, btw, since the vast majority will go on evidence-based decisions rather than emotional ones. So, as you said, wait two weeks and you might find some agreement, but leaked CPU-Z screenshots aren't helping your argument.
 
In what way do you think a CPU with ~20% higher IPC and over 200% of the PCIe bandwidth is going to be inferior for gaming?
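The arithmetic behind that claim can be sketched out. As a rough back-of-envelope model (all figures below are illustrative assumptions, not benchmark results): single-thread throughput scales roughly with IPC × clock, and PCIe 4.0 doubles per-lane bandwidth over PCIe 3.0.

```python
# Back-of-envelope: single-thread performance ~ IPC x clock.
# All figures below are illustrative assumptions, not benchmarks.

def relative_single_thread(ipc_gain: float, clock_ratio: float) -> float:
    """Relative single-thread throughput vs. a baseline of 1.0."""
    return (1.0 + ipc_gain) * clock_ratio

# ~20% IPC uplift at the same clock -> ~1.2x single-thread.
uplift = relative_single_thread(ipc_gain=0.20, clock_ratio=1.0)

# PCIe 4.0 doubles per-lane bandwidth vs. PCIe 3.0
# (~1.97 GB/s vs. ~0.985 GB/s per lane), so "over 200%" means ~2x.
pcie3_x16 = 0.985 * 16   # GB/s, approximate x16 link
pcie4_x16 = 1.97 * 16    # GB/s, approximate x16 link

print(round(uplift, 2))                  # 1.2
print(round(pcie4_x16 / pcie3_x16, 1))   # 2.0
```

This is an optimistic upper bound: games also depend on memory latency and cache behaviour, which is exactly where the replies below push back.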

Because they've ruined the inter-core latency bodging the architecture onto 14nm. Have you looked at the AnandTech numbers, or just the Intel marketing slides you've been given? Plus they had to lop off two cores to keep the thing viable.

What resolution do you game at out of interest?
 

And they have upped the cache to compensate. The difference is peanuts, and it's still subject to change via microcode updates, as the article mentions:

https://images.anandtech.com/doci/16535/New Structural Latency_575px.png

And I run 1080p to 4K depending on the frame rate target vs. detail trade-off for the game in question. I have a 240Hz monitor and a 144Hz monitor atm, and a Neo QLED on order for large screen / mostly Xbox / proper HDR1000+.
 

That's core-to-cache latency, not core-to-core:

"The core-to-core numbers are interesting, being worse (higher) than the previous generation across the board. Here we are seeing, mostly, 28-30 nanoseconds, compared to 18-24 nanoseconds with the 10700K. This is part of the L3 latency regression, as shown in our next tests.

One pair of threads here are very fast to access all cores, some 5 ns faster than any others, which again makes the layout more puzzling.

Update 1: With microcode 0x34, we saw no update to the core-to-core latencies."
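Those core-to-core numbers come from a ping-pong microbenchmark: two threads bounce a flag back and forth, and the round-trip time is halved to get the one-way hop. A minimal sketch of the idea in Python (illustrative only: Python's GIL and `Event` overhead swamp the real cache-coherency latency, so it measures scheduling cost in microseconds rather than the tens of nanoseconds real tools see; serious benchmarks pin native threads to specific cores and spin on shared atomics):

```python
import threading
import time

def one_way_hop_ns(iterations: int = 10_000) -> float:
    """Ping-pong between two threads; return mean one-way hop time in ns.

    Illustrative sketch only: Python overhead dominates, so this shows
    the measurement structure, not real inter-core latency.
    """
    ping, pong = threading.Event(), threading.Event()

    def responder() -> None:
        for _ in range(iterations):
            ping.wait()    # wait for the "ball"
            ping.clear()
            pong.set()     # send it back

    t = threading.Thread(target=responder)
    t.start()
    start = time.perf_counter()
    for _ in range(iterations):
        ping.set()
        pong.wait()
        pong.clear()
    elapsed = time.perf_counter() - start
    t.join()
    return elapsed / (2 * iterations) * 1e9

print(f"{one_way_hop_ns(1000):.0f} ns/hop")
```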
 
Fair enough; as I said, it's your money to burn. Also, an Xbox isn't the same as a PC: I can't edit photos and video on the Xbox, to name one thing.

The attitude you are displaying here isn't going to win you anybody's agreement, btw, since the vast majority will go on evidence-based decisions rather than emotional ones. So, as you said, wait two weeks and you might find some agreement, but leaked CPU-Z screenshots aren't helping your argument.

I don't do that on my PC either. And the evidence we have right now is that Rocket Lake has significantly faster single threaded performance than anything else.
 
Ordered mine, as it's likely the fastest gaming CPU money can buy, at least until AMD release something new. Surprised it's not mentioned here that Rocket Lake is up for pre-order on the OC site. Get 'em while you can!

Oh dear, dude, virtually every review site has shaken its head at the samples they've had. Watch Leo on KitGuru!
He shakes his head at the 900, is meh on the 700, but nods for the lower SKUs.
It won't be the fastest gaming CPU; no one seems to be even contemplating that, unfortunately.
Not even Intel.
They're talking up how great the iGPU within is, which, if you game, will be the very first thing you disable.
Soz :/
 

I'm buying the version without any iGPU. And based on the per-thread performance benchmarks we have seen so far, Rocket Lake should be the fastest for more games than any other option right now:

https://cdn.wccftech.com/wp-content...gship-Desktop-CPU-Passmark-Performance-_1.png
 

Passmark uses AVX512 to massage Intel's numbers. Extrapolating that to gaming is nonsense and you know it Dave. We've done all this in your old threads.

What games use AVX-512?
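Side note: whether the chip you're running on even advertises AVX-512 can be checked from the kernel's CPU flag list. A Linux-only sketch (it parses `/proc/cpuinfo`; `avx512f` is the foundation subset that every other AVX-512 extension builds on):

```python
from typing import Optional

def cpu_has_avx512(cpuinfo_path: str = "/proc/cpuinfo") -> Optional[bool]:
    """Return True/False if the flag list is readable, None otherwise.

    Linux-only sketch: reads the 'flags' line of /proc/cpuinfo and looks
    for 'avx512f' (the AVX-512 Foundation subset).
    """
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "avx512f" in line.split()
    except OSError:
        pass
    return None  # not Linux, or no flags line found

print(cpu_has_avx512())
```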
 
Maybe he goes by the token on usershillbench, which already has listings for all the new Intel chips, right at the top, even with 0 reviews and 0 ratings, yet they come out at 110% average bench.
Going by that standard, how can your Ryan Shrout shill be wrong?
 
Out of interest, are those Passmark tables based on base clock as shown, or something else? So if, say, the 11900K OCs to 5.3GHz easily, will it be way ahead of the rest?

You might have to wait a bit for his/her answer to that question. Ryan will be sound asleep at the moment, so won't be able to send a corporate response.
 
ROG Swift, or ROG Strix?

I doubt it's base clock. It would be at default settings, so the single-core max turbo frequency (which many overclockers will apply to all cores).
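If the table scores are taken at the stock single-core turbo, a first-order estimate of an overclock's effect is linear clock scaling. This is optimistic (memory-bound work won't scale), and the numbers below are hypothetical placeholders, not real Passmark scores:

```python
def scaled_score(stock_score: float, stock_clock_ghz: float,
                 oc_clock_ghz: float) -> float:
    """First-order estimate: score scales linearly with core clock.

    Optimistic upper bound -- memory-latency-bound work won't scale.
    """
    return stock_score * (oc_clock_ghz / stock_clock_ghz)

# Hypothetical numbers only: a single-thread score of 3500 at a
# 5.3 GHz stock turbo, pushed to a 5.5 GHz all-core overclock.
print(round(scaled_score(3500, 5.3, 5.5)))  # 3632
```

So an easy overclock buys a few percent at best; it wouldn't move a chip "way ahead" of the rest of the table on its own.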

ROG Swift, or ROG Strix?

Strix. Corrected above.

What happened to your 6800 GPU?

Never wanted one of those. No DLSS and sucky ray tracing. Waiting to see the 3080 Ti before I decide what's next there.
 