Intel Core Ultra 9 285K 'Arrow Lake' Discussion/News ("15th gen") on LGA-1851

I think that the reviewers are telling us which is best rather than presenting the information and allowing us to decide. FPS at 1080p isn't the only metric I use to decide, so I have to try to ignore what they are saying and consider all the options myself.
The information is right there for you to decide - do you really think it is useful to have a supposed CPU comparison where all the CPUs get the same FPS because everything is GPU-limited? Most people who don't need to upgrade their CPU won't upgrade it just because HUB says it is 10% faster at 1080p. But this is an enthusiast forum and we are fans of blowing money on frivolous upgrades :p

I play at 2160p but I still want to see CPU bench results from reviewers where the GPU is not the limiter.
 
As per other threads where people have posted on this, it is more the lack of other information - reviewers will happily run 45 tests at 1080p but rarely touch other settings and other factors, especially if you aren't using a 4090 at 1080p. An interesting one, at least for me personally, is that with a 4080-class GPU the CPU impact on FPS is currently far less significant, but the impact on game load times between different CPUs can be stark and noticeable - and it doesn't always correlate with 1080p CPU performance.
 
so 5% before the updates
Yes, their claim was same performance at 70? watts less in gaming.

If they hit that mark, they just need a $60-70 cut. But considering they haven't made many chips for the DIY market, I don't think they will do that.
 
As per other threads where people have posted on this, it is more the lack of other information - reviewers will happily run 45 tests at 1080p but rarely touch other settings and other factors, especially if you aren't using a 4090 at 1080p. An interesting one, at least for me personally, is that with a 4080-class GPU the CPU impact on FPS is currently far less significant, but the impact on game load times between different CPUs can be stark and noticeable - and it doesn't always correlate with 1080p CPU performance.
I don't really find load times an issue in any games I have played in years, outside maybe BG3, so it would be interesting to note whether more cores make a dent in that particular game. As for shader compilation, it is probably twice as fast on a 7950X3D vs a 7800X3D, which is certainly noticeable, but usually a one-time thing or only after a driver update. I can see the benefit there but I'm not sure it would be enough to influence a purchasing decision. I would certainly prefer more FPS in game than quicker load times - in most cases. I think I would also prefer an Intel chip with 10 or 12 P-cores, or an AMD X3D with 10 or 12 cores on a single CCD, versus having E-cores or a 2nd CCD.
 
but usually a one-time thing or only after a driver update.

Unfortunately, in some games like Hogwarts Legacy it isn't a one-time thing :( I'm not sure if that is something the devs have been too lazy to fix or whether there are genuine reasons for it. Contrary to what some people have suggested, it isn't just a loading screen that was lazily labelled 'shader compiling': if you run the game on a system without enough memory to compile the shaders, it loads almost immediately, but you then get quite a bit of stutter when playing.
 
I have Hogwarts installed and it takes about 47 seconds from clicking Play on Steam to loading the save game and being in control of the character, including about 18 seconds of 'Preparing Shaders' - that does not seem that bad?
 
I don't know why, but a lot of people get the full 1-2 minutes of preparing shaders on every single startup of the game, and it has never been fixed. It was only an example though (it can vary quite a bit on different hardware with that game).
 
CP2077 on ARL fixed?

 
I think that the reviewers are telling us which is best rather than presenting the information and allowing us to decide. FPS at 1080p isn't the only metric I use to decide, so I have to try to ignore what they are saying and consider all the options myself.
If you're so set in a position that a corporation in turmoil, selling overpriced, literally defective, slower CPUs that require more hardware upkeep, is still keeping your 'loyalty', then what can even be said at that point? It's not about mindshare battles.

If I really wanted to be a jerk, I could imply that, considering the power costs of Intel stuff the last few years, it's actually immoral to buy lesser hardware for "fun" when it leaves a worse footprint. But everyone cares about the environment unless they have to even slightly, accidentally, irrelevantly change a habit. That's not directed at you guys, tbh - more like millennials and Gen Z, and it's a bit off-topic anyway.
 
It is actually quite interesting if you measure power use over an entire day, at the wall, for different usage patterns - I think a lot of posters here would be surprised. Amongst other factors, unless you do all the calculations for power efficiency and buy the right PSU, many setups are actually not that different in what they draw at the wall, especially when measured over a longer time period, despite the impression people often get from the reviews.

EDIT: Obviously there are exceptions, like certain CPUs/platforms designed for low power use, such as some of the AMD 8000 series, Intel T-series, etc.
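
To illustrate the sort of whole-day, at-the-wall arithmetic described above, here is a minimal Python sketch. All of the wattages and hours are made-up placeholders purely for illustration, not measurements of any particular CPU or build.

# Rough back-of-envelope estimate of daily at-the-wall energy use.
# All wattages and hours below are illustrative placeholders, not measurements.

def daily_kwh(idle_w, gaming_w, idle_hours, gaming_hours):
    """Estimated energy drawn at the wall over one day, in kWh."""
    return (idle_w * idle_hours + gaming_w * gaming_hours) / 1000

# Hypothetical setup A: low idle draw, high gaming draw
setup_a = daily_kwh(idle_w=50, gaming_w=450, idle_hours=10, gaming_hours=2)

# Hypothetical setup B: higher idle draw, lower gaming draw
setup_b = daily_kwh(idle_w=80, gaming_w=350, idle_hours=10, gaming_hours=2)

print(f"Setup A: {setup_a:.2f} kWh/day, Setup B: {setup_b:.2f} kWh/day")
# With these made-up numbers the two builds end up within about 0.1 kWh/day of
# each other, despite very different headline gaming power figures.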
 
Idle power draw etc. matters to a lot of people - staying cool is top of my list

Me too. Most of the time my PC is on it's not doing anything, so idle power consumption is important. The 9800X3D consumes three times the power of the Ultra 7 and runs a lot hotter when idle. As for its efficiency under load, I don't really have any comparisons - all the reviewers' measurements are at 1080p with unrestricted FPS, so it's difficult to make any comparison at all.
 
If you're so set in a position that a corporation in turmoil, selling overpriced, literally defective, slower CPUs that require more hardware upkeep, is still keeping your 'loyalty', then what can even be said at that point? It's not about mindshare battles.

If I really wanted to be a jerk, I could imply that, considering the power costs of Intel stuff the last few years, it's actually immoral to buy lesser hardware for "fun" when it leaves a worse footprint. But everyone cares about the environment unless they have to even slightly, accidentally, irrelevantly change a habit. That's not directed at you guys, tbh - more like millennials and Gen Z, and it's a bit off-topic anyway.

I'm not set at all. I think that people are exaggerating Intel's demise and burying it before it's dead. True, they might well have messed up with the Core Ultra 200, but I am just waiting to see. They certainly released it too early, but maybe it can be sorted. Let's see.

Had this race been between the 14900K and the 9800X3D, I would have bought the 9800X3D - I can't see a single advantage to the 14900K. The Core Ultra 200, though, is very interesting, but only if it can match the gaming performance of the 14900K. If it can't do that then I will not buy it.

My choices have nothing to do with the environment.

As I have said before, my problem is that the 9800X3D is not being presented in a way that I find attractive. The reviewers are presenting information that is useless to me.
 
It should've been obvious that there's no saving ARL for gaming, just from all the OC and tuned results. If it didn't perform well when OC'ed to the gills and with super-fine-tuned memory, what could a microcode update really do? Nothing - just PR selling hope.

Really, the worst part is also having to deal with outlier results like Cyberpunk, for which you can only pray for a fix, or worse the anti-cheat stuff, where now you can't even play the game at all. A total joke.

If it was really cheap then perhaps you could contort a rationalisation for buying it, but the reality is it just makes no sense (not that it really matters, OEMs will sell it by the ton).
 
Is it true Fortnite doesn't run on these CPUs? I must be reading this stuff wrong. There's no way you'd release a CPU that doesn't work with the biggest game out there? Not that I've ever seen Fortnite before.
 
Just watched a clip of someone playing it, so I think that is a myth.

A little like the thread after thread of Delta Force not working with X CPU + GPU, when it runs fine on my lower-spec machine.
 
Oh, well, having said that, if this is true, then it's a 9800X3D for me...


Had my doubts since I first saw the details of the architecture changes, and personally I don't think they can properly fix it. Intel can't even be bothered to support and update APO properly for 14th gen and older chips, where it could quite possibly bring some decent gains in a wide range of software - instead APO is just languishing, barely if ever updated or improved.

EDIT: Personally I still lean towards the 14700K - unless you are absolutely focussed on pure FPS numbers at 1080p with a 4090, it provides better performance in every other respect (or is very close under realistic situations) at a semi-reasonable price point, unless the power/thermals are really that big a deal to you. My setup sips power when idle or at low utilisation and only really puts up disgusting numbers under Cinebench - for gaming it isn't really that terrible mostly.
 
Oh, well, having said that, if this is true, then it's a 9800X3D for me...

I mean, 3% more performance for 10% less power is a pretty nice jump in perf/watt. But I'm not really sure what people were expecting; ARL performed as advertised in most productivity apps and 70% of games. Yes, there will be some outliers and some issues to iron out, but there was never going to be a miracle patch. If they manage to get it to 7800X3D levels, it just needs a price reduction to be an attractive option.

However, as many industry outlets have pointed out, Intel is making very few chips for the DIY market, so I wouldn't hold my breath for a price cut.
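
For what it's worth, the perf/watt arithmetic in the first sentence above works out roughly like this (a quick Python sketch using the 3% / 10% round figures quoted in the post, not measured data):

# Perf/watt change implied by the rough figures above: +3% performance, -10% power.
# These are the round numbers from the post, not measurements.
perf_gain = 1.03     # relative performance after the update
power_ratio = 0.90   # relative power draw after the update

perf_per_watt_gain = perf_gain / power_ratio - 1
print(f"perf/watt improvement: {perf_per_watt_gain * 100:.1f}%")  # roughly 14%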
 