Intel/DDR5 option:
My basket at OcUK:
Total: £1,520.82 (includes delivery: £11.98)
CPU and motherboard don't support overclocking.
Most likely the 14900K will be the highest upgrade available for this socket.
For CPU upgrades I think it would be best to go AM5.
GPU upgrade: I'd get a higher-wattage PSU, maybe 850W, which should cover e.g. a 7900 XT or 4080 class card in the future (rough sizing sketch below). An ATX 3.0/PCIe 5.0 unit is advisable if you're likely to go Nvidia. The case is restricted to 355mm GPU length.
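For anyone wanting to sanity-check the 850W figure, here's a rough back-of-envelope sketch. The component draws are my own ballpark assumptions, not measurements from these exact parts:

```python
# Rough PSU sizing sketch -- all draws are assumed ballpark
# figures, not measurements from this exact build.
gpu_peak_w = 320        # 7900 XT / 4080 class card, transients included
rest_of_system_w = 230  # CPU + board + storage + fans, rough guess

total_load_w = gpu_peak_w + rest_of_system_w  # ~550 W

# Keeping sustained load around 60-65% of PSU capacity leaves
# headroom for transient spikes and keeps the unit quiet/efficient.
suggested_psu_w = total_load_w / 0.65
print(f"Estimated load ~{total_load_w} W, suggested PSU ~{suggested_psu_w:.0f} W")
# -> ~846 W, i.e. the 850 W suggested above
```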
The 4070 can do 1440p, but 4K is pushing it in the latest games without upscaling.
I chose a compact 4070 to fit the case, and because the weight and risk of damage when carrying should be lower than with a large three-fan card. I chose a glass-less case for the same reason.
Micro-ATX alternative with overclockable CPU/motherboard:
My basket at OcUK:
Total: £1,530.82 (includes delivery: £7.99)
Just my 2 pence regarding the 4070 and its capability in real-world use versus biased YouTubers/journos/people clearly receiving backhanders in their reviews... I'll just quote what I said in another post, as it sums up my overall experience. And don't get me wrong, I'm an AMD/ATI guy at heart, and if this card hadn't done what I wanted I'd have sent it packing, as I'm VERY picky/moany if something doesn't work haha!
Anyway, here's what I said before regarding it...
"I have run everything you can think of at ultra settings 4k native with RT on (or 4k high on things like tlou which will still do low to mid 60s with the latest patch) on my 4070 and my max usage for vram is 9.3-9.7gb... In all the latest titles... I only play at 1440p so I can't see 12gb being an issue, especially not with dlss3.0 for the next few years.
I've also messed around with UE5 stuff, and again the reality seems different from what the overhyped, clearly sponsored journos/YouTube shills describe.
When you do a side-by-side with someone on a higher-VRAM card, the game simply allocates more VRAM because it's there; yet their utilisation seems to be higher than mine even at the same res/settings?
So either it's developer favouritism towards Nvidia in terms of optimisation/utilisation/allocation, or the AMD cards just seem to want to use a bit more in their approach to raster. No bias, as I'm actually an AMD/ATI guy by preference: I have an AMD CPU in both builds and an AMD card in my 2nd rig...
As with cars, I often have multiple brands/models at once, as they're good in their own individual ways. In this case I just fancied trying something different: the idea of the 200W TBP of the 4070, sitting at 53-58°C without the fans even spinning, versus the 350-450W and 80-110°C of a 6800-6950 XT burning my leg, was a no-brainer at the same price...
Plus DLSS 3.0 gives me a very nice bit of future-proofing headroom for when it starts to struggle to hold a capped 60fps in single-player at high/ultra... Being able to turn RT on without crippling the system was also a nice bonus.
The win factor that sold it for me was seeing I could run 1440p-4K native at ultra settings at around 130W when undervolted, without the fan even coming on, at 53-58°C (the fan comes on at 65°C stock on my Asus 4070 anyway). So when I'm playing single-player games at 1440p native, ultra settings, capped to 60fps (I don't see the point of going higher than 60fps unless it's MP), the 4070 sips power at around 105-120W!
My system is silent and costs me £18-22 a month in electricity, including the amp/speakers/monitor, playing every day for 6 hours of an evening. It literally pulls 260-280W all-in at the wall, versus 350-450W for just the GPU of a high-tier 6xxx/7xxx card at the same settings/res...
Pair one of those 6xxx/7900 XT(X) GPUs with the rest of the components/monitor/speakers/amp and you easily end up at 500-600W, costing double a month in electricity! F that!
Over a year that's a lot of money wasted when you're in an area with disgusting electricity pricing and want to enjoy limitless play time on the new system you've just built!
People may say it's very expensive for what it is, but in that first year I've already got half my money back in electricity saved versus buying the rival card that can't do RT/DLSS 3.0/frame generation. By the second year the savings cover the other half, by the third year it's effectively free... and so on. When DLSS can't carry it any more, it'll just go in my 2nd rig as an upgrade, no biggie!
And I'll stress this point again: I am an AMD/ATI guy, with AMD CPUs in both rigs and an AMD GPU in the 2nd, but I can't argue with that feature set/performance/power consumption. I'd have bought this regardless of who made it (just like with cars). IMHO it's as impressive as the RX 6600 XT was for its time, punching way above its weight class for its power consumption! There literally ISN'T another card that can sip 105-135W at ultra 1440p-4K native and produce the fps this can, let alone with frame generation/RT/DLSS on and still stay at that wattage!
A LOT of people don't realise what their system is costing them each month, let alone each year, yet complain about overpriced hardware. Kind of ironic, no?
Oh, and it goes without saying that if I hadn't been happy I'd have sent it straight back, what with everything having the no-questions-asked 14-day return/refund policy!"
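To put some rough numbers behind the electricity claims in the quote above (a sketch only: the ~34p/kWh unit rate, 6 hours a day and 30 days a month are my assumptions; the wall draws are the figures quoted):

```python
# Rough running-cost sketch -- the unit rate is an assumed
# ~£0.34/kWh; the wall draws are the figures from the quote.
HOURS_PER_DAY = 6
DAYS_PER_MONTH = 30
PRICE_PER_KWH = 0.34  # assumed unit rate -- check your own tariff

def monthly_cost(watts: float) -> float:
    """Monthly electricity cost for a sustained wall draw in watts."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
    return kwh * PRICE_PER_KWH

print(f"4070 system @ 270 W wall draw: £{monthly_cost(270):.2f}/month")
print(f"Rival setup @ 550 W wall draw: £{monthly_cost(550):.2f}/month")
print(f"Yearly saving: £{(monthly_cost(550) - monthly_cost(270)) * 12:.2f}")
# -> roughly £16.52, £33.66 and £205.63 with these assumptions
```

That lands close to the quoted £18-22 a month and roughly "double" for the rival setup; the exact saving obviously depends on your tariff.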
So yeah, definitely recommended if you want the nice bit of extra headroom from DLSS 3.0/3.5, frame generation, super-low-latency mode and ray tracing, plus the ability to run 4K from a 1440p DLSS internal render that you can't tell apart in real life, even on a 65" TV. Trust me, I've asked many mates what resolution they think it's running natively and tricked them all. I personally thought DLSS was rubbish after using AMD FSR on my 2nd rig and was very sceptical about this, but I thought screw it, I'll try it and see... and yeah, DLSS 3.0 is VERY impressive. Having gone from a 65" 4K TV to a 32" 1440p monitor, it's VERY hard to tell!
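On the "4K from a 1440p internal render" point: that lines up with DLSS Quality mode's standard scale factor of roughly 2/3 per axis (the factor here is the published one for Quality mode, not something measured on my rig):

```python
# DLSS Quality mode renders at roughly 2/3 of the output
# resolution per axis, so "4K with DLSS Quality" is internally
# a 1440p render that gets upscaled to 2160p.
output_res = (3840, 2160)  # 4K output
quality_scale = 2 / 3      # standard DLSS Quality scale factor
internal_res = tuple(round(d * quality_scale) for d in output_res)
print(internal_res)  # (2560, 1440) -> the 1440p internal render
```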