As the title says: I keep seeing that the GPU is more important because games are more GPU-bound at 4K, so when people want better performance at 4K they're told to upgrade the GPU.
Some people are still running a 2500K and are being advised to keep it overclocked and just get a better GPU. But at what point does the CPU actually need upgrading to get better frame rates in 4K gaming?
In a funny way, 4K could be a good resolution to target if you're on a lowish budget and already have a five-year-old system, since you'd only need to upgrade the GPU. I know GPUs are expensive, but you could be quite happy on a Vega 56 or an Nvidia 1080.
I don't have 4K yet myself, so this is just coming from what I've read on here and lots of YouTube reviews.
As always, it's multi-faceted, and in truth it's going to be a combination of CPU, GPU, and software. In short, though: if you don't have at least a 2080 Ti, don't worry about it. And remember, GPU demand increases with each new title. You're right on the budget thing, though; I play at 4K with a V64, do fine with it, and won't need to upgrade my platform (i7-6800K) for a long time to come.
Is it accurate that you can game with little to no AA, since 4K doesn't show jaggies the way lower resolutions do? If that's true, that must save quite a few FPS?
No.
Oh really? I plan on using a Sony XF90 55" TV, so does that mean jaggies galore?
I use a 55" XF90. Jaggies vary on a per-game basis. You'll definitely want AA in general, but luckily most AA nowadays is cheap in terms of performance, and even the post-processing methods (SMAA and FXAA) do a good enough job for the most part. Some games are exceptions and are horrible in terms of jaggies, and there's not much you can do about it short of insane hardware to brute-force it, e.g. GTA V. In some of those cases the TV's Reality Creation feature can help: in GTA V, for example, 1440p + Reality Creation approximates 4K closely enough while also helping with the jaggies. Sort of like a pseudo-DLSS. It works better in some games than others, and it's free, so that's good.
Forgot to say, distance from the TV also matters, as you can imagine. I sit about 1.5m away from mine, so I notice aliasing a lot more. As you get closer to 1.3-1.4m you can even make out individual pixels. From about 2m you could run 1800p, or even lower, and not be able to tell the difference. At such distances it's even worth considering 1440p 120Hz instead, or 1080p 120Hz if you want HDR.
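If anyone wants to sanity-check their own setup, here's a rough sketch of the maths behind that distance point: it just computes horizontal pixels per degree of visual angle for a flat panel. The 55", 3840-pixel, 1.3/1.5/2.0m figures are taken from the posts above; the function name and structure are just mine for illustration, and where the "can't see pixels any more" threshold sits is a matter of debate rather than anything this snippet proves.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of visual angle for a flat panel."""
    aw, ah = aspect
    # Physical screen width in metres, derived from the diagonal and aspect ratio.
    width_m = diagonal_in * 0.0254 * aw / math.hypot(aw, ah)
    pixel_pitch_m = width_m / horizontal_px
    # Visual angle subtended by one pixel at the given viewing distance.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch_m / (2 * distance_m)))
    return 1 / deg_per_pixel

# 55" 4K panel (3840 px wide) at the distances mentioned above.
for d in (1.3, 1.5, 2.0):
    print(f"{d} m: {pixels_per_degree(55, 3840, d):.0f} px/deg")
```

Higher pixels-per-degree means individual pixels (and jaggies) are harder to resolve, which is why the same panel looks much cleaner from 2m than from 1.3m.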
I find this to be very game-dependent. In some games I can turn AA down to low or even off and not notice any jaggies, but in others I have to leave AA on high.
It really depends on how the game's assets and textures are designed.
In the last PC game I played (AC Odyssey), I couldn't see any difference between High AA and Low AA, so I set it to Low.
Low AA is a lower-resolution temporal AA solution in Odyssey. To my eyes there's a very noticeable difference in clarity and detail between Low and High, especially for background detail, though not only there. It's true, though, that you could easily play on Low and not be bothered by it; it depends on how much you care about these things.