When does a CPU hold back 4K resolutions?

Soldato
Joined
6 Feb 2019
Posts
17,595
Is it accurate that you can game with little to no AA on, as 4K doesn't show jaggies like it does at other resolutions? If that's true, that must be quite a few FPS saved?

I find this to be very game-dependent. There are some games where I can turn AA to low or even off and not notice any jaggies, but there are still some games where I have to leave AA on high.

It really depends on how the game's assets and textures are designed.

The last PC game I played (AC Odyssey) - I couldn't notice any difference between High AA and Low AA, so I turned it to Low.
 
Associate
Joined
19 Feb 2018
Posts
152
I have a 1080 Ti twinned with a Xeon X3470 (first-gen i7 equivalent) overclocked to 3.8GHz. I use it for couch gaming on my 4K TV at 60Hz.

Forza Horizon doesn't hold a locked 60fps at 4K, and I know from benchmarks that a 1080 Ti should be averaging 70-80fps at 4K at max settings, so I can only put it down to the CPU. Another game, Kingdom Come: Deliverance, frequently gets CPU bound in city areas, and no matter how high I set the graphics settings there is always a slowdown in certain areas. These two examples are what is making me upgrade this summer, as otherwise, with my use case, I would have been happy continuing as I was.

So my point is that it entirely depends on the CPU usage of a particular game; heavily GPU-bound games like Shadow of War, for example, achieve a locked 60fps at 4K even on my ******* CPU.
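To spell out the bottleneck reasoning in that post: if benchmarks say the card should average 70-80fps at 4K but the system can't hold 60, and the GPU isn't pegged near 100% utilisation, the CPU (or the game's main thread) is the likely limit. A minimal sketch of that check; the utilisation threshold and example figures are assumptions for illustration, not measurements from the post:

```python
def likely_bottleneck(measured_fps, gpu_benchmark_fps, gpu_utilisation_pct):
    """Very rough check: a big FPS shortfall plus spare GPU headroom points at the CPU."""
    if measured_fps < 0.9 * gpu_benchmark_fps and gpu_utilisation_pct < 95:
        return "likely CPU (or engine) bound: the GPU has headroom it cannot use"
    return "likely GPU bound: the card is the limit at these settings"

# Forza-like case from above, with an assumed utilisation figure:
# benchmarks suggest ~75fps at 4K, the system manages ~50, GPU sits around 70%.
print(likely_bottleneck(measured_fps=50, gpu_benchmark_fps=75, gpu_utilisation_pct=70))
```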
 
Associate
Joined
19 Feb 2018
Posts
152
I have heard this mentioned, and in theory it stands to reason that it would be true. The higher pixel density should make jaggies hard to see, but I have never tested it myself. I would also be interested to hear whether users here bother with AA at 4K.

I personally don't bother with AA at 4K; it is very hard to tell the difference and I want to free up my GPU as much as possible. At 1440p, though, I think it's required.
 
Associate
Joined
29 Oct 2002
Posts
698
I asked a similar question a couple of weeks ago, and it was identified as the GPU holding me back, as others have rightly stated, and that was with a 1080Ti!! I'm holding off for now, but will likely grab a 2080 at some point, it's just far too pricey with it being "current" tech :(
 
Associate
Joined
26 Aug 2016
Posts
561
I asked a similar question a couple of weeks ago, and it was identified as the GPU holding me back, as others have rightly stated, and that was with a 1080Ti!! I'm holding off for now, but will likely grab a 2080 at some point, it's just far too pricey with it being "current" tech :(
Wouldn't a 2080 just be a sidegrade from a 1080Ti? You probably want the 2080Ti, although I'd probably wait for the next generation if you already have a 1080Ti at this point.
 
Associate
Joined
16 Jan 2010
Posts
1,415
Location
Earth
Is it accurate that you can game with little to no AA on, as 4K doesn't show jaggies like it does at other resolutions? If that's true, that must be quite a few FPS saved?
Depends on screen size. I can't remember seeing much in the way of jaggies with my 27" Samsung 4K monitor, but my current 40" 4K monitor definitely benefits from AA, and I can see a real improvement with 8xMSAA vs 4xMSAA.
 
Soldato
OP
Joined
22 Oct 2004
Posts
13,381
I personally don't bother with AA at 4K; it is very hard to tell the difference and I want to free up my GPU as much as possible. At 1440p, though, I think it's required.

I hope I'm the same then, as that must put a bit less strain on the GPU. I've stopped bothering with ultra graphics settings in the majority of games, as the visual difference between high and ultra can be non-existent, but it helps the frame rate.
 
Soldato
OP
Joined
22 Oct 2004
Posts
13,381
Depends on screen size. I can't remember seeing much in the way of jaggies with my 27" Samsung 4K monitor, but my current 40" 4K monitor definitely benefits from AA, and I can see a real improvement with 8xMSAA vs 4xMSAA.

Oh really? I plan on using a Sony XF90 55" TV, so does that mean jaggies galore?
 
Associate
Joined
29 Oct 2002
Posts
698
Wouldn't a 2080 just be a sidegrade from a 1080Ti? You probably want the 2080Ti, although I'd probably wait for the next generation if you already have a 1080Ti at this point.

Sorry, yeah, that's what I meant. Apart from ray tracing (which I have no need for), it's likely not an upgrade at this point. I'm relatively happy with the rig as it stands, so I might consider replacing some older peripherals instead :)
 

HRL

Soldato
Joined
22 Nov 2005
Posts
3,028
Location
Devon
I have heard this mentioned, and in theory it stands to reason that it would be true. The higher pixel density should make jaggies hard to see, but I have never tested it myself. I would also be interested to hear whether users here bother with AA at 4K.

I always leave AA on unless it impacts performance. I hadn't come across anything that really troubled my 2080 Ti until I picked up Ghost Recon Wildlands in a sale.

That mofo just destroys my GPU at 4K on Ultra settings. Even turning off AA didn't make much difference, and I could notice jaggies, but usually AA makes very little difference at 4K.

However, I game on the sofa on a 58” TV from about 10’ away.
 
Associate
Joined
16 Jan 2010
Posts
1,415
Location
Earth
Depends how far away you are too; I'm pretty close to the monitor, and I suspect with a TV you'd be further away. Using 1080 Ti SLI here. It's a trade-off and also depends on how good your eyes are. The image looks better from about 1.3m, but I can't sit that far away.
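The size-and-distance trade-off in these posts can be made concrete with a rough pixels-per-degree estimate: the further away you sit, the narrower the angle the same 4K panel covers, so each pixel looks smaller and jaggies fade. A minimal sketch, where the sizes and distances are examples loosely taken from this thread and "more pixels per degree means less visible aliasing" is the assumption being illustrated:

```python
import math

def pixels_per_degree(diagonal_in, distance_m, h_pixels=3840, aspect=(16, 9)):
    """Approximate horizontal pixels per degree for a flat panel viewed head-on."""
    w, h = aspect
    width_m = diagonal_in * 0.0254 * w / math.hypot(w, h)               # panel width in metres
    h_fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))  # horizontal field of view
    return h_pixels / h_fov_deg

# Sizes/distances roughly matching the thread; higher px/deg = harder to spot jaggies.
for size_in, dist_m in [(27, 0.7), (40, 1.0), (55, 1.5), (58, 3.0)]:
    print(f'{size_in}" 4K at {dist_m}m: ~{pixels_per_degree(size_in, dist_m):.0f} px/deg')
```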
 
Soldato
Joined
11 Jun 2003
Posts
5,081
Location
Sheffield, UK
Screen below (the Acer Nitro 4K 144Hz FreeSync). In most games, I find just the first level of AA still helps quite a bit. The jaggies are really rather tiny, of course, but still visible. Just the first step of AA makes all the difference.
 
Soldato
Joined
6 Feb 2019
Posts
17,595
When you play a 40-man raid in World of Warcraft :D

Is WoW really this single-threaded, and does it hate anything AMD? One of the reasons I switched to Nvidia and Intel back in the day was the horrible performance I had in WoW, even on high-end AMD parts.
 
Associate
Joined
29 Aug 2004
Posts
2,381
Location
Alpha centauri
When it comes to the AA setting at 4K, for me, on panel sizes below 40" it is not needed; keep it off and go for higher FPS. Also worth considering is the minimum FPS: keeping it high helps reduce the stutter you get from spikes down in the frame count. Fast memory and tight timings help here, not faster CPUs, with the current batch of top-end graphics cards.
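To put the minimum-FPS point in concrete terms: stutter shows up as spikes in individual frame times rather than in the average. Here's a minimal sketch of how a frame-time capture could be summarised into average FPS, 1% lows and spike counts; the file name and the 2x-median spike threshold are assumptions for illustration:

```python
import statistics

def summarise(frame_times_ms):
    """Average FPS, 1% low FPS, and spike count from a list of frame times in ms."""
    slowest_first = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest_first[: max(1, len(slowest_first) // 100)]   # slowest 1% of frames
    median = statistics.median(frame_times_ms)
    spikes = sum(1 for t in frame_times_ms if t > 2 * median)         # frames over 2x the median
    return {
        "avg_fps": 1000 / statistics.mean(frame_times_ms),
        "one_pct_low_fps": 1000 / statistics.mean(worst_1pct),
        "spike_frames": spikes,
    }

# e.g. a capture exported from an overlay tool, one frame time (ms) per line
with open("frametimes.csv") as f:
    times = [float(line) for line in f if line.strip()]
print(summarise(times))
```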
 
Soldato
Joined
18 Feb 2015
Posts
6,484
As the title says, I keep seeing that the GPU is more important, as games are more GPU bound at 4K, when people want better performance out of their games at 4K. Some people are still running a 2500K and are being recommended to keep it overclocked and get a better GPU. But when does it get to the point that a CPU must be upgraded to get better frame rates in 4K gaming?

In a funny way, 4K could be a good platform to have if you're on a lowish budget, as you will just need to upgrade your GPU if you already have a five-year-old system. I know GPUs are expensive, but you could be quite happy on a Vega 56 or an Nvidia 1080.

I don't have 4K yet myself, so this is just coming from stuff I read on here and lots of YouTube reviews.
As always, it's multi-faceted, and in truth it's going to be a combination of CPU, GPU and software. Here's what I'd say, in short, though: if you don't have at least a 2080 Ti, don't worry about it. And remember, GPU demand increases with each new title. You're right on the budget thing though; I play at 4K, got myself a V64, do fine with it, and won't need to upgrade my platform for a long time to come (i7-6800K).

Is it accurate that you can game with little to no AA on, as 4K doesn't show jaggies like it does at other resolutions? If that's true, that must be quite a few FPS saved?
No.

Oh really? I plan on using a Sony XF90 55" TV, so does that mean jaggies galore?

I use a 55" XF90. Jaggies vary on a per-game basis. You'll definitely want AA in general, but luckily most AA nowadays is cheap in terms of performance, and even the post-processing options (SMAA & FXAA) do a good enough job for the most part. Some games are exceptions and are horrible in terms of jaggies, and there's not much you can do about it (outside of some insane hardware to overpower it), e.g. GTA V. Also, in some cases the TV feature called Reality Creation can help with such exceptions; in GTA V, for example, 1440p + Reality Creation approximates 4K closely enough while helping with the jaggies. Sort of like a pseudo-DLSS. It works better in some games than others, and it's free, so that's good.

Forgot to say, distance from the TV also matters, as you can imagine. I sit about 1.5m away from mine, so I notice it a lot more. As you get closer, to 1.3-1.4m, you can even notice the pixels. From 2m or thereabouts you could use 1800p, or even lower, and not be able to tell the difference. At such distances it's even worth thinking about using 1440p 120Hz instead, or 1080p 120Hz in the case of HDR.

I find this to be very game-dependent. There are some games where I can turn AA to low or even off and not notice any jaggies, but there are still some games where I have to leave AA on high.

It really depends on how the game's assets and textures are designed.

The last PC game I played (AC Odyssey) - I couldn't notice any difference between High AA and Low AA, so I turned it to Low.

Low AA is a lower-resolution temporal AA solution in Odyssey. To my eyes there's a very noticeable difference in clarity and detail between Low and High, especially for background detail, but not only there. It's true, though, that you could easily play on Low and not be bothered by it. It depends on how much you care about these things.
 
Soldato
Joined
6 Feb 2019
Posts
17,595
I game on a TV sitting 1.5 metres from the screen. As such, the visual difference needs to be significant for me to notice it. If I can't notice it, then I turn down the settings.
 
Associate
Joined
5 Mar 2017
Posts
2,252
Location
Cambridge
Been there, went back to a 2K monitor. Current processor is a Ryzen 1600X with an XFX 2080. At 4K, the RTX 2080 struggled, as others mentioned; the CPU was a breeze. I bet my previous i7 4700 could have held up easily, especially at 5.2GHz, as unless something is really specific or CPU bound, four cores and high frequency are still king.
At 2K on a G-Sync monitor, when I cap the frames to 100 or even 75, the GPU stays nice and cool, with everything maxed out and no issues.
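The frame-cap point is essentially a pacing loop: if the GPU only has to deliver 75 or 100 frames a second instead of running flat out, it does less work per second and runs cooler. A minimal sketch of the idea; real limiters live in the driver or engine, and render_frame here is just a placeholder:

```python
import time

TARGET_FPS = 75
FRAME_BUDGET = 1.0 / TARGET_FPS    # ~13.3 ms per frame at 75fps

def render_frame():
    """Placeholder for the game's actual rendering work."""
    time.sleep(0.005)              # pretend a frame takes 5 ms to render

for _ in range(300):               # a few seconds of "gameplay"
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)  # idle instead of rendering extra frames
```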
 