Mulling over an upgrade for 4K

It's not just about FPS; if the game runs smoothly for you at 4K, even at 30fps with G-Sync, then it's worth it.

I agree with that, but I just find myself becoming a bit of an fps whore. I wish I'd never gone round my mate's and played on his 144Hz panel. Even the mouse pointer is annoying me now.
 
Get 4K and run your GPU; if you like what you see, happy days, and if not, upgrade. Ignore all these people who say you need to sell your mum and your left ******* to run 4K. You know how you like to game, so crack on.

I'd rather play in 4K at 40fps than at any lower resolution; once you've gone 4K you'll never go back. I'd also recommend a G-Sync monitor - it does wonders at lower FPS.

I am going back :) which is exactly why I'm going 1440p 144Hz, as a sort of trade-off.
 
You just tend to pay more per pixel for 1440p than for 4K. And surely in a few years 4K will just be the standard setup.
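
For anyone who wants to sanity-check the price-per-pixel point, here's a quick sketch. The prices are made-up placeholders, not real quotes (and no naming shops here anyway), so plug in whatever the monitors actually cost:

```python
# Pixels-per-pound, roughly. Prices below are hypothetical placeholders;
# substitute the real ones you're comparing.
monitors = {
    "1440p (2560x1440)": (2560 * 1440, 450.0),  # hypothetical price
    "4K (3840x2160)":    (3840 * 2160, 550.0),  # hypothetical price
}

for name, (pixels, price) in monitors.items():
    print(f"{name}: {pixels / 1e6:.1f} MP, "
          f"~{price / (pixels / 1e6):.0f} per megapixel")
```

With those placeholder numbers, 1440p works out at roughly twice the cost per megapixel of 4K, which is the point being made.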

There'll probably be something bigger on the horizon. I think game detail will always be increasing, so I doubt it will ever be cheap to run 4K, at least not for the foreseeable future.

But that's just the nature of it. I do wonder what the next console generation is going to be like; if they go 4K, the graphical improvement over this generation might be very disappointing.
 
I pull solid framerates at high settings on almost everything at 4K with my pair of 7970s, generally 50-80fps depending on the game; it very rarely dips below 30. I ordered a Fury X the other week, though, just to move to a single-card solution - I'll lose some frames, but it leaves me open to picking up a second Fury X if I feel I need it.
Given that 7970/280/290 cards are going cheap second-hand now, a pair of them is a cheap and effective way to go 4K imho - especially with FreeSync 4K becoming available.
 
You must be playing older games...

When I first got a 4K screen I had an AMD 295X2, and when there wasn't a CrossFire profile you'd be hard pressed to get 30fps.
 
Uh, Dragon Age: Inquisition, Witcher 3 (with the HairWorks 8x limit), GTA V... Evolve is the only game I've had to drop to 1440p for playability; I'm not entirely sure why, though I've not played it for a few months, so it could have been fixed in drivers. My cards are clocked at 1.2GHz, though, and my CPU at 4.8GHz - could that be the difference in our setups?

Witcher 3 was near unplayable until the new drivers for it, though... GTA V saw a decent improvement with driver updates too.
 
4k 40" is amazing, 1440p at this size doesn't cut it & 1080p is a joke.

I'm really impressed with how clean games look, and with 980 Ti SLI, 60fps is achievable even in the newest games (Witcher 3 only requires you to drop the HairWorks AA and one or two settings to high).

I'm expecting 4K to become the next standard anyway, so I may as well prepare now; while 1440p is excellent, it's a halfway point - the pixel-density sums below show the gap.
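
A minimal sketch of the pixel-density maths for a 40" panel, which is all the "1080p is a joke at this size" claim comes down to:

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and screen diagonal."""
    return hypot(width_px, height_px) / diagonal_in

# Same 40" diagonal, three common resolutions.
for name, (w, h) in {"4K": (3840, 2160),
                     "1440p": (2560, 1440),
                     "1080p": (1920, 1080)}.items():
    print(f'{name} at 40": {ppi(w, h, 40):.0f} PPI')
```

That comes out at roughly 110 PPI for 4K, 73 PPI for 1440p and 55 PPI for 1080p at 40", so the density gap between the three is substantial at this panel size.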
 
Thing is, I keep reading this, but if my GTX 760 copes, then I have to conclude Ti SLI isn't essential. I'm sure if you whack everything up to ultra at 4K you can make a Titan X squirm, but the net benefit is rarely all that when you compare the screenshots (IMO).

What's the point in choosing resolution over textures, tessellation and effects? You're just polishing a turd. Surely it would be best to find a middle ground?
 
I have a 5930K @ 4.4GHz, so it's probably not a CPU issue.

And you're using the in-game presets?

So you're saying that Witcher 3 on 'high' at 4K runs at 50-80fps on CrossFired 7970s?

And the same for GTA V on 'high'?

Sounds unlikely to me - GTA V will surely surpass the 3GB of VRAM you have? It uses about 6GB on my PC! (Some rough buffer maths below.)

I thought the 7970s were borked because they use the old CrossFire bridge connector. Aren't the frame times horrendous?
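
On the VRAM point, here's a back-of-envelope sketch of the render-target arithmetic; the buffer count is an assumption for illustration, and in practice texture assets, not render targets, are what actually fill 6GB:

```python
# Rough render-target maths at 4K. Only part of the VRAM story
# (texture assets usually dominate), but it shows the fixed overhead
# that scales with resolution.
width, height = 3840, 2160
bytes_per_pixel = 4  # a standard RGBA8 buffer

one_target_mib = width * height * bytes_per_pixel / 2**20
print(f"One 4K RGBA8 target: {one_target_mib:.1f} MiB")

# A deferred renderer might keep several full-res targets around
# (G-buffer layers, depth, HDR buffer); six is an assumed, plausible count.
print(f"Six full-res targets: {6 * one_target_mib:.0f} MiB")
```

That's about 32 MiB per buffer and under 200 MiB for a handful of them, so the jump to 6GB usage in GTA V is mostly high-resolution textures rather than the 4K framebuffer itself.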
 
Hmm. I'm running a 5820K rig with two Titan Blacks at 4K, and Witcher 3 didn't look all that. I had to run it at medium, and 1440p at maximum on the same monitor looked much better.
 
And you're using the in-game presets?

So you're saying that Witcher 3 on 'high' at 4K runs at 50-80fps on CrossFired 7970s?

Just loaded up, and the area I'm in is running 35-55fps with occasional dips to 20-25. It's not the high preset: I've disabled some post-processing effects that do nothing but make the image quality worse :/ All the sliders are on high, with a few on ultra (like Detail Level and Texture Quality).

I thought the 7970s were borked because they use the old CrossFire bridge connector. Aren't the frame times horrendous?

That's one of the biggest reasons I'm moving to the Fury X - some games are silky smooth, but some look janky despite reporting 50+ frames.
 
True, but there's less need for anti-aliasing, ambient occlusion, DSR, etc.

THIS. This is exactly why you run 4K with lower settings: you CAN lower certain settings at a higher resolution, those settings' effects are identical, and the rest of the game looks better.
 
For some people you're flogging a dead horse: unless they can get 144Hz and 100+fps at 4K with everything on ultra, they think no one else should have 4K and enjoy it, or be able to get a decent gaming experience.

When I went from 1440p to 2160p it was like WOW, everything was so sharp and clear. At first I was just running up to things in games and looking at the detail. Driving along in, say, Far Cry 4, you actually look out of the window and admire the scenery. And yes, at 50fps+.

You can get by with AA on the lowest setting. I don't often bother with AO, but in some games I turn it on and get no drop in fps; then again, I don't see the difference.

It's horses for courses. Just enjoy your 4K experience now, and it will only get better with age.

Just don't buy ANY Acer 4K monitors just because you want G-Sync. ;)
 
Supersampling and anti-aliasing are the two things I can think of that you can do without at 4K - the sums below show why.

Everything else is still needed.
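
That's borne out by the shaded-pixel arithmetic: supersampling multiplies the number of pixels the GPU has to shade, and native 4K is already doing the work supersampling buys you at lower resolutions. A rough sketch:

```python
# Why supersampling is the first thing to drop at 4K: shaded-pixel counts.
def megapixels(w: int, h: int, ss_factor: int = 1) -> float:
    # ss_factor scales both axes, so the pixel cost grows with its square
    return w * ss_factor * h * ss_factor / 1e6

print(f"Native 4K:        {megapixels(3840, 2160):.1f} MP")    # ~8.3 MP
print(f"1080p + 2x2 SSAA: {megapixels(1920, 1080, 2):.1f} MP") # same ~8.3 MP
print(f"4K + 2x2 SSAA:    {megapixels(3840, 2160, 2):.1f} MP") # ~33 MP
```

So 1080p with 2x2 supersampling already shades as many pixels as native 4K, and supersampling on top of 4K quadruples the load again - hence skipping it.
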
Yeah, these I skip across the board - you can't really tell the difference.

It's the models, textures and scenery that really benefit from 4K - the most important assets. It becomes even more crucial the larger the monitor you go for.
 