Choosing a monitor... 4K or 1440p

Associate
Joined
8 Mar 2013
Posts
689
You're a genius :) Just turn the settings down a little, to High for example, and it will hit 60fps. Plus, this benchmark was with a reference card; with an OC edition it's a different story.

I have seen many reviews on gaming at 4K. The general rule of thumb is MEDIUM settings for smooth 60fps gameplay at 4K, and the general consensus is that MEDIUM gaming at 4K is an aesthetically inferior experience to ULTRA gaming at 1080p.

An 'OC' edition of the GTX 980 will get you another 15% of frames. That is an extra 4-5 frames to add to the 29 FPS @ 4K. Not a very good return on the £500 you will have just splashed out on that GPU, is it?
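A quick back-of-the-envelope sketch of that claim, using only the numbers already in this thread (the 29 fps reference baseline and a flat 15% OC uplift; the variable names are mine):

```python
# Rough sketch: what a factory-overclocked GTX 980 buys you at 4K,
# assuming the 29 fps reference-card baseline and a flat 15% uplift
# quoted in this thread.
reference_fps = 29
oc_uplift = 0.15

oc_fps = reference_fps * (1 + oc_uplift)
print(f"OC edition: ~{oc_fps:.0f} fps (+{oc_fps - reference_fps:.1f} frames)")
# -> OC edition: ~33 fps (+4.3 frames), still nowhere near 60 fps
```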

The only option is to go for a dual-GPU setup. "So 2x GTX 980, that will be £1000, sir!" "Oh, and even that will only just scrape playable frames, with still-compromised graphical quality, on current AAA titles, the myriad of dual-GPU compatibility issues and other headaches notwithstanding. In another 6 months, when the next tranche of more graphically demanding games comes out, your £1000 of SLI'd GPUs will be obsolete for 4K purposes, and you will be back looking to upgrade to the next £1000+ GPU solution in order to power your insanely overpriced, overkill 4K monitor... have a nice day and see ya in 6 months with another £1000 in yer back pocket!"


Again, another poor argument from MadTheCat. Using the 980 for 4K is a bad idea due to its 256-bit memory bus; if he knew anything about 4K, this would be on his list. The 295X2 only uses 4GB for gaming, not 8, as it has 4GB per GPU. Nice try yet again, little buddy.
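For what it's worth, the bandwidth gap is easy to put numbers on. A minimal sketch, assuming the published reference specs (256-bit at 7 Gbps effective GDDR5 for the 980; 512-bit at 5 Gbps per GPU for the 295X2); the helper function is my own:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth: (bus width / 8) bytes per
    transfer times the effective data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed reference specs: GTX 980 = 256-bit @ 7 Gbps,
# each GPU on the 295X2 = 512-bit @ 5 Gbps.
print(f"GTX 980:          {peak_bandwidth_gb_s(256, 7.0):.0f} GB/s")  # 224 GB/s
print(f"295X2 (per GPU):  {peak_bandwidth_gb_s(512, 5.0):.0f} GB/s")  # 320 GB/s
```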


So you are essentially saying that you need SLI'd Titans to run 4K with decent gfx settings?

Just had a little squizz and it appears you are correct!

Look, 4-way SLI TITAN-Zs almost nail Crysis on 'Very High' at 60fps.

2* TITAN-Zs ~ £3000+?

LOL!

EDIT: 2* TITAN-Zs = £5000!!!

WOW JUST *** WOW!

I rest my case.......(again).
 
Soldato
Joined
10 Nov 2006
Posts
8,551
Location
Lincolnshire
I have seen many reviews on gaming at 4K. The general rule of thumb is MEDIUM settings for smooth 60fps gameplay at 4K, and the general consensus is that MEDIUM gaming at 4K is an aesthetically inferior experience to ULTRA gaming at 1080p.

An 'OC' edition of the GTX 980 will get you another 15% of frames. That is an extra 4-5 frames to add to the 29 FPS @ 4K. Not a very good return on the £500 you will have just splashed out on that GPU, is it?

The only option is to go for a dual-GPU setup. "So 2x GTX 980, that will be £1000, sir!" Oh, and even that will only just scrape playable frames, with still-compromised graphical quality, on current AAA titles, the myriad of dual-GPU compatibility issues and other headaches notwithstanding. In another 6 months, when the next tranche of more graphically demanding games comes out, your £1000 of SLI'd GPUs will be obsolete for 4K purposes, and you will be back looking to upgrade to the next £1000+ GPU solution in order to power your insanely overpriced, overkill 4K monitor.

Seems you already made your mind up back in August when you were looking for a new monitor: you couldn't afford 4K, nor did you know the difference between Hz and FPS. But who needs to spend £1k to run it? AMD will run 4K for £500-600, and with the latest price drops on the 7** series cards you can do the same with Nvidia.

I think the best place for you is the ignore list for most users of this forum; you type nonsense.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
I notice also that the only card that gives remotely playable frames in the 4K benchmark is the R9 295X2, the 8GB dual-GPU card. What I would suggest, however, is that if you take BF4 into multiplayer, that semi-playable framerate will soon plummet into a slideshow.

You can cherry-pick any results from the internet. It took me 10 seconds to find an old bit-tech review where the top card was the 7990, and there it scores more than the 295 does in your figures. I imagine you've used minimum frame rates there.

But that is a moot point, because I happen to know that BF4 (one of my main games), despite posting high frames, happens to HATE dual-GPU configurations. Microstutter city indeed!

You really haven't kept up with developments in frametime variance since Nvidia exposed microstutter with FCAT. It was fixed a LONG time ago. You may want to google MANTLE too. Frametime variance is now much better on both Nvidia and AMD cards, with AMD sometimes taking the lead.

At this point I will rest my case. 4K gaming at this point in time is for fools. You will spend an absolute fortune on gfx cards and the monitor itself, and end up with an inferior experience to someone gaming at 1080p. The evidence is all over the internet for anyone to see. Naturally, if someone wants to browse through the underbelly of a forum inhabited by gaming-tech extremist junkies, then the reality may seem as though it has been turned on its head... at least until you get your £3K worth of monitor and GPU and realise that you have just bought into a white elephant.

Some of us will spend a fortune on gfx cards and kit, because we can and can afford to. Some of us read fully around any purchase, learning from owners of new technology, and I thank those early adopters for highlighting issues not just to the rest of us but to the manufacturers.

I can play BF4 multiplayer at the settings I said above (I missed one setting: drop SSAO) and I love the experience. It is well documented that the move to 4K from 1080p is quite probably one of the biggest visual upgrades you can make at this current time. Seeing as the OP has afforded himself a 295, I think he can quite probably afford a £420 4K gaming monitor. I did, and I'm very happy, and I've been gaming online since 56k and know my graphics.
 
Associate
Joined
15 Jun 2014
Posts
490
I have seen many reviews on gaming at 4K. The general rule of thumb is MEDIUM settings for smooth 60fps gameplay at 4K, and the general consensus is that MEDIUM gaming at 4K is an aesthetically inferior experience to ULTRA gaming at 1080p.

I don't need reviews; I own 4K myself and play games on a GTX 680, some at max settings, Wolfenstein: The New Order for example. End of story.

1080p vs 2160p gaming is day and night.
 
Soldato
Joined
26 Nov 2008
Posts
3,810
Location
Leeds
I play 4K on 2x 280X at medium-to-high settings = 60fps for me, and medium settings with max textures is far superior to ultra at 1080p in my experience. Running games at 1080p now looks like 800x600 to my eyes; it's just a blurry mess.
 
Soldato
Joined
1 Apr 2014
Posts
18,610
Location
Aberdeen
I have seen many reviews on gaming at 4K. The general rule of thumb is MEDIUM settings for smooth 60fps gameplay at 4K, and the general consensus is that MEDIUM gaming at 4K is an aesthetically inferior experience to ULTRA gaming at 1080p.

This is not my experience. I play at 4K on a single 780 Ti. Older games like Borderlands 2 can be played at max settings. Newer games like Tomb Raider can be played at medium-high settings (AA and TressFX off) and look just fine. If I were to go for 980 SLI - and I'm waiting on the 8 GB cards - I'm sure I could max those settings. No, I've not tried Crysis 3.

If I were starting afresh, I'd get myself a 4K monitor, a 144 Hz 1080p monitor, and that LG 21:9 1440 monitor and game on whichever one gave me the best experience in that game.
 
Associate
Joined
8 Mar 2013
Posts
689
This is not my experience. I play at 4K on a single 780 Ti. Older games like Borderlands 2 can be played at max settings. Newer games like Tomb Raider can be played at medium-high settings (AA and TressFX off) and look just fine. If I were to go for 980 SLI - and I'm waiting on the 8 GB cards - I'm sure I could max those settings. No, I've not tried Crysis 3.

You are confirming what I am saying. Your GTX 780 Ti is neck and neck with the brand-new GTX 980 for the single-GPU performance crown, and you state/admit that it is fine for older games and decent for newer games such as Tomb Raider (which is already 18 months old). But as stunning as Tomb Raider is, it isn't that demanding considering the quality of the graphics, whereas games such as Crysis 3, Metro and BF4 (busy multiplayer scenes, which happen all the time and are in no way representative of the campaign-based benchmarks) are much more intensive, and any single-GPU PC would fall well short of the desired standard at 4K resolution. So if the best single GPU that money can buy falls short of the standard required by the most demanding popular modern games at 4K, then how will it fare 6-12 months down the line when we have behemoths like The Witcher 3 released?

That is all I was saying. Everyone who isn't a fan-boi can accept that this is the case.

If I were starting afresh, I'd get myself a 4K monitor, a 144 Hz 1080p monitor, and that LG 21:9 1440 monitor and game on whichever one gave me the best experience in that game.

And how would you work that little arrangement out in your PC room? Would you have those 3 monitors attached to some rotating tri-pronged bracket, which delivered your monitor of choice depending on which game you wished to play?
 
Associate
Joined
8 Mar 2013
Posts
689
Not at all. My games are up a notch or two from your claim of medium.

Pretty much like I have right now. Two of the monitors in portrait mode until needed in landscape mode.

It sounds like you can achieve the same frame rates with your 780 Ti at 4K as I can with my laptop (GTX 780M) at 1080p.

Your Tomb Raider settings are the same as I have on my laptop. On that, I am getting 40-60 fps (overall a very smooth experience, though) at 1080p. On my laptop, I can play many popular games at Very High settings, sacrificing on stuff like MSAA, or perhaps shadows here and there. Then there are games such as Bioshock Infinite that I can run very well on my laptop using a mixture of settings, compromising on quality where I need to. But there are many other games that bring my laptop to its knees if I try to go anywhere near 'very high'/ultra, or even just high settings, at 1080p.

BF4 and Crysis 3, to name but a few, and I would imagine it will cope even less well with the next tranche of graphically demanding AAA titles due out over Christmas and early 2015.

I would imagine it will be the same story with your GTX 780 Ti at 4K.
 
Associate
Joined
3 Jun 2014
Posts
375
I own the ROG SWIFT and the Sammy 4K. If you are using just one GPU, I would go for the 1440p ROG for sure. The TN panel is as good as the 4K ones, and G-Sync means you can still get smooth rates with max settings even under 60fps.

At 2ft back or more, with 1440p using MSAA x4 or SMAA, the image difference in games is not much; only if you look closer do you really see the 4K difference on textures and jaggies and an overall much nicer image. Plus, input lag gets worse at lower frame rates. Unless you go for two or more GPUs, 4K is not going to be that great on new, demanding titles with the graphics options pushed up.

It does really depend on viewing distance, but at a certain distance 1440p with non-blurry AA will look very similar to 4K, with better frame rates. These are my own opinions and I guess others may disagree. The ROG SWIFT seems like the perfect compromise, as long as you don't sit 1ft away and notice the lack of image quality compared to 4K. Although you do have to rely on a game having a good choice of non-blurry AA, whereas at 4K, AA is not really needed and the image stays super crisp.
 
Soldato
Joined
4 Jul 2012
Posts
16,911
I own the ROG SWIFT and the Sammy 4K. If you are using just one GPU, I would go for the 1440p ROG for sure. The TN panel is as good as the 4K ones, and G-Sync means you can still get smooth rates with max settings even under 60fps.

At 2ft back or more, with 1440p using MSAA x4 or SMAA, the image difference in games is not much; only if you look closer do you really see the 4K difference on textures and jaggies and an overall much nicer image. Plus, input lag gets worse at lower frame rates. Unless you go for two or more GPUs, 4K is not going to be that great on new, demanding titles with the graphics options pushed up.

It does really depend on viewing distance, but at a certain distance 1440p with non-blurry AA will look very similar to 4K, with better frame rates. These are my own opinions and I guess others may disagree. The ROG SWIFT seems like the perfect compromise, as long as you don't sit 1ft away and notice the lack of image quality compared to 4K. Although you do have to rely on a game having a good choice of non-blurry AA, whereas at 4K, AA is not really needed and the image stays super crisp.

The ROG Swift at 144Hz is roughly just as demanding as UHD at 60Hz.
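That checks out on raw pixel throughput, which is a crude but useful proxy (my own arithmetic; it ignores per-frame fixed costs):

```python
# Pixels per second each target asks the GPU to render: a crude proxy
# for how demanding it is, ignoring per-frame fixed costs.
rog_swift_144 = 2560 * 1440 * 144   # 1440p at 144 Hz
uhd_60        = 3840 * 2160 * 60    # 3840x2160 at 60 Hz

print(f"ROG Swift @ 144Hz: {rog_swift_144 / 1e6:.0f} Mpix/s")  # ~531
print(f"UHD @ 60Hz:        {uhd_60 / 1e6:.0f} Mpix/s")         # ~498
print(f"ratio: {rog_swift_144 / uhd_60:.2f}x")                 # ~1.07x, roughly equal
```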

Also, 2560x1440 will not look similar to UHD.
 
Soldato
Joined
30 Nov 2011
Posts
11,374
The ROG Swift at 144Hz is roughly just as demanding as UHD at 60Hz.

Also, 2560x1440 will not look similar to UHD.

I agree with the first statement - to consistently hit 144Hz. However, with G-Sync you don't need to consistently hit 144Hz; anything over a 100fps average is very good.

On part two, you are right to an extent. However, the "benefit" of 4K over 1440p at 27-28 inch sizes is limited, whereas the benefit of high refresh rates over 60fps is very tangible.
 
Associate
Joined
29 Oct 2002
Posts
806
I've got 2x 24" 1080p monitors and I'm looking to replace them with a single monitor.

I mostly use the two monitors when I produce music as I need the pixels to get everything on screen.

So I need something with a lot more pixels. I was looking at 4K but this thread has put me off as I also play games. I have a single 7950GT.

Anyone have any recommendations? I have seen some "super wide" screens on the market. Anyone using those for gaming?
 
Soldato
Joined
27 Jul 2004
Posts
3,520
Location
Yancashire
I don't get why everyone is still arguing this. At this moment in time, with current GPU hardware, you cannot max out modern games and have a consistently high frame-rate experience at 4K.
But that's not to say you shouldn't get a 4K screen now if you really want one. It could be argued it's future-proofing somewhat, as I've got a good feeling that Nvidia's next release (20nm) will make 4K gaming a properly viable option. And those cards might be here within the next 12 months. You'll still probably need two, though.
 
Soldato
Joined
4 Jul 2012
Posts
16,911
I agree with the first statement - to consistently hit 144Hz. However, with G-Sync you don't need to consistently hit 144Hz

That was my point, really; it seems pointless to have 120 or 144Hz monitors if you're not striving to consistently hit that.
On part two, you are right to an extent. However, the "benefit" of 4K over 1440p at 27-28 inch sizes is limited

If you're blind
 