Thinking about 4K, am I missing a trick or is it not worth it?

Hi everyone,

With the slew of new monitors coming onto the market, and with a lot of our monitors now wheezing, I was thinking about the benefits of moving to a 4K monster. The more I read about it, though, the more I'm doubting it... am I missing a trick?

We (the missus and I) do a lot of Photoshop and video editing work (nothing 4K as yet, but I don't want to be behind the curve when we are inevitably asked about it).

Our main monitors are currently:
Dell 2405 24” (with a dodgy indented power button :/)
Dell 2407 24” (with a dodgy spinning power button)
Dell 2709 27” (working fine at the moment, fingers crossed)

We bought these monitors at the time for their even backlighting and accurate colour reproduction, which has been invaluable when editing and printing. All of our colour correcting before printing is done on the 2709, which is a bit of a pain, as having to check everything on just one monitor slows down the process.

When not working we like to game as well, and the thought of 4K gaming goodness is appealing. Our highest-spec PC is currently:

i7 3770K
Gigabyte Z77X-D3H
32GB DDR3
Geforce GTX 670
Windows 7 64bit

The more I read about 4K, the more problems I seem to see with it, and I'm not sure of its potential yet, or whether we (everyone but the hardware manufacturers) are ready for it.

I'm pretty sure that my PC won't run 4K at anywhere near reasonable frame rates with full graphical effects; I think a major upgrade would be needed...?

I've read that Windows 7 isn't built to handle those resolutions and doesn't work well at them because everything is rendered too small. Windows 8 is fine, but with a major release, Windows 9... er... 10, on the horizon, who wants to change OS now?

UHD (3840x2160) seems to be the resolution the market has adopted for monitors and TVs at the moment, but cinematic 4K runs at 4096x2160, so are we going to get films released in resolutions that current screens can't support properly? The only monitor I can see on OCUK (and they seem to have a much wider range of 4K than the competition) is an LG which runs at true cinematic 4K.

So...
Games need a big hardware upgrade to run.
The W7 OS can't display it properly.
The monitors and TVs aren't running full cinematic 4K resolution.

I haven't even got as far as looking at how good the screens are compared to what I use in terms of colour accuracy and even lighting; I would hope that they blow my older monitors out of the water!

So tell me OCUK, what's the big benefit of 4K? Because at the moment I'm not really seeing it.
 
I don't think you're missing a trick.

For hardcore gamers on mere-mortal budgets, the hardware needed to drive 4K is too expensive. For them, 4K is probably one or two years off.

For people with less-than-excellent vision, 24" or 27" may render the UI too small, and for now Windows scaling is nowhere near what it should be (have a look at Apple, Microsoft?).

As for UHD vs 'true' 4K... that I deem irrelevant; cinema has too many aspect ratios going on ;)

For me personally, 4K is ready with the introduction of 'affordable' 24" and 27" IPS panels using SST rather than the initial MST workaround, as I'm absolutely fine with 160dpi @ 100% in Windows (which is exactly what my Asus ux32vd does).
 
There isn't anything to miss, it's just a resolution. I think where it's at, though, is bigger monitors. I'm moving to 40" UHD monitors from 3x 27" QHD ones when those Philips ones are out.

Your GPU is what will need upgrading; the CPU you have should be fine for the most part. GTX 670s are severely limited on the memory bandwidth and quantity front.

As for 4K and UHD, they're not the same thing, and it's not specifically to do with aspect ratios. 4K is a standard, as is UHD. 2K is a thing too, and it's 2048x1080, but we never called FHD "2K" (at least not to any great extent). So I find it weird that 3840x2160 is being called 4K.

That being said, the reason UHD is the resolution it is, is that it's 16:9 and a straight doubling of 1920x1080 in each dimension, so it's easier to make UHD displays.
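
To spell that multiplication out, here's a quick throwaway sketch (plain Python; the labels are just for readability):

```python
# UHD is exactly FHD doubled in each dimension (4x the pixels),
# while DCI "cinema" 4K is a slightly wider standard.
resolutions = {
    "FHD":    (1920, 1080),
    "UHD":    (3840, 2160),
    "DCI 4K": (4096, 2160),
}

fhd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / fhd_pixels:.2f}x FHD)")
# FHD: 1920x1080 = 2,073,600 pixels (1.00x FHD)
# UHD: 3840x2160 = 8,294,400 pixels (4.00x FHD)
# DCI 4K: 4096x2160 = 8,847,360 pixels (4.27x FHD)
```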
 
I agree. UHD is ahead of the times; there is little to no UHD content out there. The only way I see you really benefiting is in your PS work.
Unless you're editing for big companies with RED, I wouldn't worry about keeping up with footage quality. It's out of the average consumer's scope currently.
Would just like to point out that footage at the cinema is 4K. So for regular consumers to expect to be working/viewing/playing at this resolution is, I think, kind of unrealistic, unless of course money is no object.

Your graphics card will not support gaming at 4K. There's very little 4K content (RED-ray, anyone?). I would just go for regular HD or WQHD. Also, giving it some time will only allow prices to decrease and the technology to get better.
I remember reading a review of an early 4K monitor that was really two panels capped at 30Hz and just thinking, seriously, what is the point? It's really gimmicky atm.

The extra cost of a 4K would be better put into a higher-quality, lower-resolution panel.
As a side note, I read that a 5K panel is in the works; I just don't understand who it is aimed at.

Anyway, short answer: don't bother :)
 
> I agree. UHD is ahead of the times; there is little to no UHD content out there. The only way I see you really benefiting is in your PS work.
> Unless you're editing for big companies with RED, I wouldn't worry about keeping up with footage quality. It's out of the average consumer's scope currently.
> Would just like to point out that footage at the cinema is 4K. So for regular consumers to expect to be working/viewing/playing at this resolution is, I think, kind of unrealistic, unless of course money is no object.

UHD isn't ahead of the times, it's just a resolution. Content doesn't really matter that much. It's about image quality first and foremost.

> Your graphics card will not support gaming at 4K.

It's UHD, not 4K, and yes it will; it will just struggle with recent games. If you play old games, a 670 would be fine at UHD.

> There's very little 4K content (RED-ray, anyone?). I would just go for regular HD or WQHD. Also, giving it some time will only allow prices to decrease and the technology to get better.
> I remember reading a review of an early 4K monitor that was really two panels capped at 30Hz and just thinking, seriously, what is the point? It's really gimmicky atm.

There is more UHD content than WQHD content, so your argument for content doesn't really make sense.

Prices are dropping, and the technology doesn't "need" to get better in comparison to the likes of WQHD IPS panels, which are also at 60Hz.

The early panels you talk of were like that because the controller boards of the time couldn't drive a single UHD panel. That's no longer an issue, and it's not gimmicky at the moment at all.

It's JUST a resolution; more pixels = more detail. Talking about 30Hz displays and saying "at the moment" is disingenuous, as there isn't a 30Hz issue any longer.

> The extra cost of a 4K would be better put into a higher-quality, lower-resolution panel.

This entirely depends on what you want. If you were, for example, happy with using a 40" monitor, then Philips is releasing a VA 60Hz UHD one at a price that competes with lower-resolution IPS displays, such as the Dell UltraSharp U2713H, and it has a slightly higher pixel density too, meaning an ever so slightly sharper image.

> As a side note, I read that a 5K panel is in the works; I just don't understand who it is aimed at.

They are already done: Apple are putting them in their new iMacs, and Dell has a monitor out with the same panel in it.

They are aimed at people who want more image quality. It really isn't about what content is available. More pixels always looks better/sharper on the same panel types (up to about 600 pixels per inch depending on viewing distance).

A monitor at 7680x4320 would look better than a UHD one, even if they were both running at 3840x2160; because more pixels.
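
On the "depending on viewing distance" point above, the usual 1-arcminute rule of thumb is easy to put numbers on; a back-of-envelope sketch (it assumes roughly 20/20 vision, nothing more):

```python
# Rough PPI beyond which an eye resolving ~1 arcminute can no longer
# separate individual pixels at a given viewing distance (in inches).
import math

def acuity_ppi(distance_in):
    one_arcmin = math.radians(1 / 60)  # 1 arcminute in radians
    return 1 / (distance_in * math.tan(one_arcmin))

for d in (10, 24, 36):
    print(f'{d}" away: ~{acuity_ppi(d):.0f} PPI before pixels vanish')
# 10" away: ~344 PPI
# 24" away: ~143 PPI
# 36" away: ~95 PPI
```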
 
is 28" 4k really massively better than a 34"(27" height) 21:9 monitor ? PPI wise its over double but then again can anyone see the pixels on a 1440p 27" screen from a typical 60 - 70cm away ?

It's entirely possible to drive 4K from just one top-end GPU with the settings turned down, but in SLI/CF it should be fine.

Is it hugely noticeable going from 1440p to 4K when the screen is only 1" bigger? I'd imagine you want a 32"-type size for 4K to get a better overall impression of the extra detail.
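
The numbers are easy to check yourself; a minimal sketch (plain Python, with the sizes from this thread):

```python
# PPI = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [
    ('27" 1440p',     2560, 1440, 27),
    ('34" 3440x1440', 3440, 1440, 34),
    ('28" UHD',       3840, 2160, 28),
    ('32" UHD',       3840, 2160, 32),
]:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")
# 27" 1440p: 109 PPI
# 34" 3440x1440: 110 PPI
# 28" UHD: 157 PPI
# 32" UHD: 138 PPI
```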
 
I've tried a 5K iMac and it was lovely to use; sadly, I then tried a 4K display on Windows and it was almost unusable due to scaling problems.

I think 4K for games would be great if you have the required GPU power, but for day-to-day use I'd probably end up running at 1080p instead.

For most people with reasonable budgets, I'd lean towards a quality 1080p or 1440p monitor until OS and app support for 4K+ is more widespread. By that point it should also be more affordable to get the necessary GPU grunt.
 
I considered going 1440p but decided I could make the jump to 4K just as cheaply. Even though I only have one 290, I have been surprised at how well it copes with older games, particularly RPGs like the Fallout games. I also found that Fallout NV supports 3840x2160, as does Dishonored, two of my favourite games.

I have decided to wait for the next gen of graphics cards, as I suspect I would have heat problems if I went CrossFire 290. I was finding my 290 temps hitting 93°C and 85°C (Fallout NV); however, I removed the heatsink/fan, found very poor paste, put decent paste on, and now my temps are down a good 10-12°C.

Win 8.1 seems OK with 4K, and I have also found that 1080p looks great on my monitor, so I can still play some modern titles. All in all I have been happy with my purchase, after some teething problems, and have no regrets at all. :D
 
Well, one thing's for sure: I would wait it out if you are thinking of getting a new monitor; there's a lot of change coming over the next six months.
 
> I've tried a 5K iMac and it was lovely to use; sadly, I then tried a 4K display on Windows and it was almost unusable due to scaling problems.

+1

I've nothing against high-res displays but, even ignoring the GPU grunt required for gaming, Windows font scaling is flat-out broken just now and makes them unusable even with fairly common productivity apps and quite a few built-in Windows features.

If you're not fussed about gaming then the iMac 5K is probably going to be a much better bet for the next year or two. Windows 10 will hopefully improve things, but even then it's gonna take third-party devs at least another six months to catch up after that. Hopefully going forward developers will be able to put proper scaling in place that can support ever-increasing DPI displays, so with any luck issues like this will be a thing of the past.

1440p is the highest res you'll be able to run without issue on a Windows monitor for at least the next year, IMO.
 
So 4K is actually not appropriate due to Windows itself?
Hope my U2711 can hold out long enough. It's dying, and I don't really want a new monitor that isn't 4K at the present time.
 
I don't think 4K is ahead of its time... more like our graphics card industry is very far behind its time :p

We've been stuck on the 1920 res for, like, what... since 2006, almost a decade now. GPU makers are still beating around the bush, creating "high-end" graphics cards that still can't play some demanding games at max settings at this res :p
 
> We've been stuck on the 1920 res for, like, what... since 2006, almost a decade now.

Yeah, but it wasn't until 2010 that single GPUs came along that could run games at high/max settings at decent frame rates.

Unless it's for productivity work, people should just forget about 4K and go for the best 27/32" 1440p monitor they can afford.

It's the perfect match for current GPUs.

4K won't be in the same category for another 2 generations of GPU.
 
1440p is gonna be popular for a while yet. If I were to upgrade today I'd be looking at a 3440x1440 34" 21:9 monitor rather than going to 4K. That way I'd get a load of extra screen real estate, which could replace my existing dual-monitor setup [24" 1080p AW2310 and 20" Dell (1680x1050)], and at the same time avoid any Windows scaling issues. Plus it's slightly easier on the GPU than 4K, and games which don't support 21:9 can easily be run at 16:9 on the monitor, albeit with bars either side, still giving the same screen size as a 27" monitor.
 
What are these Windows scaling issues? I noticed Windows 8.1 lets you upscale to 200% from the Win 8 maximum of 150%. Doing this certainly made quite a difference to text and icon size in 4K; even though my eyes aren't the greatest, I found 200% a tad too big and 150% just right. I'm sitting 2-3ft from my monitor and have no difficulty reading this text without my glasses.

If 1440p monitors were cheaper then maybe I would have bought one, but with 4K here now and not costing much more, if any, than 1440p, it seemed a no-brainer. I was worried about not being able to game at 4K but have been pleasantly surprised, to be honest. I also find my monitor looks great at 1080p, so I can game at that if need be until I can upgrade my graphics.

I can understand the extreme-fps shooter types not being too keen, and those who prefer 120/144Hz screens, as that will require some serious muscle. I do think the graphics card manufacturers have been slow to react to the emergence of 4K monitors. :)
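
To put rough numbers on those scaling percentages, here's a small sketch (a hypothetical helper; it just divides the native pixel count by the scale factor to show how big the desktop effectively feels):

```python
# Effective desktop "workspace" at a given Windows scaling factor.
def effective_workspace(width_px, height_px, scale_percent):
    s = scale_percent / 100
    return round(width_px / s), round(height_px / s)

for scale in (100, 150, 200):
    w, h = effective_workspace(3840, 2160, scale)
    print(f"UHD at {scale}%: UI sized as if on a {w}x{h} desktop")
# UHD at 100%: UI sized as if on a 3840x2160 desktop
# UHD at 150%: UI sized as if on a 2560x1440 desktop
# UHD at 200%: UI sized as if on a 1920x1080 desktop
```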
 
Thank you everyone for your posts, it's been a really interesting read and has certainly given me some things to think about.

Firstly, 4K gaming is definitely out of the window. Graphics cards just aren't up to it yet, so there's no point unless I spend an absolute fortune, which isn't an option.

If I was going to use 4K for anything it would be for the photo and video work I do; scaling etc. does sound like a big problem... maybe that'll be nullified by a bigger solution in the Philips 40" monitor that is being talked about in another thread! :D

In the meantime I'm learning a heck of a lot about monitors that I didn't know a week ago... UHD, the old cheat of effectively running two panels in one screen, 30Hz refresh rates... the limitations of HDMI.

I've only been out of the monitor tech loop for a few years (I've been perfectly happy with a Dell 2709); it's amazing how technology has moved on in that time!
 