
4K gaming likely to be doable in 2013?

You make a fair point if you're sitting on your couch a fair few feet away. I, however, play about two feet away at best from my 27" monitor (2560x1440), so all the higher-res juice I can get in the future will be more than welcome. I'll end up getting a 4K monitor/TV once I've seen one in the flesh and seen how it weighs up against what I'm running. I also have no idea what the resolution would do to sports games... would it just show the whole stadium in one image?

You've got 108 ppi now on a 27-inch screen.

A 4K display at, say, 36 inches would be about 128 ppi, as well as 78% larger.
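Those ppi figures can be sanity-checked with a quick calculation. One caveat: ~128 ppi at 36 inches matches the wider 4096x2160 "DCI 4K" resolution; the 3840-wide UHD figure comes out a bit lower.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(int(ppi(2560, 1440, 27)))   # 108 -> the 27" 1440p monitor above
print(int(ppi(4096, 2160, 36)))   # 128 -> 36" DCI 4K
print(int(ppi(3840, 2160, 36)))   # 122 -> 36" UHD 4K
```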
 
2560x1440 needs a pair of 7950/70s to run the latest games at max with some to spare; 4K is just over double the pixel count, so quadfire should cover it :P
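The "just over double the pixel count" claim is easy to verify (bearing in mind that GPU load doesn't scale exactly linearly with pixel count):

```python
res_1440p = 2560 * 1440      # 3,686,400 pixels
res_4k    = 3840 * 2160      # 8,294,400 pixels
print(res_4k / res_1440p)    # 2.25, i.e. "just over double"
```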

An increase in resolution doesn't linearly increase the power requirement to push that number of pixels.

You're misunderstanding that.

Where it says 120Hz, this does not mean it can do 120fps.

120Hz is actually really low for a modern screen; most new decent LCDs are running at around 800Hz, and plasmas at 3000Hz.

That's not really how it works.

The frequency the panel works at is directly related to the number of FPS visible on a display.

Hertz are basically cycles per second, so a display panel that is capable of 120Hz is actually capable of 120FPS.

What you're misunderstanding is that these LCDs and plasmas are interpolating: basically faking the effect that a much higher Hz would have on screen.

The marketing term for some displays is motion compensation. It's basically an algorithm that attempts to generate fake frames to simulate the effect of more FPS. But it's certainly not a display that is *really* running at the Hz you've suggested, despite you saying I'm misunderstanding. :p
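The Hz-to-frame-time relationship described above is just arithmetic; a minimal sketch:

```python
def frame_time_ms(refresh_hz):
    """Duration of one refresh cycle in milliseconds (Hz = cycles per second)."""
    return 1000.0 / refresh_hz

print(frame_time_ms(120))  # ~8.33 ms: a true 120Hz panel can show 120 distinct frames per second
print(frame_time_ms(60))   # ~16.67 ms
```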
 
It is both: the receiver on the TV is HDMI 1.4, so it will only support 30Hz input. Though the TV can interpolate to 120Hz, this would cause severe input lag in gaming.

DisplayPort 1.2 can output at 60Hz, but this TV can only receive at 30Hz. We need TVs that have HDMI 2.0, or monitors (like the Sharp one) that have DP 1.2.

Is this confirmed anywhere, or just speculated? Because at present, there aren't any devices with HDMI 2.0 outputs.

Considering HDMI 2.0 should already be on devices in production, it'd make sense that they equipped this TV with HDMI 2.0, since the spec itself has been done for a little while.
 
The spec listing for all 4K TVs that have been announced says HDMI 1.4 only; pcper tested one and it only works at 30Hz unless you can work out a way to use multiple cables. HDMI 1.4 does not support 4K @ 60Hz.

All the news items mention HDMI 2.0 as being needed for 60Hz.

Several of the current 4K devices support 2 or 4x HDMI cables from source to display and combining them to get a picture, but getting that to actually work with a PC for gaming is somewhat problematic.

That's why it will be interesting to see, when they do release HDMI 2.0 displays, whether graphics card manufacturers or someone else release an active DP 1.2 to HDMI 2.0 adapter to get it working, instead of us having to buy more expensive PC 4K monitors with DP 1.2.
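The 30Hz cap follows from simple bandwidth arithmetic. A rough sketch, counting raw pixel data only and ignoring blanking-interval overhead; the 8.16 Gbit/s figure is HDMI 1.4's effective video rate after 8b/10b encoding on its 10.2 Gbit/s link:

```python
def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel-data rate in Gbit/s (ignores blanking intervals)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4_EFFECTIVE = 8.16  # Gbit/s of video payload on HDMI 1.4

for hz in (60, 30):
    rate = video_gbps(3840, 2160, hz)
    print(hz, round(rate, 2), rate <= HDMI_1_4_EFFECTIVE)
# 60Hz needs ~11.94 Gbit/s (too much); 30Hz needs ~5.97 Gbit/s (fits)
```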
 
I totally understand this; you were the one who said the TV you linked to was capable of 120Hz.
 
Yeah, I thought as much, but it just makes sense when you have a 2-3 monitor setup; for example, FIFA shows more of the game from one side of the pitch to the other. The fact a 4K TV will likely do the same will be a little unnatural at first... imagine that, though: one screen showing the entire pitch :cool: That's the exact type of experience I'm willing to pay through the nose for, unless I have no idea what I'm talking about?
 
Yeah, but what's new with DP and HDMI? There is no spec for what we used to enjoy ten years ago on a CRT. It really blows my mind how stagnant picture quality has become. Sure, they use less power and are flatter, but even the old cheap CRTs wipe the floor with current 1440p displays. And who the hell wants 30Hz? 24Hz on Blu-ray has enough judder as it is, so that's not enough for the 48p Hobbit at 4K res.

Again, this is not the TV but the HDMI spec. It's been useless since it was invented. Hell, it cannot even do 2560x1440 @ 120Hz, which is what all those Yamakasi people are running currently. And they wonder why their market shares and revenues are falling. The whole thing is an outdated shambles in the name of money and royalty payments.

Just get a new future-proof spec able to do 120Hz up to 4K and let it be for a few years. Instead they will probably drip-feed HDMI 2.0 and then 3.0. And that's not even taking into account how many years we will have to wait to get CRT display quality back into our living rooms. OLED has sample-and-hold motion blur, so OLED is not going to match CRT. What will? And will it be here before 2020? We are talking about a 20-year period here; it's not been just 5 years or so since CRT died, which simply blows my mind.
 
The first manufacturer that comes out with a 120Hz 4K display and sells it will dominate.
 
I think 4K is overkill; on a 55" screen, 2560x1440 or even 1920x1080 is perfectly good.

Unless you're sitting 6 inches away from it...

Yes, but also, it's not the resolution; it's the low quality of the current graphics modelling that makes it a waste of time.

i.e. a coffee mug / ashtray / table and chairs will still look rubbish in 4K OLED, just as they look rubbish right now!

Games aren't ready for 4K, because only a game with three times the quality/detailing of Crysis 2 will look any good, and you won't see that for years yet!

But a higher resolution looks better on a bigger screen... as for 4K TV, not in your lifetime... never!
 
That's why you get a 4K monitor and not a TV, as a monitor will come with quad DVI input and not the poor HDMI rubbish. Also, with a TV, chances are 56"-60" models will come out first, so you're just getting a bigger screen as opposed to better pixel density. There are 30" 4K monitors being made, albeit a bit more expensive, but they will come down in time.

Personally I can't wait; I've had a 30" 1600p monitor for over 7 years now and I am very keen for the next step up.

Oh, and as for 4K gaming being doable in 2013: well, it's 2013 now and people with triple 30" screens are pushing 12 million pixels around, whereas 4K will only be 8 million pixels (roughly), so it's doable now and has been for a while.
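The triple-screen vs 4K pixel counts quoted above check out:

```python
triple_30in = 3 * 2560 * 1600   # three 30" 1600p panels
uhd_4k      = 3840 * 2160
print(triple_30in)              # 12,288,000 -> "12 million pixels"
print(uhd_4k)                   # 8,294,400  -> "8 million pixels (roughly)"
```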
 
HDMI is fit for purpose - TVs... as you mention, Blu-rays are at 24Hz, so HDMI 1.4 never needed to do 60Hz at this res.

DisplayPort 1.2 has been out for quite a while and already supports 4K @ 60Hz, but so far the lower-cost 4K displays haven't supported it.

Quad DVI doesn't really work for gaming either; we need DP 1.2 devices, which are due out now-ish.

I want 4K to have both a slightly better DPI and a much bigger screen... I want immersion... 7680x1440 is OK, but I'd prefer something about 40" in 4K for gaming.

It isn't doable now, in as much as sub-£20,000 displays for gaming are not yet available, but they should be this year... having the graphics cards and no display is not what I would call fun :D
 
No m8, 4K refers to the resolution 3840 × 2160 pixels.
There is also 8K being produced, which is a resolution of 7680 × 4320.
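For comparison, the jump from 4K to 8K is another 4x in pixel count:

```python
uhd_4k = 3840 * 2160     # 8,294,400 pixels
uhd_8k = 7680 * 4320     # 33,177,600 pixels
print(uhd_8k // uhd_4k)  # 4 -> 8K has four times the pixels of 4K
```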
 
OK, thanks for clearing that up. But is that an industry standard? Are we going to get console games that do 4K, or are we getting tricked again like with 1080p?
 
Size is the problem. I don't want a monitor larger than 30".

Now, if they can release a reasonably priced 3840x2400 monitor (16:10) of that size in a few years, I'd be all over it.

There are many "4K" resolutions. It's becoming even more ridiculous than it was with Full HD/HD Ready TVs, since 1080p at least indicated the supposed vertical resolution of the content/display. But then again, a 2.35:1 movie has an effective resolution of 1920x817, with black bars to fill in the blanks. And that's considered Full HD too.

There were "HD Ready" plasma TVs that were 1024x768 due to the way the display worked. Definitely fewer real pixels than an HD resolution would indicate (1280x720), and yet they were considered HD displays. Most laptops are 1366x768. Multiple console games are upscaled from sub-HD resolutions.

Content is the problem, IMHO. The "industry" standard for 4K Ultra High-Definition (which doesn't even indicate the vertical resolution) is 3840 × 2160. And yet I have no doubt this will vary depending on content anyway.
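The 1920x817 figure above is just the 1080p frame width divided by the 2.35:1 film aspect ratio; a quick sketch:

```python
def letterboxed_height(frame_width, film_aspect):
    """Active picture height when a wide film is letterboxed into a 16:9 frame."""
    return round(frame_width / film_aspect)

print(letterboxed_height(1920, 2.35))  # 817 -> the "1920x817" effective resolution
```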
 
4K: overrated for anything but video content that is actually FILMED in 4K.

Waste of time otherwise.

Every film from the last 80 years could easily provide 4K or higher.
35mm goes well over 4K, and 70mm goes well over 8K, even up to 12K in some cases.
The big problem is that we need someone to digitize it all.
 