Gaming at 8k?

Linus Tech Tips have a video here:


I think it's a very poor video because it only looks at fast-paced FPS games - Halo, Doom, and CS - and doesn't look at any other genre. Granted, DLSS is trained on 16k reference images, but the game still has to support it. Various commenters also bring up the lack of 8k textures, adjusting the FOV, increased distance vision and more.
 
The thing most people will gloss over is the actual cost of the displays: that LG they have for this video was 30k on its own, and they only have it because LG sent it to them (it's a throwaway line in the video).
 
So Nicole raised a good point that got me thinking: how would their perception change if the content/games were designed for 8K?
A lot of films aren’t even captured in 4K. Games have to compromise texture resolution due to hardware limitation so arguably they aren’t even taking full advantage of 4K.

Fast forward 10-15 years down the line, when 4k is finally truly mainstream and 8k is becoming popular, and more content is designed for consumption at 4K minimum with some bells and whistles for the 8k viewer - would the results of this experiment change?

I know he mentioned the minimum viewing distance to distinguish between the individual pixels; however, I can't help but feel this is focussing on the wrong thing and that the picture as a whole is more important.

I guess a way to test this is to take a 1440p screen and play content that is 1080, 1440 and 4K and see if people can tell the difference at the distance where a person cannot see the individual pixels.
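As a rough illustration of that viewing-distance point, here's a quick back-of-the-envelope sketch. The ~1 arcminute acuity rule of thumb and the 27" 16:9 panel size are my own assumptions for the sake of example, not figures from the video:

```python
import math

# Quick sketch: distance beyond which one pixel subtends less than ~1 arcminute
# (a common rule of thumb for 20/20 acuity). The 27" 16:9 panel and the acuity
# figure are assumptions for illustration, not numbers from the video.
ACUITY_RAD = math.radians(1 / 60)  # 1 arcminute in radians

def min_distance_m(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance (metres) past which individual pixels stop being resolvable."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    pixel_pitch_m = (width_in / horizontal_px) * 0.0254      # inches -> metres
    return pixel_pitch_m / math.tan(ACUITY_RAD)

for name, px in [("1080p", 1920), ("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: pixels blend together beyond roughly {min_distance_m(27, px):.2f} m")
```

On those assumed figures, a 27" panel at a typical desk distance sits right around the 1440p/4K crossover, which is more or less the test described above.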
 
Most films are filmed in at least 4k. If you're talking about actual 35mm film then that can usually produce 4k footage once scanned without digital upscaling.

As for the 4k gaming test, they sort of test that here:


1080 vs 1440 vs 4k.

Some people could tell the difference between the resolutions, but most didn't really care. It seems a higher refresh rate was preferable to a higher resolution, but it was a fast-paced FPS. I can imagine 4K being better with an RTS.
 
Most films are filmed in at least 4k. If you're talking about actual 35mm film then that can usually produce 4k footage once scanned without digital upscaling.
I'll watch the video you linked later, but regarding this statement: yes, there are a few directors who are still using film to capture movies, but from what I remember most have been using digital for a while now, specifically ARRI cameras. And ARRI only recently (in the past 2 years or so) released a 4K sensor.
 
I can imagine 4K being better with an RTS.

It's immediately of use in any game where distance is a factor. Things that are distant may not be drawn on a 1080p monitor but be drawn - and thus visible - on a 4k monitor. Then there is the level of detail: take Cities: Skylines, where small objects are not drawn on a 1080p monitor but are on a 4k monitor.
 
The time will come when people are on 8K. It won't be anytime soon, though.

I got my 4K monitor in 2014 and, apart from a few people, everyone would say it was a waste and there was no difference in image quality, blah blah. Yet over the past couple of years people have been changing their tune. There was a huge difference to my eyes between 1440p and 4K. The only way one does not see it is if they sit far away from their monitor (apparently quite a few do) or they have something in common with Stevie Wonder :p

I recently dropped from 4K to 1440p ultrawide and could immediately see the difference. However having QD-OLED and proper HDR more than made up for that.
 
Just a few years ago, tech media: 4k gaming is stupid

2022 tech media: Graphics cards are too fast, don't upgrade unless you play games at 4k
 
I mean, I already came to the logical conclusion that most people are not benefitting from 4K unless they're playing games, because of the real-time benefit to rendered graphics; 8K is just completely stupid, end of.
In all fairness we need software to truly push graphics, but we are mostly on console ports dragging the Xbox One and PS4 with them, so we can't have a true next-gen game.
 
144 Hz 1440p OLED (e.g. the Alienware ultrawide) > 8K 60 Hz LCD for me

Some people get caught up in numbers, but once you get good enough, it doesn't really matter.


For example, I'm not sure I'd be fussed about 8K over 4K for films. Nor would I care for a 2000 nit OLED over a 1000 nit one, or 240 Hz over 144 Hz.
 
8k won't be a real thing for ages because there is nothing really running at 8k. I've played at 8k for a brief time with the 4090 on an 8k LG for a laugh, same with the PS5 and Series X. I did not see it as worth it, since a lot of games barely run native, and even with the current GPU generation all it does is push the limits and keep everything super hot. When you are on a gaming monitor most people are close at a desk, so there is just no need for it; even 1080p games look good at that range.
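To put rough numbers on the "push the limits" point, a purely illustrative sketch of raw pixel counts (frame time obviously doesn't scale perfectly linearly with pixels, but shading work broadly tracks them):

```python
# Back-of-the-envelope pixel counts: 8k is 4x the pixels of 4k and 16x 1080p,
# which is the main reason it hammers even a 4090. Illustrative only.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.1f}x the pixels of 1080p")
```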
 
1440p is good enough that I am not bothered about getting a better resolution anymore.

The thing I mostly want now is well-priced OLED monitors in smaller sizes, not 4K or 8K gaming.
 