Looking for an Ultrawide 4K Monitor. Is it even a thing?

And you’re the one with no evidence to back up your position? You’re out of the discussion, because you have no evidence to back up your position.
And you're out of the discussion because your evidence has been "hurp durp science" (with no references), whilst simultaneously dismissing the actual experiences that numerous different people have posted.

A simple test for you would be to connect a TV to your PC and run it at 24Hz. Scroll a web page up and down, or even just move your mouse cursor; if you can't see a difference between that and 60Hz then there is something wrong with your eyes or perception.
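To put some rough numbers on why that jump is visible, here's a quick back-of-the-envelope sketch in plain Python (the 1000 px/s movement speed is just an assumed, illustrative figure, not a measurement):

```python
# Rough sketch: how far a cursor or scrolling page jumps between
# consecutive refreshes at various refresh rates.
speed_px_per_s = 1000  # assumed movement speed, purely illustrative

for hz in (24, 60, 120, 175):
    frame_time_ms = 1000 / hz       # time between refreshes
    step_px = speed_px_per_s / hz   # distance covered between refreshes
    print(f"{hz:>3} Hz: {frame_time_ms:5.1f} ms/frame, ~{step_px:5.1f} px jump per refresh")
```

At 24Hz the cursor lands roughly 42 px away from where it was on the previous refresh, versus about 17 px at 60Hz and under 6 px at 175Hz, which is why the low rate reads as a series of discrete jumps.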
 
Not at all, but if there really was this incredible improvement, why is the cinema generally still 24Hz?

If all we can perceive is 24hz, why is motion interpolation/smooth motion a thing on TVs, and why is it noticeable?

I will post up a few links to motion/blur studies.

Still waiting...

I'll happily do a blind test on my monitor if you want to come and randomly switch it between 60/175hz. If I don't get 10/10 right then I'll pay your travel costs
 
And you're out of the discussion because your evidence has been "hurp durp science" (with no references), whilst simultaneously dismissing the actual experiences that numerous different people have posted.

A simple test for you would be to connect a TV to your PC and run it at 24Hz. Scroll a web page up and down, or even just move your mouse cursor; if you can't see a difference between that and 60Hz then there is something wrong with your eyes or perception.
Actually, I’ve been researching your questions, which I was happy to admit I was unable to answer.

60Hz is used on TVs and monitors because a 24Hz movie shows a complete image in front of the projector (even a digital one), and at 24Hz you see continuous motion. A TV or monitor, on the other hand, has to draw the image one line at a time, and 60Hz is what it takes for the complete image to be drawn to give you the equivalent of 24 complete frames per second, so 60Hz on a TV or monitor IS the same as a 24Hz movie projector.

I'm 100% not into “hurp durp science”, and if you Google “Persistence of Vision” you'll get the start of why movies work at 24Hz. I've also been trying to find some papers that aren't behind Elsevier's paywall, which I obviously can't share unless you happen to be a student (I'm a Reader at Leeds). The issue most researchers are working on is whether the image persists in your vision because of the physiology of the eye or because your brain cannot keep up with the images being processed. 24-30 frames per second isn't contested, though, and if you Google ‘why are movies shot at 24 frames per second’ you'll find plenty of evidence.

At the same time I've been looking at what the best graphics card I have (a 3080) will produce in terms of frames per second vs. the refresh rate of the monitor. Let's say it produces 60 frames per second: with a 60Hz monitor the two would not be synchronised, because if 60Hz equates to 24 frames per second you would need to be refreshing at 180Hz to show 60 frames per second, which I'm still not convinced you could actually see because of persistence of vision and the whole 24Hz = smooth motion thing.

So just because you’ve given up trying to work out if it matters, I haven’t. And why would I tell you any of that if I wasn’t genuinely interested in trying to figure out whether ultra-high refresh rates were useful or not?
 
If all we can perceive is 24hz, why is motion interpolation/smooth motion a thing on TVs, and why is it noticeable?



Still waiting...

I'll happily do a blind test on my monitor if you want to come and randomly switch it between 60/175hz. If I don't get 10/10 right then I'll pay your travel costs
Unlike some, who will happily hurl self-righteous insults about, I actually want to know why, so I've been researching it. The difficulty with linking scientific papers is that most journals are behind paywalls, so it's not feasible, but if you look at my post above you will read why 60Hz on a monitor is the same as 24Hz at the movie theatre.

That still doesn't explain why you perceive a smoother image at 120Hz, which by that reckoning is basically 48 frames per second and something you shouldn't be able to see.

And just showing you the one monitor wouldn't be statistically significant. We'd need three monitors, one running at the test frequency and two running at another frequency, and you'd need to pick out the 'right' monitor at least 9 times out of 10 to give anything like a statistically interesting result.
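For what it's worth, the odds of fluking that kind of test are easy to put a number on. A minimal sketch in plain Python, assuming 10 trials and a 1-in-3 chance of guessing the 'right' monitor each time:

```python
from math import comb

p = 1 / 3   # chance of guessing the 'right' one of three monitors
n = 10      # number of trials

def prob_at_least(k: int) -> float:
    """Binomial tail: probability of at least k correct guesses in n trials."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"P(at least 9/10 by chance) = {prob_at_least(9):.6f}")   # ~0.000356
print(f"P(exactly 10/10 by chance) = {prob_at_least(10):.6f}")  # ~0.000017
```

So 9 or 10 correct picks out of 10 would be vanishingly unlikely on luck alone.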
 
60Hz is used on TVs and monitors because a 24Hz movie shows a complete image in front of the projector (even a digital one), and at 24Hz you see continuous motion. A TV or monitor, on the other hand, has to draw the image one line at a time, and 60Hz is what it takes for the complete image to be drawn to give you the equivalent of 24 complete frames per second, so 60Hz on a TV or monitor IS the same as a 24Hz movie projector.
60hz on TV is not the same as 24hz cinema at all. Watch EastEnders and compare the fluidity of the motion to anything you experience in the cinema. "Soap opera effect" is an actual thing
 
You asked why 60Hz and that’s the explanation. Sorry if you don’t like the answer.
 
And just showing you the one monitor wouldn't be statistically significant. We'd need three monitors, one running at the test frequency and two running at another frequency, and you'd need to pick out the 'right' monitor at least 9 times out of 10 to give anything like a statistically interesting result.

If you want to buy me another 2 Alienware monitors then I'm in :)
 
So given that your eyes/brain are fooled into thinking something is moving at 20fps (20Hz), which is why movies are filmed at 24Hz, and most graphics cards can't do more than 90fps at 2160p, how do you explain the benefit while scrolling text?

I'm not convinced a human can make use of a high refresh rate monitor, simply because the eye/brain can't keep up.

Consider your reaction time. 0.2s is considered pretty decent. That’s 5 times per second or 5Hz.
Reaction time has nothing to do with the ability to perceive something.

Human vision can pick up high-contrast content changes at up to hundreds of Hz.
That's why monitors with PWM backlight control are bad for comfort unless the frequency is at the kHz level.

And guess how long a photographic flash lasts? About 2ms at the longest, and normally 1ms to 0.1ms.
Yet we see those just fine, annoyingly so.
And a typical stroke of lightning is just one tenth of that.
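To put those scales side by side, here's a minimal sketch in plain Python comparing frame durations at common refresh rates with a roughly 1 ms flash of the kind mentioned above (the 1 ms figure is just the ballpark quoted, not a measurement):

```python
flash_ms = 1.0  # ballpark photographic flash duration quoted above

for hz in (24, 60, 120, 175):
    frame_ms = 1000 / hz  # how long a single frame stays on screen
    print(f"{hz:>3} Hz frame lasts {frame_ms:5.1f} ms "
          f"(~{frame_ms / flash_ms:.0f}x longer than a {flash_ms} ms flash)")
```

A 24Hz frame sits on screen roughly 40 times longer than a flash the eye has no trouble registering.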


And movies are still filmed at slideshow speed, because the decisions are made by people with better-embalmed brains than Egyptian mummies.
The only reason for 24Hz was a compromise to minimize the cost of expensive film stock while still maintaining some kind of resemblance of motion.
If the people who had to make that compromise could see current technology still sticking to that archaic-era limit, they would no doubt be mad at those insisting on keeping 24fps.
 
There is a VAST amount of scientific literature regarding motion, blur and how your eyes and brain process data. And none of it would suggest any benefit beyond about 30Hz.
That's Trump and Putin level BS.

To handle motion reasonably, especially for sports, TV broadcasting standards were deliberately made with image rates significantly above Hollywood's slideshow.
The 50/60Hz difference simply came from the need to have the image rate synced with the mains electricity, because otherwise artificial lighting would have caused flickering of the image.
Just like how (American-standard) 60Hz CRT monitors visible in PAL TV broadcasts showed a flickering image.
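Roughly speaking (ignoring harmonics and camera shutter details), the flicker you see on camera is just the beat between the two rates. A minimal sketch in plain Python:

```python
def beat_hz(refresh_hz: float, capture_hz: float) -> float:
    """Difference frequency between an unsynced display and the camera/field rate."""
    return abs(refresh_hz - capture_hz)

print(beat_hz(60, 50))  # 60Hz CRT under a 50Hz PAL field rate -> ~10 Hz visible flicker
print(beat_hz(50, 50))  # synced to the local field rate -> 0, no beat
```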

And those 60Hz CRTs were plain horrible monitors, because of visible flickering on anything above "postcard" size.
And the flicker of even those small screens was easy to see if the monitor was outside the center of your vision:
the center of the eye's FOV is optimized for spatial resolution, but moving away from it the balance shifts toward higher temporal resolution.

Really, 75Hz was the minimum refresh rate for a CRT monitor not to be uncomfortable because of flicker, and I ran my last 19" CRT at 85Hz.
LCDs, on the other hand, hold the image static until the next refresh, avoiding brightness flicker, and hence manufacturers started out by sticking to the old 60Hz
(to lower signal bandwidth and processing power needs, and the liquid crystals would have been too slow for anything faster anyway).


I don’t know about 48Hz movies, but you never really hear anyone complain that the cinema is jerky or blurry, do you?

Given that most people work on 30-60Hz monitors and have no complaints.
Because criticism is always silenced by "the Earth is the center of the universe" Church Inquisition types like you, defending outdated technological compromises!

I've always said 24Hz belongs in a museum of medieval history.
It's simply completely incapable of showing anything faster than crawling motion across a bigger portion of the image.
Watching the opening scene of The Day After Tomorrow in the movie theater was a headache-inducing experience because of the extremely jerky motion.
And a drunkard's-vomit amount of blur is hardly any better.

There have been no below-60Hz monitors in the modern home PC era!
Which just shows your ignorance of the facts.
 
All I can tell you is that over the last 15 years, going from 60Hz to 120Hz was a revelation, one I used for 8-10 years.

Whippersnapper! :) I've seen it. Twice over. Sit down and let grandpa Quartz ramble. :)

I started graphical gaming in the (very) late 70s on Apple computers on CRT monitors and I saw the improvement in resolution and refresh rates there. I was one of those sensitive to 60 Hz CRT refresh rates so went for 72 Hz ASAP and always spent heavily on monitors, eventually getting the 22" Iiyama CRT. 100 Hz on that was gorgeous. Like glass, I recall a reporter saying. I have a vague recollection of running VGA at 200 Hz, BICBW.

Flat panel displays are completely different. Refresh rates on LCDs are very different to the eye and I very quickly dropped my 22" CRT for a pair of 1280x1024 60 Hz LCDs. I've since seen refresh rates and resolutions climb and, having broken my 100 Hz 1440p UW, I now game at 4K 120 Hz. I cannot notice the difference between 100 Hz and 120 Hz on a flat panel display. And older games will easily break 120 fps [1]. And Command & Conquer 3 was locked at 30 fps and was totally smooth. I do notice the difference between 60 Hz and 120 Hz in many genres, although I could probably only tell you that one had a refresh rate higher than the other.

The thing is, I'm old. I'm not a teenage gaming god. Nor am I a professional gamer. My eyes are not what they were. High refresh rates aren't for everyone. So some tolerance, please.

This too might interest you:


[1] I'm principally thinking of the original Far Cry which I still play but also games like Freespace 2, released in 1999 and still going strong, though if you mod Freespace 2 to the max you may get sub 30 fps on a 4090 in the heaviest scenes; there's only so much a single-threaded game can do.
 
Currently, I have an Acer Predator X34 that I bought circa 2016. It's served me well, still does in some respects, but I've been looking to upgrade for a couple of years, ever since I bought my 3090 graphics card.

Limitations of this monitor are:
-Limited to 100Hz refresh
-Not 4K
-Panel tech maybe dated
-Not HDR
-Power LED failing (maybe linked to a bigger issue)


I would feel like if I upgraded, I should go 4K. Not just for gaming, but also watching Netflix 4K and movies. However, I love the Ultrawide aspect, I probably would miss it if I reverted to a normal aspect ratio.

Is there any 4K (at least) ultrawide monitor out there that can satisfy my urge? Should I wait for upcoming tech? The top end of my budget would be about £2500.

Cheers.

There are no ultrawide 4K monitors on the market, but there is one coming later this year called the Samsung GN95C.
 
Because the term 4K ultrawide doesn't make any sense :confused:

4K literally means ~4000 pixels across, but is typically used for 3840x2160, a 16:9 aspect ratio.

Ultrawide is also a horrible term in itself, in that manufacturers variously describe lots of different resolutions as ultrawide:

2560x1080 (2.76 MP)
3840x1080 (4.1 MP)
3440x1440 (4.95 MP)
3840x1600 (6.1 MP)
5120x1440 (7.37 MP)
5120x2160 (11.06 MP)

You'll need to be more specific about what resolution or aspect ratio you are actually looking for


If you want the same vertical resolution as typical 4K screens (2160), but better horizontal resolution, then you'd be looking at something with 5120x2160 (variously known as UW5K or WUHD)
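If it helps, the megapixel counts and shapes above are easy to sanity check. A minimal sketch in plain Python (the resolution list is just the one quoted above):

```python
# Sanity-check the megapixel counts and aspect ratios quoted above.
resolutions = [
    (2560, 1080), (3840, 1080), (3440, 1440),
    (3840, 1600), (5120, 1440), (5120, 2160),
]

for w, h in resolutions:
    megapixels = w * h / 1_000_000
    aspect = w / h
    print(f"{w}x{h}: {megapixels:.2f} MP, aspect ratio {aspect:.2f}:1")
```

The 3840x1080 and 5120x1440 entries come out at roughly 3.56:1 (the 32:9 "super ultrawides"), while the rest sit around 2.37-2.40:1, which is how very differently shaped panels all end up under the one "ultrawide" label.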

This guy gets it. No point in looking for a “4K ultrawide” if such a thing isn't really possible, given our understanding of the terms.
 
This guy gets it. No point in looking for a “4K ultrawide” if such a thing isn't really possible, given our understanding of the terms.

Personally I'd use the term “4k ultrawide” to refer to a 21:9 monitor with a vertical resolution of 2160 pixels, hence one capable of displaying 4K content without the need for scaling.
 
Personally I'd use the term “4k ultrawide” to refer to a 21:9 monitor with a vertical resolution of 2160 pixels, hence one capable of displaying 4K content without the need for scaling.
It's a reasonable interpretation, but not a universally accepted one.
 
It's a reasonable interpretation, but not a universally accepted one.

It does logically follow the naming conventions of other existing ultrawide resolutions:

1080p ultrawide = 2560x1080
1440p ultrawide = 3440x1440
4k ultrawide = 5120x2160

However, given there isn't even a universally accepted definition of 4K, I think the chances of getting a universally accepted definition of 4K ultrawide are about the same as me winning the lottery! (I don't play the lottery...)
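As a rough illustration of why that convention lines up (plain Python; 64:27 is the exact ratio usually behind the marketing term "21:9"):

```python
# Width of a nominal "21:9" (really 64:27) panel for each common vertical resolution.
for vertical in (1080, 1440, 2160):
    width = vertical * 64 / 27
    print(f"{vertical}p at 64:27 -> {width:.0f} x {vertical}")
```

2560x1080 and 5120x2160 fall straight out of a 64:27 shape, while 3440x1440 is actually the slightly wider 43:18, which is yet another reason the naming never quite adds up.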
 