Looking for an Ultrawide 4K Monitor. Is It Even a Thing?

I'd had the same monitor as you since late 2015 and finally upgraded last year for much the same reasons you're looking now.

I spent a long time looking at the ultrawide options, but in the end I went for the Asus PG42UQ 42" 4K OLED and I couldn't be happier.

The way I look at it, I now have a screen that's slightly wider than the UW, so I don't feel I've lost anything from the view I was so used to, but I've gained some extra height into the bargain.
 
I think the physical dimensions should be factored in as well as resolution in these discussions. I have two 2560x1440 monitors, but one is 27" and the other 32", and it makes quite a difference.
 
Yeah, the only new monitor on the way that I think covers it all is the 57" Neo G9.


It is:
  • 7680x2160, so basically two 4K monitors stuck together to make an ultrawide
  • 32:9 aspect ratio
  • 240Hz
  • 1000R curve
  • Mini-LED
  • HDMI 2.1, because you need the bandwidth to run that resolution and refresh rate together (so technically only AMD GPUs support that, but they are not powerful enough unless you're playing racing games, COD/CS:GO etc.; the 4090 likely stretches its legs, but realistically you're still only going to get around 90FPS in most newer games at that resolution). See the rough bandwidth sums after the list.
  • Expect it to be more than £2.5k
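
If you want to sanity-check the bandwidth point, here are some rough back-of-envelope sums in Python. Treat it as a sketch: raw_gbps is just a throwaway helper, the HDMI 2.1 payload figure is the commonly quoted effective rate rather than something I've verified against the spec, and blanking overhead is ignored.

```python
# Rough uncompressed data rate for 7680x2160 @ 240Hz with 10-bit colour.
# Ignores blanking overhead; link payload is the commonly quoted figure.

def raw_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Uncompressed pixel data rate in Gbit/s (10-bit RGB = 30 bpp)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

neo_g9 = raw_gbps(7680, 2160, 240)   # ~119 Gbit/s raw
hdmi21 = 42.7                        # HDMI 2.1 FRL effective payload, Gbit/s
print(f"Raw pixel rate: {neo_g9:.0f} Gbit/s")
print(f"That's ~{neo_g9 / hdmi21:.1f}x what HDMI 2.1 can carry, hence DSC")
```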

So the reason there aren't huge numbers of these monitors is that we don't have the GPU grunt to really use them yet.
 
I had an Acer Predator X34 and loved the aspect ratio. I moved to an LG 34" ultrawide with a 60Hz IPS panel and the colors were so much better; it was not even close, and it made me realise how bad the Acer actually was. The X34 was a very early panel and the tech has really moved on.

I also wanted to get a 5Kx2K 40" gaming panel, but the only options available are the 60Hz IPS LG panels and I wanted to go OLED, so I got the Alienware AW3423DWF. This is a really nice panel and it is perfect in every way for me. I was also worried about the text clarity, but I just do not understand wtf they were going on about, because it is just great. I do have old eyes though, so young whippersnappers may be able to tell the difference, but it is really crisp text imho.

There are no 5Kx2K OLEDs coming to the market this year, or none have been announced anyway, and there aren't even rumours of them. What will be available is the Neo G9, which is Mini-LED and so inferior tech imho, or the 45" 3440x1440 LG OLEDs; Corsair, LG and Philips are all bringing out monitors with this panel iirc. I would wait for reviews of the 45" LG OLED panels and then make your mind up. With it being 45", the pixel density will be significantly lower than the same res on the 34" Alienware (see the quick PPI sums below).
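
Quick PPI sums to show what I mean (ppi is just a throwaway helper, nothing official):

```python
# Pixel density for the same 3440x1440 resolution at different diagonals.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from the resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'34" 3440x1440: {ppi(3440, 1440, 34):.0f} PPI')  # ~110 PPI
print(f'45" 3440x1440: {ppi(3440, 1440, 45):.0f} PPI')  # ~83 PPI
```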

EDIT - I forgot the Odyssey OLED G9 from Samsung. It's going to be 5120x1440 iirc, so not the 2K vertical pixels I was after, which is why I ruled it out of my choices and went for the AW.
 
Don’t insult me, educate me. If you have something that explains why higher refresh rates are better then please link it and I’ll read it. Just calling me a ‘clown’ doesn’t help anyone. Just because a sponsored YouTuber or a marketing department tells you it’s great doesn’t make it great in real life.

To be honest, I think it's one of those things you need to actually see in person to understand. Just because you can't react faster than 10Hz doesn't mean it's not possible to perceive the difference.

I was skeptical at first, coming from a 60Hz monitor and then a 100Hz slow VA monitor, but going to the AW3423DW at 175Hz there's a huge & very noticeable difference between 60Hz and 175Hz. Obviously you get diminishing returns the higher you go (I've just tested quickly moving windows on the desktop, and above 120Hz the difference is minimal - it's there, but nowhere near as pronounced as dropping to 60Hz).
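
The diminishing returns drop straight out of the frame times, if you want numbers. A minimal sketch, using the refresh rates I happen to have tried:

```python
# Each jump in refresh rate buys a smaller absolute frame-time saving.
for lo, hi in [(60, 120), (120, 175), (175, 240)]:
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo} -> {hi}Hz: frame time drops by {saved_ms:.1f} ms")
```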
 
Don’t insult me, educate me. If you have something that explains why higher refresh rates are better then please link it and I’ll read it. Just calling me a ‘clown’ doesn’t help anyone. Just because a sponsored YouTuber or a marketing department tells you it’s great doesn’t make it great in real life.
Dude, get a grip. I do not need to link anything; if you want to learn or get "educated" then go DYOR, or buy a product and experience it for yourself. All I can tell you is that over the last 15 years, going from 60Hz to 120Hz was a revelation, one I used for 8-10 years. I then went to 240Hz and although the difference was not as "alarming", it was still a significant improvement and highly beneficial. Everything from gaming to scrolling and general browsing is much smoother and more fluid.

It does not take rocket science to understand why. The screen is being "refreshed" more often, meaning more "static" pictures are fitted into the same 1 second compared to a lower refresh rate, so your eyes perceive the motion as more fluid, natural and smooth. In gaming, it means the animations, movement, combat, projectiles and camera movement all look smoother. But do not mistake this for FPS and input latency. It is also worth noting that a higher refresh rate has meaningful benefits even for non-gaming use; as mentioned above, it improves your entire desktop experience.

But don't take my word for it, go out and buy one. Just don't sit here and tell people who have been experiencing these benefits for literally over a decade that they are wrong, simply because you don't know any better...
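
To put rough numbers on the "more static pictures per second" point, a quick sketch. The 3000 px/s is just an assumed speed for a fast flick, not a measurement:

```python
# Hold time per refresh, and how far a moving object "jumps" between
# frames at an assumed 3000 px/s flick speed (illustrative number only).
for hz in (60, 120, 144, 240):
    frame_ms = 1000 / hz   # each static picture is held this long
    jump_px = 3000 / hz    # gap between successive drawn positions
    print(f"{hz:>3}Hz: {frame_ms:4.1f} ms/frame, {jump_px:5.1f} px jump")
```

The smaller the jump between successive frames, the smoother the motion reads, and none of that has anything to do with how fast you can react.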
 
Dude, get a grip. I do not need to link anything; if you want to learn or get "educated" then go DYOR, or buy a product and experience it for yourself. All I can tell you is that over the last 15 years, going from 60Hz to 120Hz was a revelation, one I used for 8-10 years. I then went to 240Hz and although the difference was not as "alarming", it was still a significant improvement and highly beneficial. Everything from gaming to scrolling and general browsing is much smoother and more fluid.

It does not take rocket science to understand why. The screen is being "refreshed" more often, meaning more "static" pictures are fitted into the same 1 second compared to a lower refresh rate, so your eyes perceive the motion as more fluid, natural and smooth. In gaming, it means the animations, movement, combat, projectiles and camera movement all look smoother. But do not mistake this for FPS and input latency. It is also worth noting that a higher refresh rate has meaningful benefits even for non-gaming use; as mentioned above, it improves your entire desktop experience.

But don't take my word for it, go out and buy one. Just don't sit here and tell people who have been experiencing these benefits for literally over a decade that they are wrong, simply because you don't know any better...
Dude, you seem very angry that someone has challenged the 'belief'. I have DMOR and found nothing that supports what you say. You are just repeating the same marketing stuff the advertisements and paid people on YouTube say. Just show me something that demonstrates the human eye/brain can indeed process all those extra refreshes and I'll be quite happy. The marketing departments have come up with higher refresh rates as a way to sell you a new monitor. Being honest, if I blind tested you repeatedly over several sets of 3 monitors, one with a significantly higher refresh rate, do you believe you could pick out the one with the higher refresh rate?

Don't just believe what the marketing tells you. And make no mistake, most YouTubers are an extension of a company's marketing, which is why the best reviewers buy their samples instead of being sent them by the manufacturer. And even if someone does review something they purchased themselves, very often the fact that they bought it after reading the spec sheet will bias them.

There is a VAST amount of scientific literature regarding motion, blur and how your eyes and brain process data. And none of it would suggest any benefit beyond about 30Hz. And I know there are gamers who swear that a monitor with a 1ms response time gives them an advantage, but do your own research on hand/eye co-ordination: just under 100ms is basically as fast as a human can respond to a visual or auditory input, and that's just science, not your lived experience 'truth'. In your truth you are seeing clothes on a naked emperor. I'm just asking for proof of clothing.
 
Just show me something that demonstrates the human eye/brain can indeed process all those extra refreshes and I’ll be quite happy.
You'd have to use a monitor to be honest; you'll see it straight away.
The biggest difference is when there's fast motion on the screen, like flicking your mouse around in an FPS game; usually it causes a lot of blur, but at 144Hz it's reduced significantly. If I move my mouse (POV) around very quickly in an FPS game, I can tell the difference between 144Hz & 240Hz, but it's a very tiny difference.

To be honest, even just moving your mouse/windows around there's a huge & noticeable difference between 60Hz & 144Hz.
Every time I drag a window over to my 4K 60Hz monitors vs my 240Hz monitors, I can see the difference clear as day. It'd be nice to see 75Hz-90Hz as the standard for all monitors to be honest.

It's a lot more satisfying playing without any blurring, and you can spot enemies that you might otherwise miss in the jaggedness while moving around, so yes, it certainly does make a difference.
 
Can you see the difference in games between 60Hz and 75Hz? Let alone 144Hz plus?

There are no movies that run over 30Hz (most are 24-25Hz) and browsing won’t trouble either of those monitors.
On my work LG monitor there is absolutely a difference between 60 and 75Hz, both in games and outside them. The difference at that level isn't huge, but it's enough that running below 75Hz feels a bit more skittish; my 2560x1080 75Hz screen is smoother than my 4K 60Hz one, which I kind of regret getting instead of a 1440 ultrawide.

I loaded CSGO one day and it instantly looked and felt weird. Flicking felt more like skipping. I looked through various settings before realising it had dropped to 60Hz.

The "human eye" argument will go on forever but there is a huge difference in smoothness above the 100hz range against 60hz, primarily in faster paced environments, and I've never known anyone say there is no difference - on the contrary I know multiple people who blind tested themselves and reliably detected the high refresh rates.

Hell, even on my phone it's obvious when battery saver is on, because it forces the display to 60Hz...
 
Dude, you seem very angry that someone has challenged the 'belief'. I have DMOR and found nothing that supports what you say. You are just repeating the same marketing stuff the advertisements and paid people on YouTube say. Just show me something that demonstrates the human eye/brain can indeed process all those extra refreshes and I'll be quite happy. The marketing departments have come up with higher refresh rates as a way to sell you a new monitor. Being honest, if I blind tested you repeatedly over several sets of 3 monitors, one with a significantly higher refresh rate, do you believe you could pick out the one with the higher refresh rate?

Don't just believe what the marketing tells you. And make no mistake, most YouTubers are an extension of a company's marketing, which is why the best reviewers buy their samples instead of being sent them by the manufacturer. And even if someone does review something they purchased themselves, very often the fact that they bought it after reading the spec sheet will bias them.

There is a VAST amount of scientific literature regarding motion, blur and how your eyes and brain process data. And none of it would suggest any benefit beyond about 30Hz. And I know there are gamers who swear that a monitor with a 1ms response time gives them an advantage, but do your own research on hand/eye co-ordination: just under 100ms is basically as fast as a human can respond to a visual or auditory input, and that's just science, not your lived experience 'truth'. In your truth you are seeing clothes on a naked emperor. I'm just asking for proof of clothing.

My guess is that you've already made up your mind, and anything anyone provides that proves a difference is going to be instantly dismissed as "marketing", regardless of how scientific/independent it is.

You're also conflating image recognition and processing/response with perceived smoothness.

I'd like a link to one of these studies which show we can't see the difference any higher than 30Hz, because I imagine 90% of people on this forum would disagree :rolleyes:
 
We're way off topic here but I'll bite.

If there was no difference above 30Hz, then why did so many people complain about the "soap opera effect" with the high frame rate showings of The Hobbit and other movies? (Personally I thought it was great and loved the smoothness of 48fps cinema.)

Why is 60Hz the standard when manufacturers could save money on lesser-quality components that refresh less frequently?

As for other personal experience, I tried 4K monitors when they first came out - most graphics cards at the time didn't have a display connection that could drive them at 60Hz, so they ran at 30. Compared to other monitors in the office the difference was obvious; even with basic things like scrolling web pages, the limited refresh rate made things appear far jerkier.
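
That connection limit is easy to sanity-check with the same kind of back-of-envelope sums as earlier in the thread (the ~8.16 Gbit/s HDMI 1.4 payload is the commonly quoted figure, not one I've verified, and blanking overhead is ignored):

```python
# Raw 8-bit pixel rate for 4K vs the commonly quoted HDMI 1.4
# effective payload of ~8.16 Gbit/s (blanking overhead ignored).
def raw_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

print(f"4K @ 30Hz: {raw_gbps(3840, 2160, 30):.1f} Gbit/s (fits)")
print(f"4K @ 60Hz: {raw_gbps(3840, 2160, 60):.1f} Gbit/s (doesn't fit)")
```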

I've got several different monitors in our house: 60, 75, 144 and now a 165Hz. The difference between 60 and 75 is arguably more pronounced than 75 to 144, and the 144 to 165 jump is certainly imperceptible to my eyes.
 
You'd have to use a monitor to be honest; you'll see it straight away.
The biggest difference is when there's fast motion on the screen, like flicking your mouse around in an FPS game; usually it causes a lot of blur, but at 144Hz it's reduced significantly. If I move my mouse (POV) around very quickly in an FPS game, I can tell the difference between 144Hz & 240Hz, but it's a very tiny difference.

To be honest, even just moving your mouse/windows around there's a huge & noticeable difference between 60Hz & 144Hz.
Every time I drag a window over to my 4K 60Hz monitors vs my 240Hz monitors, I can see the difference clear as day. It'd be nice to see 75Hz-90Hz as the standard for all monitors to be honest.

It's a lot more satisfying playing without any blurring, and you can spot enemies that you might otherwise miss in the jaggedness while moving around, so yes, it certainly does make a difference.

I think the challenge is that I've got a wide variety of monitors here (I think the fastest refresh is 165Hz) and there is no real difference between them. And I don't buy crap monitors. I've got ViewSonic, Iiyama, LG, Lenovo and Dell, and I'm sorry, but I don't see it.

And while I have no doubt you believe you can see a difference, without doing a replicated triangle test it's hard to say whether it's a placebo or selection-bias effect.

The reason I raised this in the first place was that the OP rejected very good monitors that met their criteria except that they were 60/75Hz, so clearly to the OP they were excrement.

Have you noticed how the increasing refresh rates are basically multiples of 24 or 30? I wonder why that is?
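
For what it's worth, you can check which common rates divide evenly by 24 (film) or 30 (video), which is what matters for playing that content back without pulldown judder. A quick sketch:

```python
# Rates that divide evenly by 24 (film) or 30 (video) can hold each
# source frame a whole number of refreshes, i.e. no pulldown judder.
for hz in (60, 75, 120, 144, 165, 240):
    film = "even" if hz % 24 == 0 else "judder"
    video = "even" if hz % 30 == 0 else "judder"
    print(f"{hz:>3}Hz: 24fps {film}, 30fps {video}")
```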
 
We're way off topic here but I'll bite.

If there was no difference above 30Hz, then why did so many people complain about the "soap opera effect" with the high frame rate showings of The Hobbit and other movies? (Personally I thought it was great and loved the smoothness of 48fps cinema.)

Why is 60Hz the standard when manufacturers could save money on lesser-quality components that refresh less frequently?

As for other personal experience, I tried 4K monitors when they first came out - most graphics cards at the time didn't have a display connection that could drive them at 60Hz, so they ran at 30. Compared to other monitors in the office the difference was obvious; even with basic things like scrolling web pages, the limited refresh rate made things appear far jerkier.

I've got several different monitors in our house: 60, 75, 144 and now a 165Hz. The difference between 60 and 75 is arguably more pronounced than 75 to 144, and the 144 to 165 jump is certainly imperceptible to my eyes.

I don't know about 48Hz movies, but you never really hear anyone complain that the cinema is jerky or blurry, do you? And they are almost uniformly 24Hz. In every scientific study I can find, the test subjects stop seeing any improvement in smoothness somewhere between 24 and 30Hz. It's just what the research shows.

Why 60Hz is the standard is again unknown to me; I simply challenged someone who said that anything under 144Hz wasn't worth having, given that most people work on 30-60Hz monitors and have no complaints. Brightness I can understand, contrast makes sense, OLED blacks etc. I just can't find anything that backs up 6-10x oversampled screens as being detectable.
 
My guess is that you've already made up your mind, and anything anyone provides that proves a difference is going to be instantly dismissed as "marketing", regardless of how scientific/independent it is.

You're also conflating image recognition and processing/response with perceived smoothness.

I'd like a link to one of these studies which show we can't see the difference any higher than 30Hz, because I imagine 90% of people on this forum would disagree :rolleyes:
Not at all, but if there really was this incredible improvement, why is the cinema generally still 24Hz?

And quoting population statistics simply backs up the "You can fool all of the people some of the time and some of the people all of the time" response. You know they banned cigarette advertising because they managed to convince some people to smoke? Marketing is a powerful thing. And once you believe, it's really hard to admit you were conned, because... nobody likes to be conned.

I will post up a few links to motion/blur studies.
 
Dude, you seem very angry that someone has challenged the 'belief'. I have DMOR and found nothing that supports what you say. You are just repeating the same marketing stuff the advertisements and paid people on YouTube say. Just show me something that demonstrates the human eye/brain can indeed process all those extra refreshes and I'll be quite happy. The marketing departments have come up with higher refresh rates as a way to sell you a new monitor. Being honest, if I blind tested you repeatedly over several sets of 3 monitors, one with a significantly higher refresh rate, do you believe you could pick out the one with the higher refresh rate?

Don't just believe what the marketing tells you. And make no mistake, most YouTubers are an extension of a company's marketing, which is why the best reviewers buy their samples instead of being sent them by the manufacturer. And even if someone does review something they purchased themselves, very often the fact that they bought it after reading the spec sheet will bias them.

There is a VAST amount of scientific literature regarding motion, blur and how your eyes and brain process data. And none of it would suggest any benefit beyond about 30Hz. And I know there are gamers who swear that a monitor with a 1ms response time gives them an advantage, but do your own research on hand/eye co-ordination: just under 100ms is basically as fast as a human can respond to a visual or auditory input, and that's just science, not your lived experience 'truth'. In your truth you are seeing clothes on a naked emperor. I'm just asking for proof of clothing.
OK, now I know you are just a clown AND a troll. I won't discuss this with you anymore, I will just state a simple fact: you are wrong. It is not a matter of "opinion", you are just WRONG. End of story. Now go out into the world and figure out why for yourself, because you seem to have some very basic failings in understanding how the brain and eyes perceive things, and you confuse all of it. I'm out, no more replies to you, good luck with your ignorance and stupidity.
 
I think the challenge is that I've got a wide variety of monitors here (I think the fastest refresh is 165Hz) and there is no real difference between them.
Have you checked that you actually have your monitor set to 165Hz?
Move your mouse across your 60Hz & your 165Hz; if you can't see it, no offense, but maybe your eyesight is just going. It's clear as day.
It's nothing like placebo or anything you describe; you can CLEARLY see the difference.
 
30Hz is horrendous for mouse movement even on the desktop, and I don't like the jarring nature of 24fps/30fps video content in scenes where the camera pans. Even moving the mouse at 60Hz is immediately noticeable on the desktop if Windows changes it for any reason.

Personally I wouldn't mind trying out some 240Hz screens and seeing if I can tell the difference between the 120/144 I have used.
If you can't see the difference here then...

 
Personally I wouldn't mind trying out some 240Hz screens and seeing if I can tell the difference between the 120/144 I have used.
I can see the difference between 120 and 240 there; the 240Hz has less blur, the alien/stars & the red thing are a lot more visible, and there's almost no blur left.
 
I don’t know about 48Hz movies, but you never really hear anyone complain that the cinema is jerky or blurry, do you? And they are almost uniformly 24Hz.
It's one of those things that, unless you've actually seen it in person, you can't quantify how different it is (and indeed some people don't like it at all).
As for people not complaining, it's difficult for people to complain if they aren't aware that anything different is available (and it's only rarely been available at the cinema).


People can notice the difference - one of the Christmas episodes of EastEnders was shot at 24fps a couple of years back (for dramatic effect, I think), and my wife instantly noticed that it was different, as did quite a few people on social media.
 
OK, now I know you are just a clown AND a troll. I won't discuss this with you anymore, I will just state a simple fact: you are wrong. It is not a matter of "opinion", you are just WRONG. End of story. Now go out into the world and figure out why for yourself, because you seem to have some very basic failings in understanding how the brain and eyes perceive things, and you confuse all of it. I'm out, no more replies to you, good luck with your ignorance and stupidity.
So I'm a clown, a troll, ignorant and stupid? And you're the one with no evidence to back up your position? You're out of the discussion because you have no evidence to back up your position. Questioning something isn't trolling. Asking for evidence of a stated 'fact' isn't trolling.

In Prince Harry’s autobiography he states several things that are patently untrue. When challenged with evidence they were untrue he responded that he believed them to be true, and that was how he remembered them, so they were true to him. I think you believe you can see a difference. Even if I show you evidence you can’t, you’ll still believe that.

You‘re actually out of the discussion because you have nothing beyond insults to contribute.
 