***21:9 Ultrawide Thread***

Could always use the CCR 14-day return policy ;) :p

Thanks, sounds like I'm missing a trick, although I did only try a couple of 1080p Star Wars movies. I think they might have been 16:9 and had it stretched to fill the screen...

Haven't got any of the old Star Wars films so can't check.

But if you follow my brief guide and the links in the OP, films shot in 16:9 format should neither stretch nor zoom in.
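If you want to sanity-check the maths yourself, here is a tiny sketch (the function name is just for illustration) of what a player that preserves aspect ratio does with 16:9 video on a 21:9 panel, i.e. scale it to full height and pillarbox the sides:

```python
def fit_16_9(panel_w, panel_h):
    """Return (content_w, content_h, bar_each_side) for 16:9 video
    scaled to fill the panel height without stretching."""
    content_h = panel_h
    content_w = round(content_h * 16 / 9)
    bar = (panel_w - content_w) // 2  # pillarbox bar width per side, in px
    return content_w, content_h, bar

# On a 2560x1080 ultrawide, 16:9 video shows at 1920x1080 with 320 px bars;
# on 3440x1440 it shows at 2560x1440 with 440 px bars.
print(fit_16_9(2560, 1080))  # (1920, 1080, 320)
print(fit_16_9(3440, 1440))  # (2560, 1440, 440)
```

If a film looks stretched instead, the player is ignoring the aspect ratio rather than the screen doing anything wrong.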
 
It is the law: you can return it for whatever reason you want, as long as it hasn't been damaged and it is within 14 days.

You will have to cover the shipping costs though.

Could always contact OcUK and just explain to them that a return will cost more for both you and them and require more faff; they might agree just to refund you the difference, a certain rainforest jungle does this.
 
Not really, I find that most games still keep the UI/HUD in the 16:9 area, e.g.:

[screenshot: game HUD confined to the central 16:9 area]

Although I would much rather be able to move that stuff as far into the corners as possible, like you can in The Division.
 
What about desktop use, would he notice much of a difference there?

Yep definitely.

Much less screen real estate, and the pixels will be rather noticeable with text, although again, the latter largely depends on how far you sit from the screen.

I may then go for a 1080p model and save some pennies :)

If you can, I would try to wait until late summer; we should be seeing some more 21:9 screens around then, not to mention we'll see what happens with FreeSync/G-Sync versions.
 
There is a big difference between 3440x1440 and 2560x1080 at 34". So G-Sync is pointless (other thread) and anything above 1080p is pointless as well???

Did you even bother to read my post "entirely" or the post that I linked to?

The guy has a 3440x1440 34" screen.... and there are plenty of others who have said the same... Please understand how PPI and viewing distance work before commenting.
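For anyone who wants the actual numbers behind the PPI point, a quick sketch (the function name is just illustrative, the standard formula is diagonal pixel count divided by diagonal size):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Both at 34": this is the density gap the "can you tell?" debate is about.
print(round(ppi(3440, 1440, 34), 1))  # ~109.7 PPI
print(round(ppi(2560, 1080, 34), 1))  # ~81.7 PPI
```

Whether that ~28 PPI gap is visible depends almost entirely on viewing distance, which is the whole point being argued here.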

Also, link me to the post where I said "gsync was pointless" as well.
 
If you read my posts entirely you will see that there is more to it than just "no difference at all, it is pointless"

As above, there isn't really much, if any, noticeable difference between 3440x1440 and 2560x1080 @ 34" in games, especially if you sit further back:

https://hardforum.com/threads/34-21...0-aoc-u3477pqu.1827296/page-2#post-1041126330

And as the link shows as well as posts like mrk's:

You can easily run 2560x1080 on a 34" 1440 UWA. I run Division at 2560x1080 and cannot tell the difference other than 15fps more performance on Ultra settings.

There are plenty of people with 34" 21:9 monitors who find the same.

Of course there will be a difference; however, unless you are sitting pretty close (and with a 34" 21:9 screen, I would like to think people sit far enough away to really appreciate the benefit of these screens), you are going to be hard pressed to notice it in media consumption. Like I said earlier, with any text content you most definitely will notice the difference though.

The further you move back from the display, the sharper/clearer it will look. I have a few really crap old-quality TV shows and, sitting at my normal distance (about an arm's length away), they look awful and pixelated, but from further back (about two arm's lengths away) they begin to look a lot clearer and better.

As for G-Sync: I have said that G-Sync/FreeSync are great techs; however, they are not the silver bullet that some make them out to be, as low FPS (<50 FPS) is still low FPS. Yes, it is better to have G-Sync/FreeSync at those low frame rates than not, but if people want the full benefit of 144 Hz, they need to be pushing those FPS too. Most of my posts in that thread are aimed at the ones saying that 60 Hz and no sync tech are useless, when that simply isn't true "if" people take the time to tweak and test some settings, FPS limiters and vsync options.
 
Another TV show where they show off 21:9 monitors! :D That is 3 TV shows now where 21:9 monitors have been shown off a bit: Legends of Tomorrow, Heroes Reborn (the bad woman has one as her office monitor) and Arrow (Felicity has about 3 separate setups, each with 3 Dell 34" monitors)

[screenshot: 21:9 monitor featured on a TV show]
 
I'm not sure why Nvidia would support adaptive sync. The G-Sync solution is clearly better; it just works and has a better range.

Except it is not... as PCM2 has said:

No. ;) They are both equally as effective at eliminating tearing, juddering and stuttering from the traditional refresh rate and frame rate mismatches. I've seen some fairly good pixel overdrive implementations for both FreeSync and G-SYNC models and also some not so good ones. And if there are any latency differences they're certainly beyond my sensitivity for that sort of thing.

And plenty of other tests/feedback from "non-biased" people/reviewers say pretty much the exact same.

The FreeSync range depends on the monitor used, and either way, according to PCM2, it really doesn't matter since AMD have LFC. As I have said many times before, you really don't want to be dropping below 40 FPS anyway; G-Sync/FreeSync is not the silver bullet that many make it out to be.

Low frame rates are low frame rates regardless. You really don't want to be getting under 56fps or anywhere near that on a 144Hz monitor, it feels and looks extremely sluggish. And besides, now that AMD has LFC the frame rates below the hardware floor (56Hz/56fps) are suitably compensated for to remove stuttering and tearing. It's very much a non-issue really.
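For what it's worth, the basic idea behind LFC can be sketched in a few lines: when the frame rate falls below the monitor's VRR floor, the driver repeats each frame an integer number of times so the effective refresh lands back inside the supported window. This is only a conceptual sketch (the function name and the 48-100 Hz window are made-up examples; real drivers choose the multiple more carefully and also respect the top of the window):

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Pick the smallest frame-repeat multiple that brings the effective
    refresh back inside the VRR window; (1, fps) if already in range.
    Conceptual only: ignores edge handling a real driver would do."""
    if fps >= vrr_min:
        return 1, fps
    n = 2
    while fps * n < vrr_min:
        n += 1
    return n, fps * n

# With a hypothetical 48-100 Hz FreeSync window: 30 fps gets each frame
# drawn twice (60 Hz), 20 fps gets each frame drawn three times (60 Hz).
print(lfc_refresh(30, 48, 100))  # (2, 60)
print(lfc_refresh(20, 48, 100))  # (3, 60)
```

This is why the hardware floor is "a non-issue" for tearing/stutter, even though 20-30 fps still feels sluggish regardless.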

Either way, adaptive sync/FreeSync, whatever you want to call it, is not going anywhere, not when it is an open standard that Intel will use in the future, and HDMI is getting adaptive sync too. The only ways Nvidia could ever avoid adaptive sync would be to not include anything higher than DP 1.2 in their future GPUs (which means cutting off a lot of future high-end monitors, i.e. 4K at 100+ Hz), to pay monitor manufacturers to disable it on the firmware side (unlikely, especially when Intel will have support), or to disable it via their own drivers (no doubt some kid would write a program to enable it though)

G-Sync will die, it is only a matter of time; I can see it becoming a niche product used only on top-end monitors. It will just end up going the same way as Nvidia 3D Vision.
 
Indeed :p

BTW, I would avoid the current 35" 2560x1080 VA monitors; they have awful motion, not to mention the PPI would be even worse than the 34" 2560x1080 models.
 
That's what I figured. I really want 1440p as I need to be able to read text when coding (when I get the chance).

I'm coming round more and more to the idea of a better GPU (390X) and a FreeSync monitor, and giving the 970 to the GF for 1080p gaming.

Personally, I would wait a week or two to see what is what with Pascal and Polaris; not to mention, there are bound to be price drops on the current lot of GPUs too.

Although still, even if Pascal is better "overall", the price difference between the FreeSync and G-Sync XR34 is pretty damn large, and the G-Sync version is most certainly not worth the extra imo. The only real difference will be better motion clarity (which comes at the cost of possible coil whine and scanlines; these can be somewhat fixed but it requires a bit of faff), and to get that benefit you need to be pushing 75+ FPS, which is pretty much a no-go in most new AAA games on any current single GPU unless you are ok with dropping settings. Also, I wouldn't be surprised in the slightest if the FreeSync model drops to £500 relatively soon too...
 
Yup, I completely agree, the price difference is ridiculous. Just think what else you could buy with £350, not to mention the extra money that would be saved by getting an AMD GPU too.
 
Nope, haven't used one; no need to either, as that is what reviews are for ;) :p

PCMonitors.info and TFTCentral mentioned the motion issues, as have some users on reddit; motion clarity is a very personal thing, some people are sensitive to it whereas others aren't.

The Z35 and the AOC equivalent certainly aren't bad "overall" for motion clarity, but there are a few problematic areas with certain transitions; I should probably have explained that better in my previous post.

And yup, 2560x1080 @ 35" for gaming will be fine in terms of clarity/sharpness, especially if you are sitting pretty far back; it is just text where it will be rather unpleasant. I would also take 2560x1080 over 3440x1440 for gaming: give me higher graphics settings with higher FPS over lower FPS and lower graphics settings @ 1440p.
 
Wait until Polaris. It is "probably" unlikely that AMD will offer anything better than Nvidia, but their cards shouldn't be much slower, i.e. within 5-10%, and chances are they could be even cheaper; if anything, it could drop the prices of the 1070/1080 even further.... Besides, I wouldn't be surprised if we see some stock issues with Pascal, and if so, etailers will take the **** with the pricing...

The 1080/1070 isn't quite as groundbreaking as Nvidia's marketing would have people believe.

Standard 1080 will max everything out today at 3440x1440 easily. A single 980 can max most games with some FSAA for starters.

Depends what FPS you are happy with though.

Most new AAA games on max settings are very demanding; even a heavily OC'd 980 Ti can't maintain a constant 50 FPS the majority of the time, with The Division, Fallout 4, The Witcher 3 & Rise of the Tomb Raider being the main performance hogs.

I believe people are estimating that the 1080 will be about 10-20% faster than an OC'd 980 Ti. Of course, the 1080 could be a good overclocker too, but we will have to wait and see....
 
VA does not have "perfect" blacks; the only screen tech with "perfect" blacks is OLED. However, VA is a **** load better than TN & IPS for blacks and contrast ratio (which makes a substantial difference with dark content)

IPS is probably the best "overall" panel type atm; the only real downside can be IPS glow (depends on your brightness, room lighting and the angle you view the screen from)

TN is the fastest for "overall" response, in other words, usually a bit better for motion clarity.

IMO, unless you are really into fast-paced FPS shooters, i.e. CS:GO, or motion clarity really bothers you, you can't beat a good quality IPS or even VA monitor (the only VA screen I would look at is the Samsung 34" 1440 one though...)
 
Depends on how you look at it; neither one is necessarily "better" than the other. When it comes to the main goal, i.e. smoothing out frame dips and removing screen tearing without input lag, they are both pretty much the same, as shown by badass's and PCM2's posts plus reputable, fair reviews/comparisons, i.e. AnandTech.

The main differences come down to FreeSync monitors having a lot more ports and G-Sync working with various windowed settings (I imagine AMD will get this implemented at some point)

The reasons why G-Sync costs more:

1. Extra hardware is required, sold by Nvidia
2. Demand (a lot more people have Nvidia GPUs than AMD, so naturally there is more demand, thus prices can be a bit higher)

One of the main reasons Nvidia went with the G-Sync module is that none of their current desktop GPUs on the market have the hardware required to make use of adaptive sync/FreeSync.

Thanks for the info, Purgatory. :)

On one hand, I really like the idea of a 21:9 monitor, whereas on the other I would also like (I think) to try G-Sync and get something that can definitely reach north of 60 Hz. As for curved, I guess I'm not as bothered, though it would be nice. It seems that as far as combining all of those criteria goes, we are somewhat limited at the moment by what 21:9 has to offer. I guess it's simply in its infancy compared to the other types of monitors available.

Does 75 Hz make a worthwhile difference over 60?

Is Iiyama a budget brand, then?

Somewhat of a budget brand, but still a very good brand, like AOC.

I would probably go with AOC purely for the better build etc. and I believe AOC have better CS too...

And yes, 75 Hz provides a decent improvement over 60 Hz; however, to get the full benefit, you also need to be hitting 75 FPS too.
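To put that in numbers, the gap between 60 Hz and 75 Hz is about 3.3 ms per frame (a quick illustrative sketch, function name is made up):

```python
def frame_time_ms(hz):
    """Time each refresh is on screen, in milliseconds."""
    return 1000 / hz

# 75 Hz shaves roughly 3.3 ms off every frame compared with 60 Hz,
# but only if the GPU actually delivers 75 fps to match.
print(round(frame_time_ms(60), 2))  # 16.67
print(round(frame_time_ms(75), 2))  # 13.33
print(round(frame_time_ms(60) - frame_time_ms(75), 2))  # 3.33
```

Small per frame, but it adds up to visibly smoother motion when the frame rate keeps pace.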

For not much more money, you could get the Acer FreeSync 34" screen, which would definitely be a much better choice if you ever intend on going AMD.
 