***21:9 Ultrawide Thread***

Sweet! Good to hear about Generals. You've got to think CS would be able to!? Anybody prepared to give it a whirl for me? Pretty please!? :)
 
My screenshot a few posts back of F.E.A.R. (released in 2005) running at 21:9 natively demonstrates that some old games will run fine.

In fact, the only games I haven't been able to get running at 21:9 are Resident Evil 5 & 6 and South Park: The Stick of Truth. Here's a list of the games I've had running at 21:9 since I got my Acer X34 two months ago, either natively or with a simple fix that can be found for each game on the Steam forums (see the sketch after the list for what those fixes usually boil down to):

Rise of The Tomb Raider
Just Cause 3
Hitman
Alien Isolation
Aliens vs Predator
Assassin's Creed 2
Bioshock 1, 2 and Infinite
Borderlands 2
Call of Duty Ghosts
Call of Juarez Gunslinger
Darksiders 2
Darksiders 2: Deathinitive Edition
Dead Island
Deus Ex: Human Revolution - Director's Cut
Devil May Cry
F.E.A.R.
F.E.A.R. 3
Fallout 4
Far Cry 4
Grid 2
Grid Autosport
Lucius
Metro Last Light Redux
Painkiller: Hell & Damnation
Payday 2
Prototype 2
RAGE
Serious Sam 3
The Witcher 2
Wolfenstein: The New Order.
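For the curious, most of those "simple fixes" are either a config/ini tweak or a hex edit that swaps the hard-coded 16:9 aspect-ratio constant in the executable for the 21:9 value. A rough sketch of what that amounts to (the file name is just a placeholder, and some fixes use 2560/1080 instead of 3440/1440 depending on the panel):

```python
import struct

# Many old engines hard-code the 16:9 aspect ratio as a 32-bit float;
# widescreen fixes replace that constant with the 21:9 value.
old = struct.pack('<f', 16 / 9)       # 1.7777778 as little-endian float bytes
new = struct.pack('<f', 3440 / 1440)  # 2.3888889, i.e. "21:9"

print('search for', old.hex(), '-> replace with', new.hex())

# Hypothetical patch: write a copy of the exe with the first match swapped.
# 'game.exe' is a placeholder name -- always patch a backup copy.
with open('game.exe', 'rb') as f:
    data = f.read()
with open('game_21x9.exe', 'wb') as f:
    f.write(data.replace(old, new, 1))
```

That's the whole trick behind a lot of the per-game fixes floating around the Steam forums; the rest is usually just finding which of several float matches is the right one to change.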
 
Any news on the upcoming Acer Z1 21:9 version, specifically price and release date?

It won't be for everyone, but I think it'll be my first foray into ultrawide gaming. It's only 1080p, but at 29", the PPI and vertical height will be fairly similar to my current 23" 1080p panel, just with extra width (ooohhh matron), G-Sync and high refresh rates. Plus it should be a better overall panel (I hope) than my 4-5 year old Samsung TN.

The main reason for considering it is the cost of the unit itself vs 1440p ultrawide panels, as well as the lower GPU grunt needed to actually drive it.
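For anyone who wants to sanity-check the PPI/height claim, it's just geometry on the diagonal and aspect ratio; a quick back-of-the-envelope calculation (assuming the 29" panel is the usual 2560x1080):

```python
import math

def panel_stats(diag_in, w_px, h_px):
    """PPI and physical width/height (inches) for a flat panel."""
    ar = w_px / h_px
    h_in = diag_in / math.sqrt(1 + ar ** 2)  # diag^2 = w^2 + h^2, w = ar * h
    w_in = h_in * ar
    return w_px / w_in, w_in, h_in

for name, args in [('23" 1920x1080', (23, 1920, 1080)),
                   ('29" 2560x1080', (29, 2560, 1080))]:
    ppi, w, h = panel_stats(*args)
    print(f'{name}: {ppi:.1f} PPI, {w:.1f}" wide x {h:.1f}" tall')
```

Both work out to roughly 96 PPI and about 11.3" of vertical height, so the claim holds up: same density, same height, just more width.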
 
Depending on what the summer brings, I think I'll go for the XR341CKB.

I think my 970 will drive games at 1080p fine on it. I don't have a G-Sync monitor, so I don't know what I'm missing. I want the 1440p because I don't use my PC purely for gaming; I'll need to read text.

I won't be getting a 980 Ti, but will wait for Pascal, which, if rumor is to be believed, should support FreeSync. Failing that, the other half will get a lovely hand-me-down :)

I really cannot justify the premium for the X34 at this time; I could upgrade a few other bits for that difference.
 
I won't be getting a 980 Ti, but will wait for Pascal, which, if rumor is to be believed, should support FreeSync. Failing that, the other half will get a lovely hand-me-down :)

I really cannot justify the premium for the X34 at this time; I could upgrade a few other bits for that difference.

Where did you hear that Pascal will support FreeSync?
 
Another TV show where they show off 21:9 monitors! :D That makes three TV shows now where 21:9 monitors have been shown off a bit: Legends of Tomorrow, Heroes Reborn (the bad woman has one as her office monitor) and Arrow (Felicity has about three separate setups, each with three Dell 34" monitors).

 
They never will. They might... with a small % chance... support Adaptive Sync. Big difference.

Well yes, that's what I meant, but if they support Adaptive Sync then they would work on FreeSync monitors, as far as I know. I assume that eventually, in about 2-5 years, they'll have no choice, but they might still keep G-Sync as a high-end option, I don't know.
 
I can't see Nvidia supporting FreeSync any time soon, but I wouldn't be surprised to see a hack for FreeSync in the near future.

That's because they never will. It's incredible how many people confuse FreeSync and Adaptive Sync. Nvidia might, if forced to, support Adaptive Sync in the future. However, Nvidia will/can never support FreeSync. Why? Because FreeSync is the software solution AMD uses to take advantage of Adaptive Sync. BIG DIFFERENCE. :)
 
I'm not sure why Nvidia would support Adaptive Sync. The G-Sync solution is clearly better: it just works, and has a better range.

The only reason I can think of is sales. With a FreeSync monitor myself, I'm pretty much locked out of Nvidia. I've never had a problem using AMD or Nvidia, and in the future it would be nice to have the choice; I'm sure Nvidia would be more than happy to take my money.

If G-Sync is superior, there's no real reason Nvidia couldn't support both and carry on charging a premium.

But then, who knows. I'm happy enough to carry on using AMD, and it's up to Nvidia to decide whether they want the custom from people like me with FreeSync monitors.
 
I'm not sure why Nvidia would support Adaptive Sync. The G-Sync solution is clearly better: it just works, and has a better range.

Except it's not... as PCM2 has said:

No. ;) They are both equally effective at eliminating the tearing, juddering and stuttering that come from traditional refresh rate and frame rate mismatches. I've seen some fairly good pixel overdrive implementations for both FreeSync and G-SYNC models, and also some not so good ones. And if there are any latency differences, they're certainly beyond my sensitivity for that sort of thing.

And there are plenty of other tests and feedback from non-biased people/reviewers that say pretty much exactly the same.

The FreeSync range depends on the monitor used, and either way, according to PCM2, it really doesn't matter since AMD have LFC. As I've said many times before, you really don't want to be hitting anything lower than 40 fps anyway; G-Sync/FreeSync is not the silver bullet many make it out to be.

Low frame rates are low frame rates regardless. You really don't want to be getting under 56fps, or anywhere near that, on a 144Hz monitor; it feels and looks extremely sluggish. And besides, now that AMD has LFC, the frame rates below the hardware floor (56Hz/56fps) are suitably compensated for to remove stuttering and tearing. It's very much a non-issue really.
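For anyone wondering what LFC (Low Framerate Compensation) actually does: when the frame rate falls below the panel's minimum variable refresh rate, the driver repeats each frame enough times that the effective refresh lands back inside the supported window. A rough sketch of the idea, using the 56-144Hz window from the quote above (real drivers manage the frame timing far more carefully than this):

```python
# Sketch of the Low Framerate Compensation idea: repeat each frame until
# the effective refresh rate lands back inside the panel's variable range.
# The 56-144Hz window comes from the quote above.
PANEL_MIN_HZ, PANEL_MAX_HZ = 56, 144

def lfc_refresh(fps):
    """Return (frame multiplier, effective refresh) or None if out of range."""
    multiplier = 1
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    hz = fps * multiplier
    return (multiplier, hz) if hz <= PANEL_MAX_HZ else None

for fps in (30, 40, 55, 60, 100):
    print(f'{fps} fps -> {lfc_refresh(fps)}')
```

Note this trick only works when the panel's maximum refresh is at least roughly double its minimum, which is exactly why the FreeSync range of the specific monitor matters.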

Either way, Adaptive Sync/FreeSync, whatever you want to call it, is not going anywhere, not when it's an open standard that will be used by Intel in the future, with HDMI getting adaptive sync as well. The only way Nvidia would ever be able to avoid Adaptive Sync is to not include anything higher than DP 1.2 in their future GPUs (which means cutting off a lot of future high-end monitors, i.e. 4K at 100+Hz), paying monitor manufacturers to disable it on the firmware side (unlikely, especially when Intel will have support), or disabling it via their own drivers (no doubt some kid will write a program to enable it, though).

G-Sync will die; it's only a matter of time. I can see it becoming a niche product used only on top-end monitors; it'll just end up the same way Nvidia 3D Vision went.
 
Bought Star Wars Battlefront 2 from Steam's May the 4th Be With You deals, and it supports 3440x1440 natively! Didn't even have to go into the options menu. The space battles look sweet even for such an old game.
 