The Asus ROG SWIFT PG278Q – a 27” 1440p 144Hz Monitor with G-SYNC

... Although I couldn't go back to normal gaming without whinging :)

*Coughs*

...As an owner of the Swift I couldn't agree more, Greg :cool: :D This monitor is amazing :eek:

My Swift and MSI 970s are playing very nicely together :cool: :D

Just tried 3D Vision :cool: Wow!!! I never expected it to be this good :eek:

Did you guys get a Borderlands code with your order? I don't seem to have received one.

I've spent ages fiddling to get settings I like (having found this easily on the Samsung 4K, I knew it must be possible on this one too, and I wasn't letting "but TN" get in the way).

On the monitor: Brightness 40 (30 at night), Contrast 50, Colour Temp User (R95, G100, B100)

NVIDIA Control Panel: Display > Adjust desktop colour settings > Use NVIDIA settings
Brightness: 40%
Contrast: 45%
Gamma: 0.8

Those are the settings that work best for me; your mileage may vary.
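For anyone curious what the Gamma 0.8 tweak is actually doing, here's a rough Python sketch of a generic power-law gamma curve. It's only an illustration under an assumed mapping (output = input ** gamma on normalised 0-1 levels); I don't know the exact LUT the NVIDIA Control Panel builds from its slider, so treat the direction and exact numbers as illustrative, not as how the driver really does it.

[CODE]
# Minimal sketch (not NVIDIA's actual curve): a simple power-law gamma
# remap, just to show how a gamma tweak moves midtones while leaving
# black (0.0) and white (1.0) untouched. How the NVCP slider maps its
# 0.8 value onto the real LUT is an assumption here, not a documented fact.

def apply_gamma(level: float, gamma: float) -> float:
    """Remap a normalised 0.0-1.0 pixel level with a power-law curve."""
    return level ** gamma

for level in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"in {level:.2f} -> out {apply_gamma(level, 0.8):.3f}")

# 0.50 comes out around 0.574: the endpoints stay put and only the
# midtones shift, which is why a small gamma change alters perceived
# contrast without clipping blacks or whites.
[/CODE]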

These are working great for me as well, Andy, thanks :)
 
Is this monitor any good for photography? I'm contemplating a new monitor. I do gaming and photography, and I really fancy this monitor with the features it has, but I'm wondering if it'll suit me for photography as well. I currently have a BenQ 24-inch XL2420T.
 
Without doubt, the Swift. I loved the Samsung 4K and had no intention of going backwards to 1440p, but after playing a few games, the smoothness is unbelievable. It's a little early to pass final judgement after only having the monitor for a couple of hours, but if I could record and play back how sweet things run, more people would jump on this.

In Crysis 3 I was getting around 65-85 fps with everything maxed out, not a single bit of stutter or tearing, and it just looked like watching a film. I am running Tri-SLI Titans, mind, but I will do some proper single-card testing later on to see how that fares.

First impressions though are "A game changer" and I just need The Witcher 3 to put me in gaming heaven :)

Hi mate, compared with the Samsung 4K, do you see a lot of difference in image quality in games?
 
I'm gaming on SLI 680s.
They put up a good fight. Why don't you try the monitor first and see if you need more GPU after?

I am playing Wolfenstein: The New Order, which doesn't use two GPUs, and I get 45-60 fps (the game is capped at 60), though I do wish SLI worked.

In other games you might be surprised what your 680 can do; like others have said, it depends on the game.

Overclock the memory to 7GHz using RivaTuner and you unlock a lot of beef, basically giving you a GTX 770 :)
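To put a rough number on that, here's a quick back-of-the-envelope in Python using the reference specs (256-bit memory bus on both cards, 6Gbps effective on the 680 vs 7Gbps on the 770). It only covers memory bandwidth, and the 770 also ships with a higher core clock, so take it as an approximation rather than a true equivalence.

[CODE]
# Back-of-the-envelope sketch: why ~7GHz effective memory on a GTX 680
# roughly matches a stock GTX 770 on bandwidth. Both use a 256-bit bus;
# the 770's reference memory runs at 7Gbps vs the 680's 6Gbps.
# Figures are reference specs, so treat the results as approximations.

def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width in bytes * effective data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

print(f"GTX 680 stock : {memory_bandwidth_gbps(256, 6.0):.0f} GB/s")   # ~192 GB/s
print(f"GTX 680 @ 7GHz: {memory_bandwidth_gbps(256, 7.0):.0f} GB/s")   # ~224 GB/s
print(f"GTX 770 stock : {memory_bandwidth_gbps(256, 7.0):.0f} GB/s")   # ~224 GB/s
[/CODE]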

For too long we've been able to overpower console ports. I remember when 35-40 fps was considered GOOD/playable when Crysis and Oblivion came out.
Now I prefer 60+ too, don't get me wrong, but everyone's throwing around 1080p/60fps like it's cheese and pickle.
 
Tempted to buy one of these again. I said I'd hold off, and with the announcement of NVIDIA supporting FreeSync I'd be gutted if a similar model arrived within months at a lower cost. I don't know how much cheaper they'd be for the same spec from a different brand without the ROG premium added.

Has anyone tested the monitor on ARMA 3? I ask because it can drop into the 20fps range on some servers.
 
Mine came yesterday. Awesome screen, but it has the inversion problem many seem to have (google "asus rog swift inversion"). It's like the individual pixels are way easier to see, which makes it look lower resolution than it is. It's not noticeable on the desktop, but in motion in games it's pretty bad; it almost looks vertically interlaced, if you get what I mean.

Has anyone here had this problem? (I know from Google that it's common.) I'm going to try and swap mine for another, as for the money it's an unacceptable issue.

Just to note that I've been running 120Hz TN panels since the first 22-inch Samsungs/ViewSonics came out, so it's not that I'm not used to TN panels or anything like that.
 
Nobody played ARMA 3 on this then?

I have. It's one of the games I've tested so far, but only very briefly. What do you want to know?

I can say G-Sync isn't the magic bullet I was hoping for in this game (like several others). It does nothing to eliminate the slight judders that can occur.
This is single-player only, by the way; I've never tried MP in ARMA yet.

Have to say G-Sync is continuing to disappoint me more and more. It doesn't work properly in Skyrim at all, doesn't work properly in BF4 with SLI running, and doesn't work properly in BioShock (and other Unreal Engine games, it seems) with SLI running.
G-Sync was one of the main reasons I bought the ROG; I hope things improve with drivers!
Have to say I'm more impressed with ULMB, if you can get a game to run at a constant 120 fps with vsync. Very, very CRT-like quality.
 
I've tried both BF4 and BioShock with 780 SLI at max settings - both run perfectly and get 120-143 fps without fail.

My only concern is that the cards are only being utilized at around 50-60%.
 
I guess if it's maxing out the fps at what the screen is set to with G-Sync on, then they simply don't need to do any extra work. I bet if you turn G-Sync off, utilization will shoot up and you won't see any difference in performance (well, maybe tearing).
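Here's a rough Python illustration of that reasoning: if the SLI pair could render faster than the ~144 fps ceiling, the cap throttles them, so utilization settles at roughly capped fps over what they could do uncapped. The uncapped figure below is a made-up example, not a measured number.

[CODE]
# Rough sketch of why utilization sits around 50-60% when the frame rate
# is pinned at the monitor's ceiling: the GPUs only work as hard as the
# cap requires. The "potential" figure below is hypothetical, chosen
# purely for illustration.

def estimated_utilization(capped_fps: float, uncapped_fps: float) -> float:
    """Very rough estimate: fraction of potential frames actually rendered."""
    return min(1.0, capped_fps / uncapped_fps)

cap = 143        # roughly where G-Sync tops out on a 144Hz panel
potential = 260  # hypothetical fps the SLI pair could manage uncapped
print(f"~{estimated_utilization(cap, potential):.0%} utilization")  # ~55%
[/CODE]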

Edit:
Numpty moment from me that proves how well G-Sync works. I was playing away in BF4 thinking how amazing and smooth the screen was, and didn't realise until a few days later that SLI was off!
 
I've tried both BF4 and BioShock with 780 SLI at max settings - both run perfectly and get 120-143 fps without fail.

My only concern is that the cards are only being utilized at around 50-60%.

Are you 100% sure about that? Genuine question. I'm not talking about high framerates here, by the way; that's not the issue. I'm talking about G-Sync not working properly with SLI running in some games. There are loads of user reports around the web from people seeing the same as me, especially for BF4.

In BF4, if I run with one of my Tis it's lovely and smooth, and I can feel and see the benefits of G-Sync totally. If I run with both cards in SLI, although the framerate is higher, there is perceptible judder and microstutter, totally negating the whole point of G-Sync.
 