Monitor advice

It's about time I replaced my Asus VE247H, so I've been on the hunt. It's worth mentioning up front that before I upgrade my monitor I'm going to get a GTX 1070.

I knew I wanted to go 1440p, so I started looking at 27" 1440p monitors, and then I came across 21:9 monitors, which I really like the look of. The problem is that a 1440p 21:9 monitor is out of my budget, meaning if I went with one it would be 2560x1080.

So my question is, what do you guys recommend?

27" 1440p
Or 29" 21:9 at 1080p?

Budget is roughly £350 if it helps.
 
IMO, you should also include FreeSync or G-Sync as a requirement/preference, as well as 144Hz.

Then again, with the 1070 you'd be looking at G-Sync, which unfortunately brings a price premium, so £350 would limit the selection drastically. The cheapest 1440p option would be the AOC Agon AG241QG at ~£430, but that's 24" TN (though also 165Hz). The cheapest 21:9 would be the Acer Predator Z301 at £650 (30", VA, 2560x1080, 200Hz (!)).

FreeSync would be cheaper, and will most likely win the "standard wars", from the looks of it. For £400, you can get the Acer XF270HU (27", IPS, 2560x1440, 144Hz). For £500, you could get the AOC C3583FQ (35", VA, 2560x1080, 160Hz).

If you're okay with 1920x1080 on a 27", then there's the Acer XZ271 for £300 (VA, 144Hz, FreeSync).

Considering you already have a 380, you could forget about the 1070, purchase a FreeSync monitor now, and save the price premium for an even more powerful AMD card when the new generation arrives.

Well, your budget probably won't allow a G-Sync model in any case (unless you're willing to compromise on your original requirements), so you might still be looking at FreeSync monitors even if you decide to go with the 1070. The price difference between FreeSync and regular monitors isn't that big, after all, so there's no reason NOT to buy one.

So there are a few options, for starters. Other people can give more.
 
Sorry I can't be of help dude but I am in exactly the same situation as you. Don't know what to get.

On balance I would probably go with a 1440p IPS (16:9).
 
FreeSync would be cheaper, and will most likely win the "standard wars", from the looks of it.

Hope I'm not derailing the thread, but I'm just curious what makes you say that; I was just starting to look at getting a G-Sync monitor with a 1070 in the near future.
 
Check this post; I collected some market data in October:
https://forums.overclockers.co.uk/showthread.php?p=30092340

Keep in mind, FreeSync is more appealing not only to customers but also to manufacturers; the nVidia tax is quite a tangible part of the overall manufacturing cost of a G-Sync monitor. For manufacturers, the profit margin decreases, and thus their percentage return on investment suffers.

Whereas with FreeSync (or rather Adaptive-Sync, which FreeSync is based on), you can implement it in your monitors for next to nothing, so there's not much sense in NOT doing it. That's also one of the main reasons why most manufacturers offer it more often than G-Sync.

As for customers, it's naturally more compelling to purchase a monitor that costs £100-200 less but offers the same features. That saving could be put towards a more powerful GPU, which, in the case of FreeSync, would by default be AMD's.

Also, because the price difference versus a regular monitor is relatively small, there's no compelling reason NOT to purchase a FreeSync monitor if the alternative is one without variable refresh rate at all. And because of this, more manufacturers will have to implement it, or they'll end up the "runner-up" in comparisons.

So, if you're an AMD GPU owner, there's no reason to purchase a G-Sync monitor. And if you're an nVidia GPU owner who doesn't want to pay extra for G-Sync, there's no logical reason NOT to buy a FreeSync monitor instead. The FreeSync option even stays open for a potential future AMD GPU purchase, and in the meantime you can use it as a regular monitor in any case.

As such, nVidia is actually taking a huge risk when they refuse to even support Adaptive-Sync. Currently, FreeSync monitors and AMD GPUs are perfect complementary products. nVidia's reluctance, which drives more customers and thus manufacturers towards FreeSync, can drive customers towards AMD's GPUs as well.

I would reckon that nVidia has smart people in their marketing department who are aware of this. So the bigger question is: do they have something up their sleeve that the market isn't yet aware of? Or maybe they're just hanging on to a foolish hope, who knows... But the situation surely isn't looking good for G-Sync at the moment.

Even Intel has already given their support for Adaptive-Sync, so my take on the situation is that nVidia will eventually just have to swallow their pride and at least start supporting Adaptive-Sync.
 
Even Intel has already given their support for Adaptive-Sync, so my take on the situation is that nVidia will eventually just have to swallow their pride and at least start supporting Adaptive-Sync.

Some interesting analysis on the issue. If they do, how do you think Nvidia will 'exit' the market with G-Sync?
 
As such, nVidia is actually taking a huge risk when they refuse to even support Adaptive-Sync. Currently, FreeSync monitors and AMD GPUs are perfect complementary products. nVidia's reluctance, which drives more customers and thus manufacturers towards FreeSync, can drive customers towards AMD's GPUs as well.
I've been saying this for a while. Gsync is ultimately a losing proposition, even if it does offer slightly superior capability, and I feel they will have to capitulate and join the Freesync revolution or they're just driving people into AMD's hands. Their decision to make this whole variable refresh rate thing a hardware-exclusive technology will bite them in the butt.

That said, we should be forever grateful to them for pushing variable refresh rate displays in the first place. AMD had seemingly been sitting on the tech for a while and did nothing, only taking action once Nvidia pushed it in gaming-focused monitors.
 
Some interesting analysis on the issue. If they do, how do you think Nvidia will 'exit' the market with G-Sync?
Gradual drop off in Gsync product releases and then Freesync support announcement.

We can already see a HUGE drop-off in new Gsync products, while many more Freesync monitors are being offered. This is especially noticeable now that plenty of lower-end monitors are getting Freesync support, which targets a much bigger audience than enthusiast products.

I don't think it'll be long now. Maybe another year or two.
 
Gradual drop off in Gsync product releases and then Freesync support announcement.

One thing I often see on FreeSync monitors is a rather pitiful active range (usually starting from 48Hz), whereas G-Sync starts much lower. Correct me if I'm wrong, but is G-Sync a better solution tech-wise than FreeSync (due to the lower starting point of its active range)?
 
One thing I often see on FreeSync monitors is a rather pitiful active range (usually starting from 48Hz), whereas G-Sync starts much lower. Correct me if I'm wrong, but is G-Sync a better solution tech-wise than FreeSync (due to the lower starting point of its active range)?
Yeah, from what I've seen, the majority of Freesync monitors have a pretty poor range. When the most useful range for variable refresh rates is 30-60Hz, having a lower limit of 48Hz is not terribly praiseworthy, though still better than not having it.

That said, Freesync *should* be capable of much better than that, all the way down to 20Hz, which is even better than the 30Hz lower limit of Gsync. I'm not sure exactly what's causing the disparity here, but hopefully market forces can encourage better options in the future. I'm worried that not enough people are informed enough to drive those market forces, though. I rarely see those who encourage others to go Freesync talk about usable range.
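
For what it's worth, whether that 48Hz floor actually hurts depends on how wide the whole window is: if the top of the range sits far enough above the bottom, the driver can multiply frames to stay inside it. Here's a minimal sketch of that rule of thumb (my own rough check, not an official AMD tool; the ~2.5x ratio for AMD's Low Framerate Compensation is the commonly cited figure, so treat it as an assumption):

```python
# Rough sketch (not an official AMD tool): for frame doubling (AMD's
# "Low Framerate Compensation") to work, the top of the VRR range is
# commonly said to need to be at least ~2.5x the bottom, so the driver
# always has a safe multiple to jump to.

def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    """True if the VRR window is wide enough for frame multiplication."""
    return max_hz >= ratio * min_hz

ranges = {
    "48-144Hz (typical 144Hz FreeSync)": (48, 144),
    "48-75Hz (typical budget FreeSync)": (48, 75),
    "30-144Hz (typical G-Sync)": (30, 144),
}
for name, (lo, hi) in ranges.items():
    print(f"{name}: frame doubling possible = {supports_lfc(lo, hi)}")
```

On those numbers, a 48-144Hz panel can double its way out of trouble below 48fps, while a 48-75Hz budget panel genuinely loses variable refresh there.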
 
(seems like southernorth removed the extra questions, but I'll keep the answer intact)

Like Seanspeed said, G-Sync will probably just be phased out of the monitor market, like the 3D Vision kits, for example. Do they even sell those kits anymore, or have they already been phased out? I couldn't find them on OcUK, at least.

I would wager that the GPUs will keep G-Sync support slightly longer than manufacturers keep making new monitors based on it (just as 3D Vision support is probably still maintained at the GPU driver level), so current owners shouldn't be affected much. But for the lifetime of their current monitor, they will surely be locked in to nVidia GPUs if they want to use the variable refresh rate feature. Unless nVidia opens up the software portion of G-Sync so that AMD could use it as well, which is highly unlikely. I'm not sure whether G-Sync requires any specific chip on the GPU side.

At least for FreeSync, I'm not aware of any specific chip that would be required in the GPU; I think/hope it's more of a software/firmware implementation on the GPU side. Furthermore, Adaptive-Sync is part of the DisplayPort standard, so nVidia could start supporting it any time they wanted. I also think I read somewhere that the G-Sync nVidia offers on laptops is actually an implementation of Adaptive-Sync, and not the "real" G-Sync...? Not sure how accurate that info was.
 
As for the ranges, I've been wondering about that quite a lot. Why would the lower bound matter, at least on 144Hz monitors? The monitor could always double or triple the low-fps frames, so it could use 48/72/96/120/144Hz for 24fps, 60/90/120Hz for 30fps, or 80/120Hz for 40fps. Isn't the whole idea to get the frame to the monitor as fast as possible? In that sense, only the higher refresh rates should matter. The refresh rate is still variable, so as long as the finished frame gets delivered as soon as it's ready, it shouldn't matter. Right...?

Or maybe I'm just over-thinking it. Or over-simplifying it, who knows. But I have a practical example to support my theory: my FreeSync monitor (XZ321Q) has a built-in OSD feature that shows the refresh rate in real time in the top-right corner of the screen. At least in the Heaven benchmark, when the fps dips down to 30-40, the OSD's refresh rate reading varies in the 70s. And my monitor has a 48-144Hz FreeSync range.
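
In fact, that reading fits simple frame doubling: 35fps shown twice is 70Hz, comfortably inside the 48-144Hz window. Here's the arithmetic as a quick sketch (my own illustration, assuming a naive "repeat until in range" rule, not how the monitor firmware necessarily does it):

```python
# Back-of-the-envelope check of the frame-multiplication idea above:
# repeat each frame until the resulting refresh rate lands inside the
# monitor's VRR window (my own sketch, not actual firmware behaviour).

def effective_refresh(fps: float, min_hz: float = 48, max_hz: float = 144) -> float:
    """Return the panel refresh rate after multiplying low-fps frames."""
    hz = fps
    while hz < min_hz:
        hz += fps  # show the same frame one more time
    return min(hz, max_hz)

for fps in (24, 30, 35, 40):
    print(f"{fps} fps -> panel refreshes at {effective_refresh(fps):g} Hz")
# 35 fps doubled gives 70 Hz, which would match an OSD reading
# "in the 70s" while the game runs at 30-40 fps.
```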
 