LG 34GK950G, 3440x1440, G-Sync, 120Hz

I'd suggest reading a few of my reviews and you'll soon get a feel for the differences. I'm not going to indulge in a point-by-point discussion, but I very much agree with what newtoo7 above says. The biggest issues I have with FreeSync relate to the poor pixel overdrive implementation. With Nvidia G-SYNC the board specifically tunes things for a range of refresh rates. With FreeSync monitors things are typically quite well tuned for the highest static refresh rate, but as that decreases you get more obvious overshoot. The pixel responses should loosen off to prevent this - there's no point in having such high levels of overdrive at lower refresh rates, it's undesirable. I've also seen several examples of G-SYNC variants of monitors being much better tuned at the highest possible refresh rate than FreeSync variants, with the FreeSync variants using insufficient overdrive for the higher refresh rates (and, ironically, too much for lower refresh rates). Some of the 240Hz models and the LG 32GK850G vs. F reinforce this. Just wait for my review coming later today!

With G-SYNC the floor of operation is always 30Hz, whereas for FreeSync models (FreeSync 2 or otherwise) it could be anything really. At least FreeSync 2 mandates LFC, but that doesn't work flawlessly, particularly where the FreeSync floor is high and the stuttering at the boundary is obvious. The 32GK850F reinforces this point beautifully.
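To illustrate that last point, here's a rough sketch in Python of how LFC-style frame multiplication behaves and why a high VRR floor makes the handoff region so noticeable. The numbers, the 48Hz window and the function name are just illustrative - this is not AMD's or Nvidia's actual driver logic:

```python
# Rough sketch (illustrative only, not any vendor's real algorithm) of LFC-style
# frame multiplication on a hypothetical VRR display.

def effective_refresh_hz(frame_rate, vrr_floor, vrr_ceiling):
    """Pick a refresh rate for a given frame rate.

    Inside the VRR window the panel simply refreshes once per frame.
    Below the floor, an LFC-style driver repeats each frame 2x, 3x, ...
    so the panel still operates inside its window.
    """
    if frame_rate >= vrr_floor:
        return min(frame_rate, vrr_ceiling)   # one refresh per frame
    multiplier = 2
    while frame_rate * multiplier < vrr_floor:
        multiplier += 1
    return frame_rate * multiplier            # each frame shown 'multiplier' times

# Compare an illustrative 48Hz floor with a 30Hz G-SYNC-style floor:
for fps in (75, 55, 49, 47, 45, 40, 31, 29):
    print(fps,
          effective_refresh_hz(fps, vrr_floor=48, vrr_ceiling=120),
          effective_refresh_hz(fps, vrr_floor=30, vrr_ceiling=120))
```

With a 48Hz floor the display jumps from one refresh per frame at 48fps to two refreshes per frame (94Hz) at 47fps, so a game hovering around that boundary keeps flipping between the two modes - which is where the obvious stutter at the boundary comes from. With a 30Hz floor that boundary sits well below the frame rates you'd normally be running at.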

So you see, I am indeed very experienced with both technologies. And the more I use both, the more I agree that G-SYNC is the more polished of the two. There are some good FreeSync models out there, don't get me wrong, and I recommend some of them. But to think G-SYNC is a pointless additional expense is wrong. Nvidia are far more involved with tuning things and the results speak for themselves. AMD just leaves the monitor manufacturers to do what they want, and more often than not that's a bad thing. Whether this level of careful pixel overdrive tuning could be achieved without G-SYNC is debatable, because I've yet to see it. In an ideal world there would be no G-SYNC and the monitor manufacturers would be really careful with their pixel overdrive tuning, assessing and re-tuning it over a broad range of refresh rates. That alone may well require specialist hardware such as a G-SYNC board, I'm not sure. But the proof of the pudding is in the eating. I prefer to deal with what is out there in the real world rather than theory.

P.S. You don't know exactly what I do for a living, there's a lot more to my life than monitors. Although I can tell you I have no affiliation with either Nvidia or AMD.

You seem rather offended that I somehow suggested you did monitor reviews for a living of sorts. My only point was that since you have a review site, you most likely have some knowledge that some of us, myself included, do not. At no point did I claim that you were affiliated with either nVidia or AMD, and at no point was it meant as a negative thing.

You can call me pedantic all you want (not that you specifically did), but my whole issue with FreeSync vs G-Sync is that people compare the tech and what each offers and claim one is superior when it's not. The problem is the implementation of said tech, in this case FreeSync, which as you yourself said has been completely left to the manufacturer to figure out. Nothing is stopping LG, BenQ, Asus or whoever from releasing a FreeSync monitor with proper overdrive, but I'm sure it would increase the R&D cost. The fact that you didn't feel like replying with a simple yes or no to the three questions above indicates to me that I'm onto something. I stand by my point that it should be a monitor vs monitor comparison, and your reply confirms this. Not once did you point out a limitation where G-Sync could do a function or offer a feature that a FreeSync-enabled monitor couldn't do on paper. You only mentioned implementations, which by the way I'm not arguing about.
 
I appreciate that's what you meant, it wasn't really your comment that riled me up yesterday. Just had a lot on my plate, so I apologise for taking it out on you when you were just trying to be polite. Unlike some members of society, I'm able to admit when I'm under pressure and I've made mistakes - and say sorry (sorry!) So there you go. :)

I do appreciate where you're coming from regarding FreeSync and G-SYNC. I would much prefer that Nvidia openly supported Adaptive-Sync and perhaps assisted the manufacturers in making the product itself and the implementation better. However, there are actually some limitations to what can be achieved with FreeSync vs. G-SYNC. I don't think Nvidia are just trolling us by adding their G-SYNC board and using proprietary hardware. I think they bring about some advantages by doing things at this level. Key to this in my mind is the pixel overdrive implementation, which I've mentioned already:

"The biggest issues I have with FreeSync relate to the poor pixel overdrive implementation. With Nvidia G-SYNC the board specifically tunes things for a range of refresh rates. With FreeSync monitors things are typically quite well tuned for the highest static refresh rate, but as that decreases you get more obvious overshoot. The pixel responses should loosen off to prevent this - there's no point in having such high levels of overdrive at lower refresh rates, it's undesirable."


So you might argue that a FreeSync monitor could implement this sort of thing? Well, I'm not convinced it could, because I've reviewed many FreeSync and G-SYNC variants of the same product and I almost invariably see issues with pixel overdrive on the FreeSync models that are absent on the G-SYNC models. I'm not an engineer, so perhaps this is just the manufacturer being lazy rather than there being a hard technical reason. But I do know that Nvidia tune the overdrive curve for a large range of different refresh rates, whereas 'normal monitors' (including FreeSync models) tend to offer a set number of pixel overdrive settings that are each tuned with a single refresh rate in mind.
 
What is this pixel overdrive and how does it affect image quality? Also, PCM2, can you provide a link to your website? Is it pcmonitors.info? I would like to read your reviews. I'm going to assume that pcmonitors is your site. I really like the reviews. Do you have plans to review the Alienware AW3814DW, by any chance?
 
Yes, that's the one. Pixel overdrive = grey-to-grey acceleration, used to speed up pixel transitions. Too little overdrive gives trailing (ghosting), too much gives overshoot (inverse ghosting). It's important to use an appropriate level for various refresh rates, which G-SYNC models are usually relatively good at.
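If it helps to picture it, here's a toy sketch of the idea in Python (hypothetical numbers and made-up function names, not any monitor's real firmware). Overdrive essentially means driving the pixel past its target grey level for a frame so it gets there faster, and how hard you can push depends on how long a frame lasts - which is exactly what changes under VRR:

```python
# Toy illustration (hypothetical numbers, not any monitor's real firmware) of why
# overdrive needs to follow the refresh rate under VRR.

# Pretend tuning table: overdrive "boost" factors tuned per refresh rate.
TUNED_BOOST = {60: 1.10, 100: 1.25, 144: 1.40}

def boost_for(refresh_hz, tuned=TUNED_BOOST):
    """Interpolate the tuning table so the boost tracks the current refresh rate,
    roughly what a variable-overdrive implementation would do."""
    points = sorted(tuned)
    if refresh_hz <= points[0]:
        return tuned[points[0]]
    if refresh_hz >= points[-1]:
        return tuned[points[-1]]
    for lo, hi in zip(points, points[1:]):
        if lo <= refresh_hz <= hi:
            t = (refresh_hz - lo) / (hi - lo)
            return tuned[lo] + t * (tuned[hi] - tuned[lo])

def drive_level(current, target, refresh_hz, fixed_boost=None):
    """Grey level actually sent to the pixel this frame: overshoot the target a bit
    so the transition completes faster. Clamped to the 0-255 range."""
    boost = fixed_boost if fixed_boost is not None else boost_for(refresh_hz)
    return max(0, min(255, round(current + (target - current) * boost)))

# An 80 -> 160 grey transition while the VRR refresh rate falls:
for hz in (144, 100, 60):
    print(hz,
          drive_level(80, 160, hz),                    # refresh-aware tuning
          drive_level(80, 160, hz, fixed_boost=1.40))  # tuned for 144Hz only
```

With a single table tuned only for the maximum refresh rate (the right-hand column), the panel pushes the transition just as hard at 60Hz even though the longer frame time no longer needs it - the pixel sails past the target and you see the overshoot (inverse ghosting) described above.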
 
Still no word on price or release date? I see it's shipping in some places, but I'm taking that with a grain of salt since it's only on LG's HK site.
 
Don't think it is though, pal - there are various issues with most FreeSync monitors, as other people have said. Hopefully in future FreeSync will be perfect and Nvidia will support it too.

To add to this, I do not think G-Sync is good value at all, and the new module is even worse.

Let's not forget, Nvidia still hasn't fixed G-Sync for the Windows 10 Spring update since it came out.
Even on their latest drivers from yesterday it is still an outstanding issue for many people... That's now 3+ months and counting.
 
Let's not forget, Nvidia still hasn't fixed G-Sync for the Windows 10 Spring update since it came out.
Even on their latest drivers from yesterday it is still an outstanding issue for many people... That's now 3+ months and counting.

Eeesh. 300+ pages on the NVidia forums. Is it a particular series of cards this affects?
 
Eeesh. 300+ pages on the NVidia forums. Is it a particular series of cards this affects?
Not specific - in general it affects cards from all series and all monitors.
Also, since the September 2016 drivers there has been no sound over DP for many FreeSync monitors like the XL2730Z, and any refresh rate higher than 120Hz creates flickering.
There is a 1000-page thread there as well. Before the September 2016 drivers there was no issue. I had a GTX 1080 between July and November that year, and that's when the issue popped up.
It was the same drivers that broke animated GIFs and most Facebook content. I remember Nvidia releasing 5 drivers within 7 days, and only the last one fixed the Facebook and animated GIF issue. As for the sound, that still wasn't working until last month with my GTX 1080 Ti Xtreme.

In the meantime, all the AMD cards I had in between (Nano, Fury X, V64 Nitro+) had no issues with 144Hz or sound on the same monitors.
 
I guess Nvidia is to blame? Imagine this monitor with G-Sync and 144Hz on the new panel. I guess the lack of HDR10 was the reason they were not allowed to use the new GSync module?

I THINK you are right that nVidia is to blame, but for the wrong reasons:

A monitor need not support HDR10 to be "allowed" to use nVidia's new G-SYNC module. That's backwards. You need nVidia's new G-SYNC module to support HDR10! The panel used by the 34GK950G is actually HDR capable, as evidenced by the 34GK950F which (according to Daniel - LG) uses the same UW5 panel, but does support HDR.

So, the 34GK950G's panel could have supported HDR + G-SYNC, but only by using nVidia's new G-SYNC module. However, PCPer estimates that nVidia's new G-SYNC module adds $500 to the cost of a monitor. The 34GK950G will likely already be on the pricey side, so adding another $500 would have priced it out of the market. Unfortunately, the older G-SYNC module only supports DP1.2, which doesn't provide enough bandwidth to go beyond 3440x1440@120Hz. I'd say nVidia has put all monitor OEMs in a pickle. In summary, we must choose one of these three options:

1) 3440x1440 with older G-SYNC module (DP1.2, NO HDR) at <=120 Hz at a reasonable cost
2) 3440x1440 with FreeSync (DP1.4 WITH HDR) at >= 144Hz at a reasonable cost
3) 3440x1440 with newer G-SYNC module (DP1.4 with HDR) at >= 144Hz which includes an astronomical $500 G-SYNC tax (and requires active cooling)

In my view all of these options suck. Number (2) would be best if AMD was competitive in the GPU space. Until nVidia stops treating G-SYNC as a prototype technology and provides monitor OEMs with real ASICs rather than FPGAs, this situation seems unlikely to improve.
 
Someone on Reddit said DP1.2 can't run 120Hz fluidly, so it needs to use chroma compression to achieve that (which is why it's advertised as a 120Hz overclock). I don't know much about bandwidth or chroma compression, so I'm not sure what this all means or whether there is any legitimacy to his statement.
The guy is correct - it drops to 4:2:2, which is not that good for computer games as it crunches the colours. Why? Simple maths.

I agree with @Stu, this is pretty much BS. Maybe the person had a GPU+game where the frame buffer couldn't consistently be updated at 120Hz? That would cause stuttering and will look less fluid, but that's not the monitor's fault.

@Panos The guy on Reddit is wrong, and your math is off. I think I made the same mistake a few posts back. See post #167 for an accurate bandwidth calculation for 3440x1440@120Hz. It amounts to 15.46 Gbit/s, so there is no bandwidth related reason DP1.2 can't handle 3440x1440@120Hz.
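For anyone who wants to sanity-check that themselves, here's the back-of-the-envelope version (assuming CVT reduced-blanking-style timings and 8-bit RGB; the exact blanking figures can differ slightly, so treat the numbers as approximate):

```python
# Rough check that 3440x1440 @ 120Hz, 8-bit RGB fits within DP1.2.
# Assumes CVT-RB2-style reduced blanking (80-pixel horizontal blank, >=460us
# vertical blank); real timings may differ slightly.

h_active, v_active, refresh = 3440, 1440, 120
bits_per_pixel = 24                              # 8 bits per channel, RGB 4:4:4

h_total = h_active + 80                          # reduced horizontal blanking
frame_period = 1 / refresh                       # seconds per frame
line_time = (frame_period - 460e-6) / v_active   # active lines share what's left
pixel_clock = h_total / line_time                # pixels per second

video_rate = pixel_clock * bits_per_pixel / 1e9  # Gbit/s needed
dp12_payload = 4 * 5.4 * (8 / 10)                # HBR2: 4 lanes x 5.4 Gbit/s, 8b/10b

print(f"needed: {video_rate:.2f} Gbit/s, DP1.2 can carry: {dp12_payload:.2f} Gbit/s")
# -> roughly 15.5 Gbit/s needed vs 17.28 Gbit/s available, so no chroma
#    subsampling is required for 3440x1440 @ 120Hz with 8-bit colour.
```

So as long as the GPU can actually render at 120fps, DP1.2 itself isn't forcing 4:2:2 at this resolution and refresh rate.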
 
Nvidia should really release a DP 1.4 G-Sync module and phase out the DP 1.2 version; it is now out of date and is actually leaving G-Sync monitors at lower refresh rates than FreeSync ones.

I know they already have the HDR1000 DP 1.4 module, but they should also offer a normal DP 1.4 module and phase out the DP 1.2 one.

You want nVidia to phase out the DP1.2 G-SYNC module and replace it with a "normal" DP1.4 module (where normal likely implies "without DisplayHDR 1000"). That makes no sense at all, for two reasons:

1) Any DP1.4 connection can potentially support HDR10, because HDR10 is part of the DP1.4 standard. OEMs can disable HDR but that's just suppressing it. HDR support will still exist in the silicon.
2) It's not HDR support that makes nVidia's DP1.4 HDR G-SYNC module expensive. It's the additional bandwidth that DP1.4 must support to get higher refresh rates, in combination with nVidia's choice to build their G-SYNC module using an FPGA.

What you should be asking for is that nVidia starts building their G-SYNC module using a real ASIC rather than an FPGA.
 
It would make complete sense for Nvidia to replace the 1.2 module with a "basic" 1.4 module and still have the other one for the higher specs. Two modules, both DP 1.4: one supporting something like 144Hz HDR400, the other supporting up to 200Hz HDR1000 etc. Clearly it would make massive sense to replace the out-of-date 1.2 module while keeping the other module for 200Hz HDR1000 and so on.

That expensive module is probably needed for things like 200Hz HDR1000, but 144Hz HDR400 can clearly be done with FreeSync, so it could also be done with a cheaper version of the G-Sync module. Currently the G-Sync version is actually going to be worse and more expensive than the FreeSync version, so clearly Nvidia need to do something there.
 
It would make complete sense for Nvidia to replace the 1.2 module with a "basic" 1.4 module and still have the other one for the higher specs. Two modules, both DP 1.4: one supporting something like 144Hz HDR400, the other supporting up to 200Hz HDR1000 etc. Clearly it would make massive sense to replace the out-of-date 1.2 module while keeping the other module for 200Hz HDR1000 and so on.

That expensive module is probably needed for things like 200Hz HDR1000, but 144Hz HDR400 can clearly be done with FreeSync, so it could also be done with a cheaper version of the G-Sync module. Currently the G-Sync version is actually going to be worse and more expensive than the FreeSync version, so clearly Nvidia need to do something there.

No. You're confused.

You appear to think:

a) DisplayPort revisions are somehow distinct from achievable refresh rates at a given resolution and bit-depth, and/or
b) DisplayPort is somehow impacted by the DisplayHDR level a monitor supports

Both of the above notions are incorrect.

In regard to bandwidth:

Either the connection provides the bandwidth that the DP1.4 spec requires, or it's neither technically nor legally DP1.4. If you build a monitor with a lesser or "basic" DP1.4 connector that doesn't meet the bandwidth requirements (doesn't support HBR3), then you'll simply not find a GPU anywhere that you could connect that monitor to. Even if that would technically work, you'd still have to deal with VESA filing a lawsuit against you for marketing something as DP1.4 that doesn't conform to their specifications.

In regard to DisplayHDR:

As far as DisplayPort is concerned, DisplayHDR 400 and DisplayHDR 1000 are identical! Both require the exact same HDR10 signal to be sent from the GPU to the monitor. In comparison to a DisplayHDR 1000 monitor, there is nothing a DisplayHDR 400 monitor could omit without sacrificing HDR capability entirely (again, only as far as DisplayPort is concerned). In fact, nVidia's G-SYNC module could omit HDR support entirely (which would give us something similar to DP1.3), and it still wouldn't cost a penny less!

In terms of the cost required to implement DP1.4, there is no difference between DisplayHDR 400, DisplayHDR 1000, or even a DP1.4 monitor with no HDR at all.
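To put rough numbers on that (same reduced-blanking assumptions as the 3440x1440@120Hz calculation earlier in the thread, so treat these as approximate figures I've worked out myself):

```python
# Rough comparison of DP payload bandwidth vs what 3440x1440 needs at different
# refresh rates and bit depths. CVT-RB2-style blanking assumed; figures approximate.

def needed_gbps(h_active, v_active, refresh_hz, bits_per_pixel):
    h_total = h_active + 80                          # reduced horizontal blanking
    line_time = (1 / refresh_hz - 460e-6) / v_active # >=460us vertical blank
    return h_total / line_time * bits_per_pixel / 1e9

HBR2 = 4 * 5.4 * 8 / 10     # DP1.2: 17.28 Gbit/s payload after 8b/10b encoding
HBR3 = 4 * 8.1 * 8 / 10     # DP1.3/1.4: 25.92 Gbit/s payload after 8b/10b encoding

cases = [
    ("120Hz, 8-bit SDR", 120, 24),
    ("144Hz, 8-bit SDR", 144, 24),
    ("144Hz, 10-bit HDR10", 144, 30),
]
for label, hz, bpp in cases:
    need = needed_gbps(3440, 1440, hz, bpp)
    print(f"{label}: {need:.1f} Gbit/s -> HBR2 {'OK' if need < HBR2 else 'too slow'},"
          f" HBR3 {'OK' if need < HBR3 else 'too slow'}")
```

It's the jump past ~120Hz that pushes the link from HBR2 into HBR3 territory; once you're on HBR3, the extra bit depth HDR10 needs still fits comfortably. Which is the point: the DisplayHDR tier isn't what makes the new module expensive, the link bandwidth is.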

Where does the cost difference come from then?

I already explained why: the cost difference is a result of much higher bandwidth requirements (of DP1.3 and DP1.4 compared to DP1.2) and nVidia's choice to build their newer G-SYNC module using an FPGA.

The reason why FreeSync monitors don't exhibit the same price hikes is that all OEMs purchase their scalers (which include the logic for the DisplayPort connection) from third parties (typically Realtek, Novatek or MStar), who offer them as mass-produced ASICs for <$20. If an OEM wants G-SYNC rather than FreeSync, they must purchase the corresponding chip from nVidia. nVidia has a monopoly on that market, so the lack of competition already makes it more expensive. More importantly, nVidia builds their chips using FPGAs rather than ASICs. Fast FPGAs are expensive. Ones that are fast enough to support the higher bandwidth requirements of DP1.3 and DP1.4 (compared to DP1.2) are insanely expensive ($500).

So no, a "basic" DP1.4 G-SYNC module makes absolutely no sense at all. What would make sense is if nVidia built their DP1.4 G-SYNC module (the one they already have) as an ASIC, just like everyone else. I can only speculate as to why they don't, but I won't do that here.
 
I do not have the level of knowledge required to know why Nvidia chose a $500 chip if, according to you, a $20 chip would do exactly the same thing. It does not make sense.

Surely they cannot have used a $500 chip when a $20 version could pull off exactly the same thing. There must be good reasons for it. Assuming there are good reasons, and knowing that, for example, 144Hz HDR400 CAN be done with current FreeSync (no current FreeSync monitor can pull off 200Hz HDR1000 etc.), it would make sense to me that two different chips might be a good idea. I know HDR10 would be the same signal, but surely the different hardware requirements would make a difference to the solution required. Currently there is a big gap in the market between the DP 1.2 module and the HDR1000 module.
 
For example, currently you have:

Nvidia DP 1.2 SDR 100-120hz

*** big gap here, which needs DP 1.4, but not necessarily the mega Nvidia chip ***

Nvidia DP 1.4 HDR1000 module

This is a massive gap in the market that Nvidia are currently missing. If they could have just covered all of this market with a $20 chip they would be mad not to do that!
 
There is indeed a big gap in the market here, but perhaps it's because no monitor has specifically required (or asked) for it just yet. We as consumers certainly want it though. The $500 chip seems to have been designed specifically with the Acer Predator X27 in mind. That was a monitor which suffered significant delays in production, and this could have been partly because of that costly G-Sync module. I don't know the ins and outs of the monitor industry, but it would make sense to me that Nvidia's G-Sync division works closely with manufacturers. They would have worked with the monitor maker on the $500 chip rather than just making it off their own bat in the hope someone would use it. Therefore it logically makes sense that they would do the same with any other, much cheaper, modules going forward, as regardless of whether the chip costs $20 or $500, it's going to cost hundreds of thousands if not millions to develop and manufacture these things.

That said, I don't know exactly why the $500 chip is so expensive. It can't be parts... unless they've made it out of gold pressed latinum or something. It's more than likely down to development and manufacturing costs, and the fact it's only used in one monitor currently. As they make more of them and more monitors decide to use it, the cost could drop significantly. I'd choose an HDR1000 monitor over anything else if the price was right, after all.
 
I do not have the level of knowledge required to know why Nvidia chose a $500 chip if, according to you, a $20 chip would do exactly the same thing. It does not make sense.

Nowhere did I say that they are exactly the same. Obviously there are differences. You can easily look up FPGA vs ASIC yourself if you want to know more. If you don't have the knowledge to understand the differences, then you should also be willing to admit that you don't have the knowledge to judge how nVidia could make their G-SYNC module cheaper. That is the main issue I take with your posts on this topic, and it's what I'm attempting to clarify.

However, I do understand the comprehension issue you have. I agree that it doesn't make sense from a consumer's point of view. It makes a lot more sense from the point of view of nVidia's hardware developers, but obviously none of us here really care about that.

I also completely agree that an intolerable price gap exists between the DP1.2 and the DP1.4 G-SYNC modules, which, again, has absolutely nothing to do with DisplayHDR 1000. It's simply the gap between DP1.2 and DP1.4. IMHO, if nVidia can't lower the cost of their latest G-SYNC module, then G-SYNC is on a path towards irrelevancy.

FPGAs are typically used for development, when you don't expect high-volume sales, or when you're not entirely sure you've got your hardware design right. Normally I'd assume the current FPGA-based G-SYNC module represents version 0.9, and that nVidia would soon follow up with version 1.0: an ASIC based on the current FPGA design. Unfortunately, that's not how nVidia has dealt with this so far, because every G-SYNC module to date has been FPGA based. Now that the bandwidth requirements of DP1.4 have made nVidia's typical approach prohibitively expensive, maybe that will change. I don't know. However, I absolutely agree that nVidia can't afford to do nothing.

That said, I don't know exactly why the $500 chip is so expensive. It can't be parts... unless they've made it out of gold pressed latinum or something.

;-) Apparently it is "parts". Again, the FPGA (which costs $2000 if bought individually)!!! Take a look at the PCPer article I linked to above.
 
I do know a bit about ASICs and all of this - I understand what they are, and I also know quite a lot about monitors - but I am not sure either of us has enough knowledge to say exactly what the G-Sync module requires. All I can say is that the current G-Sync situation is not ideal, and I think we can agree on that.

Hopefully, as you say, there will be a G-Sync 1.0 and it will be an ASIC, although if it were that simple surely they would have done it already - but maybe not, who knows. Maybe, as you said, the FPGA is a sort of prototype and the final G-Sync module will be a cheaper version; I have no idea.

It does seem a bit odd having such an expensive chip for something that is supposed to be built into DisplayPort 1.4 - i.e. FreeSync or VRR, whatever you want to call it. But then we have not seen any FreeSync monitors with 4K 120Hz etc. At the lower bracket, though, the LG FreeSync 144Hz model is actually better than the G-Sync 120Hz one, which is clearly not right.
 