LG 34GK950G, 3440x1440, G-Sync, 120Hz

(I assume you mean the 34GK950G)

Oooh, you are teasing us! This comes as something of a surprise, since there had been no previous mention that review units had been sent out yet.

I'm not sure why the cloak and dagger... I think LG have already lost sales here, because the availability of the 950G and 950F appears to be unknown, which I think contributed to many people buying the Alienware when it was on sale recently. Better communication, like "coming very soon, we have already sent out review units, we're nearly there", would have swayed some people.

No, I messed up with the model number. My post was about G-SYNC and FreeSync more generally and not specific to this thread. I meant the 32GK850G and 32GK850F; I accidentally lumped the 34GK950G in there as well. Will edit the post for clarity.
 
No, I messed up with the model number. My post was about G-SYNC and FreeSync more generally and not specific to this thread. I meant the 32GK850G and 32GK850F; I accidentally lumped the 34GK950G in there as well. Will edit the post for clarity.

:eek: :(
 
I do not have much experience with Freesync, but from what I have read I am pretty sure Gsync is better on most monitors. I'm just not very impressed by the currently available modules; not sure why they don't have a mid-range DP 1.4 update to the current DP 1.2 module. Nvidia do make good products, but they are pretty annoying with their tactics. For example: OK, Gsync is better, so people would still want a Gsync monitor even if Nvidia enabled VRR support; it would be nice to have the option.
 
I do not have much experience with Freesync, but from what I have read I am pretty sure Gsync is better on most monitors. I'm just not very impressed by the currently available modules; not sure why they don't have a mid-range DP 1.4 update to the current DP 1.2 module. Nvidia do make good products, but they are pretty annoying with their tactics. For example: OK, Gsync is better, so people would still want a Gsync monitor even if Nvidia enabled VRR support; it would be nice to have the option.

Gsync is not "better" than Freesync. They are both the same. There are some bad Freesync monitors out there, true, but that's nothing to do with Freesync, just manufacturers cutting corners.
That is why Freesync 2 exists, where AMD applies stricter rules. Yet there are still good Freesync 1 monitors out there, at great prices.
 
I'll have to disagree with much of the above. From initial release until today, G-Sync, thanks to the G-Sync module, has provided superior functionality and performance compared to Freesync.

At release, Freesync monitors couldn't even enable overdrive and Freesync simultaneously, making the monitors virtually useless as high performance gaming displays with Freesync enabled. Then, when Freesync monitors finally could use overdrive and Freesync simultaneously, the overdrive functionality only worked optimally near the max refresh rate. This is still the case over halfway through 2018! That's a stark contrast compared to G-Sync monitors which have had adaptive overdrive functionality from inception.

G-Sync also provides overclocked refresh rates, extended VRR ranges, superior performance at the VRR range boundaries, etc.

I'll gladly pay the "G-Sync tax" for the premium experience. It's unfortunate the 950G incorporates the first gen G-Sync module, but no doubt we'll see the updated module in future ultrawides.

Maybe we'll see some Freesync 2 monitors finally matching G-Sync's capabilities, but I'm not holding my breath for anything soon. From what I understand, it's up to the monitor manufacturers to individually implement the various features available.

I'd suggest reading a few of my reviews and you'll soon get a feel for the differences. I'm not going to indulge in a point-by-point discussion, but I very much agree with what newtoo7 above says.

The biggest issues I have with FreeSync relate to poor pixel overdrive implementation. With Nvidia G-SYNC the board specifically tunes things for a range of refresh rates. With FreeSync monitors things are typically quite well tuned for the highest static refresh rate, but as the refresh rate decreases you get more obvious overshoot. The pixel responses should loosen off to prevent this; there's no point in having such high levels of overdrive at lower refresh rates, it's undesirable. I've also seen several examples of G-SYNC variants of monitors being much better tuned at the highest possible refresh rate than FreeSync variants, with the FreeSync variants using insufficient overdrive for the higher refresh rates (and, ironically, too much for lower refresh rates). Some of the 240Hz models and the LG 32GK850G vs. F reinforce this. Just wait for my review coming later today!

With G-SYNC the floor of operation is always 30Hz, whereas for FreeSync models (FreeSync 2 or otherwise) it could be anything really. At least FreeSync 2 mandates LFC, but that doesn't work flawlessly, particularly where the FreeSync floor is high and stuttering at the boundary is obvious. The 32GK850F reinforces this point beautifully.

So you see, I am indeed very experienced with both technologies. And the more I use both the more I agree that G-SYNC is the more polished of the two. There are some good FreeSync models out there, don't get me wrong, and I recommend some of them. But to think G-SYNC is a pointless additional expense is wrong. Nvidia are far more involved with tuning things and the results speak for themselves. AMD just leaves the monitor manufacturers to do what they want and more often than not that's a bad thing. Whether this level of careful pixel overdrive tuning could be achieved without G-SYNC is debatable, because I've yet to see it. In an ideal world there would be no G-SYNC and the monitor manufacturers would be really careful with their pixel overdrive tuning and assess it and re-tune over a broad range of refresh rates. That alone may well require specialist hardware with a G-SYNC board, I'm not sure. But the proof of the pudding is in the eating. I prefer to deal with what is out there in the real world vs. theory.

P.S. You don't know exactly what I do for a living, there's a lot more to my life than monitors. Although I can tell you I have no affiliation with either Nvidia or AMD.

I have to agree. G-sync has very strict standards and is a serious product, while FreeSync has no standards and is slapped on everything nowadays. This results in average implementations below the usable threshold. That's not far from what you would realistically expect from a free product that even has 'Free' in its name, but... Anyway, a serious product has to cost money; there is no way around it, and complaining that G-sync costs money is fundamentally wrong. If you earn your money the hard way, then it is especially important to make sure that a relatively big purchase is worth it, and that is achieved by paying the price for a quality product, not by making unreasonable and irresponsible savings on half-assed products.
 
Gsync is not "better" than Freesync. They are both the same. There are some bad Freesync monitors out there, true, but that's nothing to do with Freesync, just manufacturers cutting corners.
That is why Freesync 2 exists, where AMD applies stricter rules. Yet there are still good Freesync 1 monitors out there, at great prices.

Yes, Freesync can be good or bad, and likewise there are some bad G-Sync monitors, although in those cases it's other issues that make them bad, such as terrible panels beset with horrendous bleed. Bottom line though, and mostly needless to say: if you have an Nvidia card, you simply can't use Freesync, no matter how well it may be implemented.
 
I have to agree. G-sync has very strict standards and is a serious product, while FreeSync has no standards and is slapped on everything nowadays. This results in average implementations below the usable threshold. That's not far from what you would realistically expect from a free product that even has 'Free' in its name, but... Anyway, a serious product has to cost money; there is no way around it, and complaining that G-sync costs money is fundamentally wrong. If you earn your money the hard way, then it is especially important to make sure that a relatively big purchase is worth it, and that is achieved by paying the price for a quality product, not by making unreasonable and irresponsible savings on half-assed products.

That is why Freesync 2 exists, which is stricter.
Since we are in the LG discussion, look at the LG 32GK850F and compare it to the LG 32GK850G (Gsync).
The Freesync 2 version of the monitor is far superior to the Gsync version when it comes to image quality, and that is down to the limitations of the Gsync module.
 
Nope. Your calculation is wrong.
3440 x 1440 x 100 = 495,360,000 pixels per second, or 4.95 Gbps for 1 channel.

4.95 x 3 channels
= 14.85 Gbps in total.

Use some calculator and you will see I am correct

https://k.kramerav.com/support/bwcalculator.asp
Okay, but a gigabit is 1x10^9, and 495,360,000 in standard form is 0.495x10^9, or about half a gigabit. So how are you getting ten times that, even after multiplying by 3 channels?
 
Okay, but a gigabit is 1x10^9, and 495,360,000 in standard form is 0.495x10^9, or about half a gigabit. So how are you getting ten times that, even after multiplying by 3 channels?
Told you, if you do not believe a normal calculator, check the websites with specialised calculators :)
But yes, it's because you forgot to account for the 10 bits per clock.

Check the link I put above. Has the full math in detail.
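A quick Python sketch of that math, for anyone following along. The "10 bits per clock" is assumed here to refer to DisplayPort's 8b/10b line encoding (each 8-bit value travels as a 10-bit symbol), and blanking intervals are ignored:

```python
# Bandwidth for 3440x1440 at 100 Hz, counting visible pixels only
# (blanking intervals ignored, so real timings need a little more).
pixels_per_second = 3440 * 1440 * 100  # 495,360,000
bits_per_channel = 10                  # 8 data bits + 8b/10b encoding overhead
channels = 3                           # R, G, B

per_channel_gbps = pixels_per_second * bits_per_channel / 1e9
total_gbps = per_channel_gbps * channels

print(f"{per_channel_gbps:.2f} Gbps per channel")  # 4.95
print(f"{total_gbps:.2f} Gbps total")              # 14.86
```

The 14.85 figure in the post comes from rounding 4.95 before multiplying; the unrounded total is 14.86 Gbps, which either way fits within DP 1.2's 21.6 Gbps total link rate.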
 
I do not have much experience with freesync, but also from what I have read I am pretty sure Gsync is better in most monitors, just not very impressed by the current available modules, Not sure why they don't have a mid range DP 1.4 update to the current DP 1.2 module. Nvidia do make good products but they are pretty annoying with their tactics. For example yes OK Gsync is better, so people would still want a Gsync monitor even if they enabled VRR support, would be nice to have the option.

To be fair, until the last couple of months the most demanding monitors have all been adequately served by DP 1.2, and we are only now getting panels that need more than 18 Gbit/s:

1920 x 1080 at 240Hz
2560 x 1440 at 165Hz
3440 x 1440 at 120Hz

The new module is a big step up, supporting 4K at 144Hz (36 Gbit/s), but its cost makes it prohibitive for monitors under £1000 right now. I agree that it would be good to have an intermediate module suited to newer panels capable of 3440 x 1440 at 144Hz and 2560 x 1440 at 200Hz, which would need around 21-22 Gbit/s.
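The same visible-pixels-only arithmetic as earlier in the thread (24 bits per pixel, times 10/8 for 8b/10b encoding, blanking intervals ignored, so real timings need somewhat more) roughly reproduces those figures:

```python
# Approximate link bandwidth per mode: visible pixels x 24 bpp x 10/8 (8b/10b).
# Blanking intervals are ignored, so these are lower bounds.
modes = [
    (1920, 1080, 240),
    (2560, 1440, 165),
    (3440, 1440, 120),
    (3440, 1440, 144),
    (2560, 1440, 200),
    (3840, 2160, 144),
]

for w, h, hz in modes:
    gbps = w * h * hz * 24 * (10 / 8) / 1e9
    print(f"{w}x{h} @ {hz}Hz: {gbps:.1f} Gbit/s")
```

This puts 3440 x 1440 at 144Hz around 21.4 Gbit/s and 2560 x 1440 at 200Hz around 22.1 Gbit/s, matching the 21-22 Gbit/s estimate above, while 4K at 144Hz lands near 36 Gbit/s.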
 
Yes, Freesync can be good or bad, and likewise there are some bad G-Sync monitors, although in those cases it's other issues that make them bad, such as terrible panels beset with horrendous bleed. Bottom line though, and mostly needless to say: if you have an Nvidia card, you simply can't use Freesync, no matter how well it may be implemented.

This is generally the main factor... if you want good GFX performance, most people choose an Nvidia card, which leads you to a G-Sync monitor.
 
Gsync is not "better" than Freesync. They are both the same. There are some bad Freesync monitors out there, true, but that's nothing to do with Freesync, just manufacturers cutting corners.
That is why Freesync 2 exists, where AMD applies stricter rules. Yet there are still good Freesync 1 monitors out there, at great prices.

Given what was previously said about things like overdrive, Freesync range, various issues, etc.: overall, if you get a Gsync monitor, it is much more likely to be good and do what it says on the tin. Maybe this will change in the future, but currently it is true.
 
Given what was previously said about things like overdrive, Freesync range, various issues, etc.: overall, if you get a Gsync monitor, it is much more likely to be good and do what it says on the tin. Maybe this will change in the future, but currently it is true.

It's exactly the same with Freesync, if you read the tin...
 
To be fair, until the last couple of months the most demanding monitors have all been adequately served by DP 1.2, and we are only now getting panels that need more than 18 Gbit/s:

1920 x 1080 at 240Hz
2560 x 1440 at 165Hz
3440 x 1440 at 120Hz

The new module is a big step up, supporting 4K at 144Hz (36 Gbit/s), but its cost makes it prohibitive for monitors under £1000 right now. I agree that it would be good to have an intermediate module suited to newer panels capable of 3440 x 1440 at 144Hz and 2560 x 1440 at 200Hz, which would need around 21-22 Gbit/s.

Yes, that is exactly what I meant; a lot of people would be happy with 3440x1440 at 144Hz or 2560x1440 at 200Hz, etc., and the current DP 1.2 module is not good enough for that. I think the majority of monitor users actually want a fast monitor rather than an HDR monitor, and any monitor with the current HDR1000 module is going to cost at least £1.5-2k, which for most people is too much for a PC monitor.
 
It's exactly the same with Freesync, if you read the tin...

Don't think it is though, pal; there are various issues with most Freesync monitors, as other people have said. Hopefully in future Freesync will be perfect and Nvidia will support it, etc.

To add to this, I do not think Gsync is good value at all though, and the new module is even worse.
 
Told you, if you do not believe a normal calculator, check the websites with specialised calculators :)
But yes, it's because you forgot to account for the 10 bits per clock.

Check the link I put above. Has the full math in detail.

Ah right, that makes sense - what are the 10 bits per clock for?

I wasn't dismissing your maths btw, just wanted to see where I was going wrong :)
 