LG 34GK950G, 3440x1440, G-Sync, 120Hz

Associate
Joined
30 Oct 2018
Posts
3
Baddass, first-rate review as always! I'm in the market for a widescreen monitor.

If you could choose from the following monitors, which one would you pick: the LG 950G/F variants, the Acer X34P, or the Dell AW3418DW?

I have a GeForce card, but FreeSync does look appealing. It may be a gamble whether AMD will release a high-end card, but the 144Hz will still be useful with Nvidia.
 
Associate
Joined
31 Oct 2018
Posts
2
So I'm currently deciding between buying the 34GK950F or the G model.

I have an EVGA 2080 Ti FTW3 Ultra (got it today :D)



I know the Nvidia card can't use FreeSync, obviously. My question is actually in regard to someone mentioning that FreeSync can be turned OFF on the F model, which apparently reduces the input lag.



I have also heard that at high frame rates FreeSync/G-Sync are far less noticeable. The only time I care about getting 144Hz/fps is when I'm playing my competitive games such as CS:GO or Rocket League. And with a 2080 Ti, I don't think I'll have issues clearing even 200-300 fps, as I was already doing that with a 970 and 6700K at 1080p, so I'm sure my FPS won't fluctuate much in those two games and should easily be above the 144 goal. (In any other game I play, I think I'll be able to stay stable around 144Hz with the 2080 Ti, depending on settings, so I don't think I'll have any tearing, hopefully...?)



So is it true that I can turn off FreeSync on the FreeSync monitor? Then I won't be using FreeSync, G-Sync, or V-sync; it will just be a basic 144Hz 3440x1440 monitor with no adaptive sync technology at all, which, if possible, should decrease the input lag, correct? (By how much, I don't know, of course.)



(Someone said this was mentioned in the TFT post, but I have read it up and down and must have missed it; I can't find it. Sorry in advance...)


Also, doesn't the FreeSync model have MBR? Is that still usable while FreeSync is turned off? And shouldn't that kind of outweigh the G-Sync model in another category, considering it doesn't have it?
 
Man of Honour
Joined
12 Jan 2003
Posts
20,571
Location
UK
You can absolutely turn FreeSync off on the F model, and in fact you wouldn’t be able to turn it on in the first place if you were using an NVIDIA graphics card. However, it makes no difference to the lag I’m afraid. Not sure where that rumour has come from.

If you’re going to be able to consistently push high frame rates and see little fluctuation in your chosen games, then there is less benefit to a variable refresh rate tech, as you’ve said. I’d just advise caution: what if you then wanted to play a new, more demanding game where frame rates did vary and fell to the lower end? That’s where VRR is very useful. Otherwise, yes, you could play without any VRR or vsync if you wanted.

The F model has an MBR mode whereas the G model doesn’t. So at 144Hz that can be turned on to provide motion blur reduction benefits. Some people really like those features; some don’t like the flickering and reduced brightness. But definitely worth trying out.
 
Associate
Joined
17 Aug 2017
Posts
156
I'm pretty sure the TFT review said that FreeSync couldn't be turned off on this model.
But you're right, enabling MBR mode should turn it off. Whether that's enough to remove any perceived input lag remains to be seen.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,571
Location
UK
Baddass, first-rate review as always! I'm in the market for a widescreen monitor.

If you could choose from the following monitors, which one would you pick: the LG 950G/F variants, the Acer X34P, or the Dell AW3418DW?

I have a GeForce card, but FreeSync does look appealing. It may be a gamble whether AMD will release a high-end card, but the 144Hz will still be useful with Nvidia.

It depends on a few things which you’d need to ponder. Firstly, how powerful is your graphics card and system? What kind of frame rates are you going to be able to reliably produce for your chosen games and settings? Will they vary and drop to the lower range in some situations? This is important when trying to decide between the G-Sync models and possibly the 950F.

Are you just gaming, or do you have any need to do colour-critical work, photo editing, colour matching between devices or anything like that? That may influence the choice between the wide gamut and standard gamut options.

Let me know and I can try and advise further....
 
Man of Honour
Joined
12 Jan 2003
Posts
20,571
Location
UK
I'm pretty sure the TFT review said that FreeSync couldn't be turned off on this model.
But you're right, enabling MBR mode should turn it off. Whether that's enough to remove any perceived input lag remains to be seen.
You can turn it off in the OSD menu if needed. But it wouldn’t be active anyway from an NVIDIA card :)
 
Associate
Joined
17 Aug 2017
Posts
156
Oh, that's good news. But do you know if it would eliminate input lag?

Also, would it disable other FreeSync 2 features such as HDR?
 
Associate
Joined
7 Sep 2018
Posts
6
However, it makes no difference to the lag I’m afraid. Not sure where that rumour has come from.
While it may be applied incorrectly, the source of the rumor is likely AMD itself:

Radeon FreeSync 2 technology – One step closer to pixel perfect smooth gaming
  • Guaranteed support for Low Framerate Compensation (LFC)
  • Support for displaying HDR content
  • Low latency
https://www.amd.com/en/technologies/free-sync
 
Man of Honour
Joined
12 Jan 2003
Posts
20,571
Location
UK
Oh, that's good news. But do you know if it would eliminate input lag?

Also, would it disable other FreeSync 2 features such as HDR?
Disabling it has no impact on lag I’m afraid. There’s no “real” HDR support from the screen anyway and the bits it does have like slightly higher luminance than some screens and a wide gamut are available all the time.
 
Associate
Joined
5 Oct 2018
Posts
90
Disabling it has no impact on lag I’m afraid. There’s no “real” HDR support from the screen anyway and the bits it does have like slightly higher luminance than some screens and a wide gamut are available all the time.
But isn’t the point of HDR that the luminance goes up in bright scenes (up to the maximum)? Whereas a monitor calibrated to 102 cd/m² will not hit maximum brightness in the brightest scenes?

In other words, doesn’t HDR boost bright scenes to a higher brightness than would otherwise be reached if the monitor were calibrated to a lower brightness setting?
 
Associate
Joined
30 Oct 2018
Posts
3
It depends on a few things which you’d need to ponder. Firstly, how powerful is your graphics card and system? What kind of frame rates are you going to be able to reliably produce for your chosen games and settings? Will they vary and drop to the lower range in some situations? This is important when trying to decide between the G-Sync models and possibly the 950F.

Are you just gaming, or do you have any need to do colour-critical work, photo editing, colour matching between devices or anything like that? That may influence the choice between the wide gamut and standard gamut options.

Let me know and I can try and advise further....

Thanks for the constructive feedback, much appreciated!

I'm going to buy a new system in the coming three months, with a 1080 Ti as the minimum for GPU performance.

I'm using my computer for office work, paper exercises, gaming and YouTube. No colour-critical work.

When I game I don't go for maximum quality. I'm more after normal settings where the framerate is as high as possible, in this case at least 100+ fps. I have a 980 Ti at the moment.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,571
Location
UK
But isn’t the point of HDR that the luminance goes up in bright scenes (up to max)? Whereas a monitor calibrated to 102 cd will not hit max brightness in the brightest scenes?

In other words, doesn’t HDR boost bright scenes to higher brightness than would otherwise be reached if the monitor was calibrated to a lower brightness setting?

Well, that's one pretty small part of the HDR content equation. But if you set the monitor to 100% brightness in non-HDR mode you could still reach its 400 cd/m² maximum brightness, and content that is supposed to be darker would still appear darker on the screen. You'd get the same contrast ratio between light and dark areas as you would if you were sending HDR content and letting that dynamically control the backlight intensity. On the 950F, even with an HDR input, there seemed to be an issue reaching above 230 cd/m², which was odd. I didn't bother testing it much more as it was, in my opinion, largely worthless.

My point was more that without local dimming support, you aren't ever improving the perceived contrast ratio / dynamic range anyway, which is the whole basis of HDR.
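The local dimming point can be sketched numerically. A minimal Python illustration with assumed numbers (1000:1 native contrast is a typical IPS figure; the dimmed black level is hypothetical), not a measurement of this screen:

```python
# Why raising backlight brightness alone doesn't improve dynamic range
# on a panel without local dimming: white and black scale together.

NATIVE_CONTRAST = 1000  # assumed typical IPS static contrast ratio


def contrast_without_local_dimming(peak_white_cd):
    """Without local dimming, the black level rises in proportion to
    the backlight, so the white:black ratio never changes."""
    black_cd = peak_white_cd / NATIVE_CONTRAST
    return peak_white_cd / black_cd


def contrast_with_local_dimming(peak_white_cd, dimmed_black_cd):
    """Local dimming lets dark zones lower their backlight, dropping
    the black floor independently of peak white."""
    return peak_white_cd / dimmed_black_cd


# Same ratio whether calibrated low or run at maximum brightness:
print(contrast_without_local_dimming(120))
print(contrast_without_local_dimming(400))
# Only a lower black floor actually widens the dynamic range:
print(contrast_with_local_dimming(400, 0.05))
```

Running the sketch shows the first two ratios are identical regardless of the brightness setting, while the local dimming case is the only one that widens the range.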
 
Associate
Joined
14 Dec 2010
Posts
70
I just got mine in this afternoon. I'll be posting some impressions soon, but would like to test performance in a dark room first; it's still light out here. I'll be comparing it to my 38UC99.
 
Associate
Joined
17 Aug 2017
Posts
156
I'd also like to see a comparison test in an averagely lit room (i.e. is IPS glow or backlight bleed visible?).
I don't normally play in the dark, but I do play dark games... I watch movies on my plasma TV, long may it live!
 