Ok thanks. Where has the photo come from?
https://www.blurbusters.com/lg-releases-nano-ips-144hz-with-faster-ips-gtg-plus-4k-120hz-oleds/
Ok thanks, hadn't seen that before. That menu software matches the 34GK950G (G-Sync model) and is a different style to the menu found on the 34GK950F (FreeSync model). Another indication, in my mind, that the 38GL950G is a true G-Sync screen.
It could also be they decided to opt out of G-Sync the moment NVIDIA announced FreeSync support and they switched to G-Sync compatible. They may have been caught off guard.
In my opinion, the evidence for and against this being a normal G-Sync screen is:

FOR it being a traditional G-Sync screen with a G-Sync module:
- G naming convention (38GL950G)
- overclockable refresh rate of 175Hz - never seen any non-G-Sync-module screen offer anything like this
- OSD menu design and appearance matches the 34GK950G and not the F model (recent displays from LG)
- only "NVIDIA G-sync" listed on the product spec at CES demos (no mention of "G-sync compatible")

FOR it being a FreeSync/Adaptive Sync screen just with "G-sync compatible" instead:
- LG's coming-soon spec page lists it as "G-sync compatible" - although this was, I am 90% sure, already advertised in this way before NVIDIA's announcement about supporting Adaptive Sync on some displays
"Tbh, I'm in awe of that 'nVidia co-op Freesync' move you detailed above! That would be pretty genius (and admirable in a dirty sort of way)."

Please keep it to yourself that I have a dirty mind. Thank you.
"The v1 G-SYNC module isn't powerful enough to drive this monitor. If the 38GL950G is in fact G-SYNC, then it must be using the v2 G-SYNC module. Now consider this:"

I agree with this point; the v1 module is not suitable here for this resolution and refresh rate (some rough bandwidth numbers after these replies).
"The x27 and PG27UQ require a fan. Both Asus and Acer have confirmed that the fan is an active cooling requirement for the v2 G-SYNC module (not for the FALD backlight)."

I don't believe this to be true. I believe the reason for the active cooling is more due to the FALD backlight; you can tell how hot the overall screen gets quite quickly when FALD is enabled for gaming, and I don't believe that is only heat coming from the G-Sync module. Where have Asus and Acer specifically said it's cooling for the G-Sync v2 module and not related to the FALD?
"From everything we've seen, the 38GL950G has no fan."

I think this is jumping to conclusions a bit. No one has had a chance to examine this screen in that level of detail as far as I know, to open it up or check whether it's got a fan. No one has had a chance to push the screen through extended periods of gaming and identify whether a fan kicks in to cool it. There's a chance it may have a cooling fan. There's also a good chance that even if the G-Sync v2 module needed additional cooling, without the FALD on the 38GL950G they could come up with a passive cooling system quite easily, I expect. So to say that because we're guessing it doesn't have a fan, it must not be using a G-Sync module is a bit rash.
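As a rough sanity check on the v1 vs v2 module point a few replies up, here is a minimal back-of-the-envelope bandwidth sketch. The 3840x1600 resolution, 8-bit RGB colour and the effective DisplayPort payload rates are my own assumptions, and blanking overhead is ignored, so the real requirement is a few percent higher than these figures.

```python
# Rough bandwidth sanity check: could a v1 (DisplayPort 1.2) or v2 (DisplayPort 1.4)
# G-SYNC module feed an assumed 3840x1600 panel at various refresh rates?
# Assumptions: 8-bit RGB (24 bits per pixel), blanking overhead ignored.

H_PIXELS, V_PIXELS = 3840, 1600   # assumed 38GL950G panel resolution
BITS_PER_PIXEL = 24               # 8-bit RGB, no HDR

# Effective payload rates over 4 lanes after 8b/10b encoding:
DP12_HBR2_GBPS = 17.28            # v1 G-SYNC module (DisplayPort 1.2)
DP14_HBR3_GBPS = 25.92            # v2 G-SYNC module (DisplayPort 1.4)

for refresh_hz in (120, 144, 175):
    required_gbps = H_PIXELS * V_PIXELS * refresh_hz * BITS_PER_PIXEL / 1e9
    print(f"{refresh_hz:>3} Hz needs ~{required_gbps:.2f} Gbit/s "
          f"(DP 1.2 carries {DP12_HBR2_GBPS}, DP 1.4 carries {DP14_HBR3_GBPS})")
```

If those assumptions are in the right ballpark, even 120Hz already exceeds what the v1 module's DP 1.2 input can carry, and the 175Hz overclock sits close to the edge of DP 1.4, so a v1 module looks out of the question and even a v2 module would be working near its limit.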
You seem to be jumping to conclusions here. Have you not entertained the thought that perhaps Nvidia have more than just 2 types of G-SYNC module?
But you're basing your assumptions on a single line on this product page which doesn't really have a clear meaning anyway and is seemingly contradicted by other evidence.
"I don't believe this to be true. I believe the reason for the active cooling is more due to the FALD backlight."
Now that we have a better view of the PCB, we can see exactly what the aforementioned blower fan and heatsink assembly are responsible for: the all-new G-SYNC module.
One other piece of evidence for the use of a G-Sync module here is that the spec announced so far mentions only a single DP and a single HDMI interface. That's a G-Sync limitation, and you would expect to see a second HDMI, maybe a USB Type-C or other connections featured if it was not using a G-Sync module and was in fact a normal adaptive sync screen.
"We'll know one way or another soon enough, but to suggest you're not jumping to conclusions is flat out nonsense."

I'll leave you to your opinion then. I have no interest in arguing this.
I've never seen such walls of text over such a small detail on a monitor, it's mind-boggling!!!
"The comical aspect of this is that if this is a G-Sync compatible monitor, NVIDIA have managed to wipe out every trace of AMD from the marketing lol. Not a hint of FreeSync to be seen. Is this how things go from here?"

As long as it works for everyone, it literally doesn't matter. I doubt AMD even care, to be honest. Just as long as it doesn't result in repercussions for AMD cards, it's fine.
Will a GTX 1080 Ti be able to drive it well if I add a 38GL950G?