Is it still the case that GSync laptops have no additional hardware either end and are actually using adaptive sync (aka freesync)?
Yes
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Is it still the case that GSync laptops have no additional hardware either end and are actually using adaptive sync (aka freesync)?
They have a Gsync module; they are just using the pre-existing eDP specification.
So this Gsync HDR module seems to get more expensive every time it gets mentioned.
Yeah I notice this too. I can't see NV giving these $500/1000/2500 processors away at a loss.
No, laptops do not have a Gsync module.
So if Pascal based chips work in Gsync laptops, they must have the required Adaptive sync hardware, because there is certainly no Gsync module.
Also, this Gsync HDR module seems to get more expensive every time it gets mentioned.
As I said in my reply to you already: laptops already have this functionality through the eDP standard. It's a requirement in the eDP standard and is mainly used for power-saving features. It's only since the release of Adaptive sync in desktop monitors that features that have been in the eDP spec for years are being turned toward gaming.
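To picture what that eDP feature actually amounts to, here's a rough sketch (the numbers and function names are made up for illustration, this is not real driver code): instead of scanning out on a fixed vblank clock, the panel holds the last frame until the GPU has the next one ready, clamped between the panel's minimum and maximum refresh intervals.

```python
import time

# Illustrative numbers for a hypothetical 40-144 Hz variable-refresh panel.
MAX_REFRESH_HZ = 144
MIN_REFRESH_HZ = 40
MIN_FRAME_TIME = 1.0 / MAX_REFRESH_HZ   # ~6.9 ms: can't scan out faster than this
MAX_FRAME_TIME = 1.0 / MIN_REFRESH_HZ   # 25 ms: panel must refresh before this

def present_loop(render_frame, scan_out):
    """Hypothetical VRR presentation loop: scan-out follows the GPU, not a fixed clock."""
    last_scanout = time.monotonic()
    while True:
        frame = render_frame()                   # takes however long it takes
        elapsed = time.monotonic() - last_scanout
        if elapsed < MIN_FRAME_TIME:
            # Frame finished too fast: wait until the panel can accept another refresh.
            time.sleep(MIN_FRAME_TIME - elapsed)
        elif elapsed > MAX_FRAME_TIME:
            # Frame took too long: the panel has been holding / self-refreshing the
            # old image in the meantime, which is the power-saving side of the
            # eDP functionality mentioned above.
            pass
        scan_out(frame)                          # vblank happens now, not on a fixed schedule
        last_scanout = time.monotonic()
```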
I will repeat myself: just because Nvidia's Pascal chips work in laptops without the module doesn't mean they will work with adaptive sync monitors. There is no requirement for Nvidia to make their desktop cards compatible with adaptive sync, but there is a requirement if they want to use their chips in laptops.
And it's Nvidia who are increasing the price of the Gsync module. According to Nvidia themselves the module costs $500 for the new HDR 144Hz Gsync monitors.
So if Pascal based chips work in Gsync laptops, they must have the required Adaptive sync hardware, because there is certainly no Gsync module.
Also, this Gsync HDR module seems to get more expensive every time it gets mentioned.
Here is the article link where PCPer talks about the price breakdown. It says the chip is $2,600 plus 3 GB of DDR4.
https://www.pcper.com/reviews/Graph...z-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea
While there's not a whole lot we can glean from the specs of the FPGA itself, it starts to paint a more clear picture of the current G-SYNC HDR situation. While our original speculation as to the $2,000 price point of the first G-SYNC HDR monitors was mostly based on potential LCD panel cost, it's now more clear that the new G-SYNC module makes up a substantial cost.
It's an unstocked item, without a large bulk quantity price break, but you can actually find this exact same FPGA on both Digikey and Mouser, available to buy. It's clear that NVIDIA isn't paying the $2600 per each FPGA that both sites are asking, but it shows that these are not cheap components in the least. I wouldn't be surprised to see that this FPGA alone makes up $500 of the final price point of these new displays, let alone the costly DDR4 memory.
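Rough back-of-the-envelope maths from the numbers in that quote (only the $2,600 list price and the ~$500 estimate come from the article; the volume discount and DDR4 figure below are my guesses):

```python
# Figures quoted by PCPer, plus guessed values; NVIDIA obviously isn't paying
# the one-off Digikey/Mouser list price per FPGA.
fpga_list_price = 2600          # $ per FPGA at distributor list price (from the article)
assumed_volume_discount = 0.80  # guess: 80% off list at NVIDIA's volumes
ddr4_cost_estimate = 25         # guess: 3 GB of DDR4 for the module

fpga_cost = fpga_list_price * (1 - assumed_volume_discount)
module_cost = fpga_cost + ddr4_cost_estimate
print(f"Estimated module BOM: ~${module_cost:.0f}")  # lands in the ballpark of PCPer's ~$500 figure
```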
nVidia are not silly.
I would imagine that whatever they choose to do, their upcoming hardware will allow them the option to use VESA VRR technologies without GSYNC. It'll just be disabled at a firmware or driver level.
They make money off the Gsync modules that are sold to monitor manufacturers, and in turn this extra cost to the manufacturers is passed on to buyers in the price premium that they have to pay for Gsync monitors.
It's probably similar to why Intel motherboards are always more expensive compared to AMD motherboards: Intel can command a higher price for the chipsets that are used by board manufacturers than AMD can.
I imagine it'll take a market swing to persuade Nvidia to start supporting some form of adaptive-sync, like the heavily rumoured FreeGee* support, and sadly that's likely to be another gen away at best.
*You heard it here first, second at WCCFtech. Actually I made that up, but it sounds a lot better than G-sync lite, honest.
A market swing, or market awareness that TVs are supporting VRR, might shift it. I mean, bru gets in a fizz, resulting in Panos getting in a twist, when the kitchen sink is mentioned, so awareness is out there already.
Could use a young John Travolta white-suit intro type vid for marketing when they eventually take the plunge.
What an intro for Free G: classic Night Fever monitor dancing to the Bee Gees. It would blow FreeSync awareness out of the water overnight.
Link for that please, because I certainly cannot find NVidia saying anything of the sort.
And if you really think that Nvidia cannot get their chips to work with Adaptive-Sync monitors seeing as they already do it with the same chips in the laptops, then you are living in lala land.
I am sorry, I actually thought I said PCPer and I also apologise because sometimes when I am in a hurry I write so quickly that I leave out words or mix up my sentences. I meant to say that it was Nvidia themselves increasing the cost of the Module according to PCPer. Sure they are only estimating the cost, but, it's going to be a lot more than their previous module.
Please point out to me where I said that Nvidia weren't capable of getting their cards to work with Adaptive sync? I never said anything of the sort. I said you can't use their laptop GPUs as proof that their desktop GPUs have the hardware needed to connect to an adaptive sync monitor. GPUs going into a laptop have a different set of requirements than those in a desktop. VRR has been part of the eDP spec for years; if you want to put a GPU into a laptop, you have to adhere to that specification. But if you are building a desktop GPU, you don't need to make it compatible with adaptive sync, as it's only an optional standard, not a requirement like it is for laptops.
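A sketch of the distinction being argued here (everything below is made-up pseudocode, not NVIDIA's driver; it just shows why "works in a laptop over eDP" doesn't prove "will talk to a desktop Adaptive-Sync monitor"):

```python
def supports_vrr(connection, sink_caps, driver_policy):
    """Hypothetical capability check for variable refresh.

    connection:    'eDP' (laptop internal panel) or 'DP' (desktop monitor)
    sink_caps:     what the panel/monitor advertises
    driver_policy: what the GPU vendor chooses to enable
    """
    if connection == 'eDP':
        # Laptops: per the posts above, the eDP link already covers this
        # functionality, so the driver is using what it supports anyway.
        return sink_caps.get('edp_variable_refresh', True)
    if connection == 'DP':
        # Desktops: DisplayPort Adaptive-Sync is optional. The monitor must
        # advertise it AND the vendor must choose to enable it in the driver.
        return sink_caps.get('adaptive_sync', False) and driver_policy.get('allow_adaptive_sync', False)
    return False

# Same silicon, different outcome, purely from spec requirements and driver policy.
print(supports_vrr('eDP', {'edp_variable_refresh': True}, {}))                      # True
print(supports_vrr('DP', {'adaptive_sync': True}, {'allow_adaptive_sync': False}))  # False
```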
Nvidia's current Gsync HDR module is clearly not a long-term solution though; a dedicated ASIC will handle this in the future. Although I also doubt the $500 estimate, since Asus sells a top-of-the-line Gsync HDR 4K 144Hz monitor for $2000.