Nvidia and freesync, hypothetical.

So if Pascal-based chips work in G-Sync laptops, they must have the required Adaptive-Sync hardware, because there is certainly no G-Sync module.

Also, this G-Sync HDR module seems to get more expensive every time it gets mentioned. :rolleyes:
 
Yeah I notice this too. I can't see NV giving these $500/1000/2500 processors away at a loss.

Dunno if the HDR module is any different, and obviously there are R&D costs etc., but last time I looked at it I could source all the component parts for the G-Sync FPGA for single-digit $, and I bet nVidia gets a better deal than I do at the volumes they'd be able to order.
 
So if Pascal-based chips work in G-Sync laptops, they must have the required Adaptive-Sync hardware, because there is certainly no G-Sync module.

Also, this G-Sync HDR module seems to get more expensive every time it gets mentioned. :rolleyes:

As I said in my reply to you already: laptops already have this functionality through the eDP standard. It's a requirement in the eDP standard and is mainly used for power-saving features. It's only since the release of Adaptive-Sync in desktop monitors that features that have been in the eDP spec for years are being turned to gaming.

I will repeat myself: just because Nvidia's Pascal chips work in laptops without the module doesn't mean they will work with Adaptive-Sync monitors. There is no requirement for Nvidia to make their desktop cards compatible with Adaptive-Sync, but there is a requirement if they want to use their chips in laptops.

And it's Nvidia who are increasing the price of the Gsync module. According to Nvidia themselves, the module costs $500 for the new HDR 144Hz Gsync monitors.
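Not Nvidia's actual driver code obviously, but for anyone wondering what "the required Adaptive-Sync hardware" amounts to on the wire, here's a rough sketch in C. A DisplayPort sink advertises that it can tolerate variable timing via a single DPCD capability bit; the register offset and bit name below match what the Linux DRM headers use, while the dpcd_read() helper is made up for illustration.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define DP_DOWN_STREAM_PORT_COUNT 0x007    /* DPCD receiver capability field */
#define DP_MSA_TIMING_PAR_IGNORED (1 << 6) /* sink can ignore fixed MSA timing */

/* Hypothetical AUX-channel read; a real driver would talk to the hardware. */
static uint8_t dpcd_read(uint32_t offset) {
    (void)offset;
    return DP_MSA_TIMING_PAR_IGNORED; /* pretend the sink reports VRR support */
}

/* A sink that tolerates variable MSA timing can do variable refresh. */
static bool sink_supports_adaptive_sync(void) {
    return dpcd_read(DP_DOWN_STREAM_PORT_COUNT) & DP_MSA_TIMING_PAR_IGNORED;
}

int main(void) {
    printf("Adaptive-Sync capable sink: %s\n",
           sink_supports_adaptive_sync() ? "yes" : "no");
    return 0;
}

Whether the desktop driver ever bothers to read that bit is entirely up to Nvidia, which is exactly my point.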
 
As I said in my reply to you already: laptops already have this functionality through the eDP standard. It's a requirement in the eDP standard and is mainly used for power-saving features. It's only since the release of Adaptive-Sync in desktop monitors that features that have been in the eDP spec for years are being turned to gaming.

I will repeat myself: just because Nvidia's Pascal chips work in laptops without the module doesn't mean they will work with Adaptive-Sync monitors. There is no requirement for Nvidia to make their desktop cards compatible with Adaptive-Sync, but there is a requirement if they want to use their chips in laptops.

And it's Nvidia who are increasing the price of the Gsync module. According to Nvidia themselves, the module costs $500 for the new HDR 144Hz Gsync monitors.

Link for that please, because I certainly cannot find NVidia saying anything of the sort.

And if you really think that Nvidia cannot get their chips to work with Adaptive-Sync monitors, seeing as they already do it with the same chips in laptops, then you are living in la-la land.
 
So if Pascal based chips work in Gsync laptops, they must have the required Adaptive sync hardware, because there is certainly no Gsync module.

Also this Gsync HDR module seems to get more expensive every time it gets mentioned.:rolleyes:

The module is expensive. The FPGA chip used costs $2600; the guess of the guy who first posted the article was that maybe Nvidia buys it for as low as $500, but the components of the board cost another $250-300. If you add UK VAT, the price comes to £730 or more.
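For anyone who wants to see where the £730 comes from, the back-of-envelope maths looks like this (the $500 bulk price, the board cost and the exchange rate are all guesses, not known figures):

#include <stdio.h>

int main(void) {
    double fpga_usd    = 500.0; /* guessed bulk price Nvidia pays for the FPGA */
    double board_usd   = 300.0; /* top end of the $250-300 board estimate      */
    double usd_per_gbp = 1.31;  /* assumed USD/GBP exchange rate               */
    double uk_vat      = 1.20;  /* 20% UK VAT                                  */

    double gbp = (fpga_usd + board_usd) / usd_per_gbp * uk_vat;
    printf("Module cost landed in the UK: ~GBP %.0f\n", gbp); /* prints ~733 */
    return 0;
}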
 
Ah, they are using a newer Altera FPGA with the HDR one :( the older one is cheap as, uh, chips these days.

Bit of a joke IMO, as the HDR I've seen from those setups so far didn't convince me at all. After a little while I could see how it was working, and then I couldn't "unsee" it, which completely ruined it.
 
Here is the article link where PCPer talk about the price breakdown. It says the chip is $2600 plus 3GB of DDR4.
https://www.pcper.com/reviews/Graph...z-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea

So we are going with this, then.

While there's not a whole lot we can glean from the specs of the FPGA itself, it starts to paint a more clear picture of the current G-SYNC HDR situation. While our original speculation as to the $2,000 price point of the first G-SYNC HDR monitors was mostly based on potential LCD panel cost, it's now more clear that the new G-SYNC module makes up a substantial cost.

It's an unstocked item, without a large bulk quantity price break, but you can actually find this exact same FPGA on both Digikey and Mouser, available to buy. It's clear that NVIDIA isn't paying the $2600 per each FPGA that both sites are asking, but it shows that these are not cheap components in the least. I wouldn't be surprised to see that this FPGA alone makes up $500 of the final price point of these new displays, let alone the costly DDR4 memory.

Yup, very comprehensive, and clearly NVidia saying that it is NVidia increasing the price of the unit. :rolleyes:
 
nVidia are not silly.

I would imagine that, whatever they choose to do, their upcoming hardware will allow them the option to use VESA VRR technologies without GSYNC. It'll just be disabled at a firmware or driver level.
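In code terms that speculation is something like the sketch below. Nothing here is from Nvidia and the struct and flag names are invented; the point is just that exposing VESA VRR would be a driver decision rather than a silicon one:

#include <stdbool.h>
#include <stdio.h>

struct display_caps {
    bool hw_supports_vesa_vrr; /* what the silicon can actually do  */
    bool driver_allows_vrr;    /* what the driver chooses to expose */
};

/* The capability only reaches the user if both gates are open. */
static bool vrr_enabled(const struct display_caps *c) {
    return c->hw_supports_vesa_vrr && c->driver_allows_vrr;
}

int main(void) {
    struct display_caps caps = { .hw_supports_vesa_vrr = true,
                                 .driver_allows_vrr    = false };
    printf("VRR exposed to the user: %s\n", vrr_enabled(&caps) ? "yes" : "no");
    return 0;
}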

They make money off the Gsync modules that are sold to monitor manufacturers, and in turn this extra cost to the manufacturers is passed on to buyers in the price premium that they have to pay for Gsync monitors.

It's probably similar to why Intel motherboards are always more expensive compared to AMD motherboards: because Intel can command a higher price for the chipsets that are used by board manufacturers than AMD can.

I imagine it'll take a market swing to persuade Nvidia to start supporting some form of adaptive-sync, like the heavily rumoured FreeGee* support, and sadly that's likely to be another gen away at best.

*
Actually I made that up, but it sounds a lot better than G-Sync Lite :rolleyes: honest. :D
You heard it here first, second at WCCFtech :D
 
I imagine it'll take a market swing to persuade Nvidia to start supporting some form of adaptive-sync, like the heavily rumoured FreeGee* support, and sadly that's likely to be another gen away at best.

*
Actually I made that up, but it sounds a lot better than G-Sync Lite :rolleyes: honest. :D
You heard it here first, second at WCCFtech :D

Market swing, or market awareness that TVs are supporting VRR, might shift it. I mean, bru gets in a fizz, resulting in Panos getting in a twist, when the kitchen sink is mentioned, so awareness is out there already.

I'm afraid 'Free G' has been mentioned aaaaaaggggggesss ago in here mate, but FreeGee is a new one. Could use a young John Travolta white-suit intro type vid for marketing when they eventually take the plunge.

What an intro for Free G: classic Night Fever monitor dancing to the Bee Gees. It would blow FreeSync awareness out of the water overnight. :D
 
Market swing, or market awareness that TVs are supporting VRR, might shift it. I mean, bru gets in a fizz, resulting in Panos getting in a twist, when the kitchen sink is mentioned, so awareness is out there already.


For god's sake get it right: I get in a twist and Panos gets in a fizz. I dunno, some people can't even get the simplest things straight. ;)
 
Could use a young John Travolta white-suit intro type vid for marketing when they eventually take the plunge.

What an intro for Free G: classic Night Fever monitor dancing to the Bee Gees. It would blow FreeSync awareness out of the water overnight. :D

You're bang on, that would make for a good campaign. :D
 
Link for that please, because I certainly cannot find NVidia saying anything of the sort.
I am sorry, I actually thought I said PCPer, and I also apologise because sometimes when I am in a hurry I write so quickly that I leave out words or mix up my sentences. I meant to say that it was Nvidia themselves increasing the cost of the module, according to PCPer. Sure, they are only estimating the cost, but it's going to be a lot more than their previous module.

And if you really think that Nvidia cannot get their chips to work with Adaptive-Sync monitors, seeing as they already do it with the same chips in laptops, then you are living in la-la land.

Please point out to me where I said that Nvidia weren't capable of getting their cards to work with Adaptive-Sync. I never said anything of the sort. I said you can't use their laptop GPUs as proof that their desktop GPUs have the hardware needed to connect to an Adaptive-Sync monitor. GPUs going into a laptop have a different set of requirements than those in a desktop. VRR has been part of the eDP spec for years; if you want to put a GPU into a laptop, you have to adhere to that specification. But if you are building a desktop GPU, you don't need to make it compatible with Adaptive-Sync, as it's only an optional standard, not a requirement as it is for laptops.

 
I am sorry, I actually thought I said PCPer, and I also apologise because sometimes when I am in a hurry I write so quickly that I leave out words or mix up my sentences. I meant to say that it was Nvidia themselves increasing the cost of the module, according to PCPer. Sure, they are only estimating the cost, but it's going to be a lot more than their previous module.



Please point out to me where I said that Nvidia weren't capable of getting their cards to work with Adaptive-Sync. I never said anything of the sort. I said you can't use their laptop GPUs as proof that their desktop GPUs have the hardware needed to connect to an Adaptive-Sync monitor. GPUs going into a laptop have a different set of requirements than those in a desktop. VRR has been part of the eDP spec for years; if you want to put a GPU into a laptop, you have to adhere to that specification. But if you are building a desktop GPU, you don't need to make it compatible with Adaptive-Sync, as it's only an optional standard, not a requirement as it is for laptops.


Nvidia's current Gsync HDR module is clearly not a long-term solution though; a dedicated ASIC will handle this in the future. Although I also doubt the $500 estimate, since Asus sells a top-of-the-line Gsync HDR 4K 144Hz monitor for $2000.

I imagine when this becomes mainstream Nvidia will have an ASIC-based module for $20-30. The current GSync module is estimated to cost something like $7-8, which Nvidia sells to the monitor company for about $30-40 including the license fee, which is the true Gsync tax.

It is a bit foolish to think that a $500 module is at all realistic long term. Gsync would die very quickly if that were true.
 
Nvidia's current Gsync HDR module is clearly not a long-term solution though; a dedicated ASIC will handle this in the future. Although I also doubt the $500 estimate, since Asus sells a top-of-the-line Gsync HDR 4K 144Hz monitor for $2000.

You doubt the $500 estimate? The part costs $2600 if you were to buy it yourself. Nvidia paying $500 for the module seems pretty realistic to me. Now add the cost of the 3GB of memory and whatever Nvidia charges to program the FPGA. It would be very surprising if the monitor manufacturer was paying Nvidia less than $700 for it.

$2000 is expensive for a 27-inch monitor.

As for the current Gsync tax being $30-$40, haha, yes, good joke.

And I didn't make any claims about what Nvidia's long-term plans are.
 
Please point out to me where I said that Nvidia weren't capable of getting their cards to work with Adaptive-Sync. I never said anything of the sort. I said you can't use their laptop GPUs as proof that their desktop GPUs have the hardware needed to connect to an Adaptive-Sync monitor. GPUs going into a laptop have a different set of requirements than those in a desktop. VRR has been part of the eDP spec for years; if you want to put a GPU into a laptop, you have to adhere to that specification. But if you are building a desktop GPU, you don't need to make it compatible with Adaptive-Sync, as it's only an optional standard, not a requirement as it is for laptops.

They are the same chips. :rolleyes:
 