Nvidia and freesync, hypothetical.

Man of Honour
Joined
13 Oct 2006
Posts
91,040
You realise there are things that they can't physically tear down, right? It doesn't run on orange juice.

Not sure what your point is? People have opened up G-Sync monitors and identified the circuitry, which is based around an off-the-shelf FPGA and a few other readily available parts that other companies can order (I mean, people can compare the reference product(s) to the nVidia implementation to see any changes; it isn't as if there is some mystery black box or dark arts involved) - there is minimal customised or proprietary circuitry; it's mostly the reprogramming of the FPGA that provides the G-Sync functionality.

The hardware cost, as far as the bill of materials goes, is easy enough to compile - what can't really be guessed at are things like the software development costs and ongoing software support, etc.
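As a toy illustration of what "compiling the bill of materials" means here - the line items and prices below are entirely made up for the example, not nVidia's actual sourcing:

```python
# Hypothetical bill of materials for a module built around an
# off-the-shelf FPGA. Every part name and price here is illustrative,
# purely to show how a per-unit hardware cost gets tallied up.
bom = {
    "FPGA (off-the-shelf)": 5.00,
    "DRAM": 2.00,
    "Power regulation / passives": 1.00,
    "PCB + assembly": 1.00,
}

# Per-unit hardware cost is just the sum of the line items;
# software development and licensing are deliberately excluded.
unit_cost = sum(bom.values())
print(f"Estimated hardware cost per unit: ${unit_cost:.2f}")
```

At volume pricing the line items shrink, but the method is the same - which is why the hardware side is easy to estimate while the software/licensing side isn't.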

EDIT: Obviously this is talking about the existing G-Sync module and not the new HDR variant.
 
Soldato
Joined
22 Nov 2006
Posts
23,356
If GeForce supported FreeSync then G-Sync would die, no question about it.

They are artificially keeping it alive by making sure adaptive sync doesn't work on their cards. As someone who owns both, I don't see a reason why G-Sync is worth the extra £100+. It's really obsolete tech which you're forced to use if you buy Nvidia.
 
Last edited:
Soldato
Joined
5 Sep 2011
Posts
12,812
Location
Surrey
Not sure what your point is? People have opened up G-Sync monitors and identified the circuitry, which is based around an off-the-shelf FPGA and a few other readily available parts that other companies can order (I mean, people can compare the reference product(s) to the nVidia implementation to see any changes; it isn't as if there is some mystery black box or dark arts involved) - there is minimal customised or proprietary circuitry; it's mostly the reprogramming of the FPGA that provides the G-Sync functionality.

The hardware cost, as far as the bill of materials goes, is easy enough to compile - what can't really be guessed at are things like the software development costs and ongoing software support, etc.

EDIT: Obviously this is talking about the existing G-Sync module and not the new HDR variant.

The point is, you're not qualified to comment on how it works simply because you've read just enough to break down what it's comprised of. There's a lot happening at the display output that you don't understand.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,040
The point is, you're not qualified to comment on how it works simply because you've read just enough to break down what it's comprised of. There's a lot happening at the display output that you don't understand.

Still not sure of the point? The context is the material cost of the hardware to nVidia - what is and isn't happening at the display output is entirely irrelevant to that.

You also have little to no idea what I am and am not qualified or experienced in.
 
Soldato
Joined
5 Sep 2011
Posts
12,812
Location
Surrey
Still not sure of the point? The context is the material cost of the hardware to nVidia - what is and isn't happening at the display output is entirely irrelevant to that.

Only it's not, as that would mean you're taking things such as the licensing at face value based purely on the cost of the materials. What is happening at the display output is arguably far more relevant. NVIDIA is in large part a software firm, and everything should be considered when one is trying to justify the cost, licensing included.

I can tell by the way you're approaching this that you're not qualified. So I've got a better idea of *that* than you do of this topic.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,040
Only it's not, as that would mean you're taking things such as the licensing at face value based purely on the cost of the materials. What is happening at the display output is arguably far more relevant. NVIDIA is in large part a software firm, and everything should be considered when one is trying to justify the cost, licensing included.

I can tell by the way you're approaching this that you're not qualified. So I've got a better idea of *that* than you do of this topic.

I qualified the exclusion of the cost of software development, including licensing, in my original post - largely the hardware used is off-the-shelf solutions that are generally available without any significant costs involved in licensing the supporting software, if applicable, though some of the development tools and kits for the FPGA can be a bit more costly. But again, I was expressly talking about the hardware cost.

Again you have no idea what I'm qualified or experienced with or not.

EDIT: Actually, in the first post you commented on I didn't touch on the software side - but if you go back a bit through my older posts on the subject, and the post I replied to you with, I did.

PS: Instead of being all "I know better than you about it", why not chip in on threads like this https://forums.overclockers.co.uk/threads/do-current-nvidia-cards-have-onboard-sound.18825994/ where your superior knowledge of embedded solutions would be useful.
 
Last edited:
Soldato
Joined
5 Sep 2011
Posts
12,812
Location
Surrey
I qualified the exclusion of the cost of software development, including licensing, in my original post - largely the hardware used is off-the-shelf solutions that are generally available without any significant costs involved in licensing the supporting software, if applicable, though some of the development tools and kits for the FPGA can be a bit more costly. But again, I was expressly talking about the hardware cost.

Again you have no idea what I'm qualified or experienced with or not.

EDIT: Actually, in the first post you commented on I didn't touch on the software side - but if you go back a bit through my older posts on the subject, and the post I replied to you with, I did.

Not sure what your point is? People have opened up G-Sync monitors and identified the circuitry, which is based around an off-the-shelf FPGA and a few other readily available parts that other companies can order (I mean, people can compare the reference product(s) to the nVidia implementation to see any changes; it isn't as if there is some mystery black box or dark arts involved) - there is minimal customised or proprietary circuitry; it's mostly the reprogramming of the FPGA that provides the G-Sync functionality.

You weren't very sure here. In fact, this whole post alludes to there being nothing of merit, and yet the alternative vendor implementation is still not quite as good. So evidently there is a point being muffled in translation here. That is to say: what does it matter how much the ASIC and hardware cost, and how is it really relevant to the experience?
 
Man of Honour
Joined
13 Oct 2006
Posts
91,040
You weren't very sure here. In fact, this whole post alludes to there being nothing of merit, and yet the alternative vendor implementation is still not quite as good. So evidently there is a point being muffled in translation here. That is to say: what does it matter how much the ASIC and hardware cost, and how is it really relevant to the experience?

I think you are talking at cross purposes - I was adding commentary on the hardware side, hence using phrases like "hardware wise" in my response to melmac's post - I didn't touch on the licensing part of his post as I can only guess at what additional cost comes from other development, while the hardware itself I can get quotations on.
 
Soldato
Joined
5 Sep 2011
Posts
12,812
Location
Surrey
That's fair enough, I was probably putting my own emphasis on your post as it just seemed as though you were downplaying the technology based on things like the origin of the module.
 
Soldato
Joined
19 Dec 2010
Posts
12,026
The original G-Sync module (people have done teardowns) is, hardware-wise, an off-the-shelf Altera FPGA with minor modification by nVidia. I can source all the hardware for around $9/unit currently if buying in multiples of 5,000.

Not sure the point you are making. I never claimed that the original Gsync module was anything more than $10. In the line you quoted from my post I even state it's the cost of module + Licence

I've not checked the latest one but I suspect people are getting the pricing mixed up with the development kit.

And you would suspect wrong. The price of the new part is easily obtained, just the same as you did for the original Altera FPGA.


You'll have to go a lot warmer than that.

No doubt.

Technically speaking Drunkenmaster coined it.

That's what he would have you believe :p but, no, he didn't.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,040
Not sure the point you are making. I never claimed that the original Gsync module was anything more than $10. In the line you quoted from my post I even state it's the cost of module + Licence

I was merely adding commentary on the hardware side (hence, as above, talking about "hardware wise", etc.); it wasn't disagreeing with your post. I can only guess at licensing costs as I have no idea what agreements nVidia might have in place (long-term service agreements, etc., if they use the company on an ongoing basis) that might modify that, rather than the normal licensing, support, etc. costs.

And you would suspect wrong. The price of the new part is easily obtained, just the same as you did for the original Altera FPGA.

I don't know anywhere that has the actual board available; I can only get the dev kit prices, which are similar to the dev board pricing of previous ones ($4-8K), so I have no idea what these new ones actually cost hardware-wise.
 
Soldato
Joined
19 Dec 2010
Posts
12,026
I was merely adding commentary on the hardware side (hence, as above, talking about "hardware wise", etc.); it wasn't disagreeing with your post. I can only guess at licensing costs as I have no idea what agreements nVidia might have in place (long-term service agreements, etc., if they use the company on an ongoing basis) that might modify that, rather than the normal licensing, support, etc. costs.

Ah, ok. I actually thought you had misquoted me and you were actually replying to mid_gen.

I don't know anywhere that has the actual board available; I can only get the dev kit prices, which are similar to the dev board pricing of previous ones ($4-8K), so I have no idea what these new ones actually cost hardware-wise.

The second module uses the Altera Arria 10 GX 480 FPGA. You can find them at several online distributors.

Just did a quick check, the Dev Kit costs $4500. The FPGA costs $2600.
 
Last edited:
Man of Honour
Joined
13 Oct 2006
Posts
91,040
Just did a quick check, the Dev Kit costs $4500. The FPGA costs $2600.

I see Mouser actually has some in stock now at £900-1500 - a few days ago nowhere I tried had them in stock [prices were just placeholders] and/or what was listed was actually the development kit. Nowhere I'd source stuff like that from for projects, though, has anything other than the development kit at the moment.
 
Soldato
Joined
19 Dec 2010
Posts
12,026
I see Mouser actually has some in stock now at £900-1500 - a few days ago nowhere I tried had them in stock [prices were just placeholders] and/or what was listed was actually the development kit. Nowhere I'd source stuff like that from for projects, though, has anything other than the development kit at the moment.

Not the same part though. Here you go: 10AX048H2F34E1HG. £1900, and that's not a placeholder price either. If you try to buy them, that's the price you are paying, and it's 3 minimum.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,040
Have to say I'm pretty unimpressed, given that what I've seen of HDR via these kinds of solutions I personally found distinctly lacking. At first it seems quite nice, but after a while I couldn't un-notice how much trickery was being used, making it a bit of a joke given the price for something that isn't a complete solution.
 
Associate
Joined
31 Oct 2012
Posts
2,240
Location
Edinburgh
Isn't it the case that G-Sync will sync at lower frame rates than Adaptive Sync / FreeSync? In which case there would still be a market (albeit smaller) for those wanting that capability.

Nope, they can both go to as low a slideshow as you like in theory, and in practice both go to the panel minimum then start frame doubling.

On first release adaptive sync surprisingly didn't frame double, but that's long since been resolved.
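For what it's worth, the frame-doubling behaviour described above can be sketched roughly like this (a simplified illustration, not any vendor's actual algorithm; the panel range in the example is made up):

```python
def effective_refresh(fps, panel_min, panel_max):
    """When the game's frame rate drops below the panel's minimum
    refresh, repeat each frame enough times (doubling, tripling, ...)
    to bring the refresh rate back into the panel's supported range.
    Returns (refresh_hz, repeats_per_frame)."""
    if fps > panel_max:
        return panel_max, 1  # capped at the panel's max refresh
    multiplier = 1
    while fps * multiplier < panel_min:
        multiplier += 1
    return fps * multiplier, multiplier

# e.g. a hypothetical 40-144 Hz panel with the game at 25 fps:
# each frame is shown twice, so the panel refreshes at 50 Hz
print(effective_refresh(25, 40, 144))
```

So in theory the game can run as slowly as it likes; the panel itself never has to refresh below its minimum.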
 
Last edited: