AMD FreeSync coming soon, no extra cost... shocker

So Nvidia fans, are you once again happy to be charged extra for something monitor makers already fully intended to support for free? Only Nvidia hardware buyers will be forced into the extra cost, or locked out via drivers, on a product you've already paid for.

Well Anand said it didn't look as good as what Nvidia have shown so far, and you can bet your ass this was an absolute current best-case scenario from AMD.

Which is the better bet: a more expensive but better solution, or a cheaper but worse one? You pay your money and make your choice. Much like 3DVision mentioned in the OP. Was it more expensive and locked down by Nvidia? Most certainly. Did it have better software support, and was it a more user-friendly solution? Yes. I'll certainly take this from AMD, but I just hope it gets proper support and isn't just a 'me too' attempted spoiler.

I expect this to be at least 10 pages by tomorrow morning (given the Fox News-style 'fair and balanced' OP), with the usual manchildren having the same tired old arguments over yet another thread.
 
You know, I'm thinking if I won the lottery I'd buy one. I might buy a case like that if I see one, though, and make a little Kaveri computer.

I'm not keen on the case myself, but I'll admit it's interesting, and apparently it does help to reduce noise. We may see cases of this style being released in the near future due to this release by Apple, although it looks like it could be quite difficult to do a build in a case like this.

They all work for Jen-Hsun, apparently. :rolleyes:

Lol, of course, how could I forget that? :D
 
Another Nvidia vs AMD "debate"

The way I see it, we all win either way. It did take Nvidia doing G-Sync for AMD to announce their FreeSync, though.
I have AMD but have always liked Nvidia's stuff too. The only thing with Nvidia is that the good stuff comes at a price, whereas AMD get there in the end but don't charge. As with anything, if you want it now, you have to pay.

Hope AMD's version is good and can be used on a lot of screens.
 
Well Anand said it didn't look as good as what Nvidia have shown so far, and you can bet your ass this was an absolute current best-case scenario from AMD.

Which is the better bet: a more expensive but better solution, or a cheaper but worse one? You pay your money and make your choice. Much like 3DVision mentioned in the OP. Was it more expensive and locked down by Nvidia? Most certainly.

The software support was for 3DVision branding and had nothing to do with any technology difference: it's a 120Hz screen and a pair of glasses that sync to that 120Hz, nothing more or less. Screens were locked out of 3D mode based on their device ID; there was nothing different in the hardware. The software stack doesn't change the fact that the hardware was identical: either the screen paid the 3DVision branding charge, or it didn't.

Likewise, Anandtech did NOT say G-Sync was better; it said Nvidia had a better demo, nothing more or less. They also said specifically that the biggest downside to G-Sync was lack of monitor support, and FreeSync appears to be on the path to fixing that.

This was also two notebooks with Kabinis in them, not a GTX 780 being used. They were existing retail-bought laptops that supported the feature.

I'm sure Nvidia and AMD will improve their drivers for it in the future, but they didn't say the feature was worse than G-Sync, only that the demo wasn't as good. It was run on vastly different hardware and very different screens.

Nvidia won't keep having an extra hardware cost, or better hardware; they won't continue to add a $150 cost through an FPGA that monitor makers certainly don't want to use long term.

An FPGA is essentially a slower custom chip you can program to act (to a degree) in any way you like. It will use vastly more power and hugely more transistors than a fixed-function silicon solution. The trade-off is that programming an existing FPGA is just stupidly faster than waiting for a full product tape-out and chips to be ready. My opinion then, and even more so now (because there is essentially no other reason to use an FPGA here, nor to stick it in and replace an existing chip in an existing monitor), is that this was about time to market: they did it to get monitors out first and to be able to lock them in.

In the future, when given the choice between that final silicon being available at a fraction of the price and power usage versus the expensive FPGA, it's no contest.

Future screens will have this feature as standard in the normal controller, which costs say $5 a screen, and won't use a $100 FPGA instead. In the future Nvidia will be using standard hardware for this feature. They could possibly stop locking Nvidia users to G-Sync-branded screens; there's more chance of that if their users kick up a fuss. As with 3DVision, SLI mobos and other stuff, they will likely continue to charge the likes of Asus, say, $10 a screen to be branded compatible rather than locked out via drivers.
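For anyone who hasn't seen either demo, here's a rough sketch in Python of why any variable refresh scheme (G-Sync or FreeSync) matters, with invented frame times; it's a toy model, not anything like real driver code. With classic v-sync, a frame that misses a scanout has to wait for the next fixed 16.7ms tick, so uneven render times turn into visible judder. With variable refresh, the screen simply updates when the frame is ready.

```python
# Toy comparison: when rendered frames actually appear on screen under a
# fixed 60 Hz refresh versus a variable-refresh display. Times are in
# milliseconds; the frame timings are invented for illustration.

def fixed_refresh_display_times(frame_ready, period=1000 / 60):
    """Classic v-sync: each frame waits for the next scheduled scanout."""
    shown = []
    for t in frame_ready:
        # Round up to the next scanout boundary at or after frame completion.
        ticks = int(t // period)
        if t % period > 0:
            ticks += 1
        shown.append(ticks * period)
    return shown

def variable_refresh_display_times(frame_ready):
    """Variable refresh: the frame is shown as soon as it is rendered
    (ignoring the panel's min/max refresh limits for simplicity)."""
    return list(frame_ready)

# Frames rendered at slightly uneven ~45 fps intervals.
ready = [22.0, 44.5, 68.0, 90.0]
print(fixed_refresh_display_times(ready))     # quantised to 16.7 ms steps: judder
print(variable_refresh_display_times(ready))  # matches render times: smooth
```

Note how, in the fixed case, the gaps between displayed frames alternate between one and two refresh periods even though the render times are nearly even; that alternation is the stutter both vendors are trying to eliminate.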
 
Maybe it's down to my interpretation of his words, but that looked like an 'I'm not wearing it' type of comment. He had quite a few posts early on in the thread, including stating that G-Sync was only being demoed running on a 760, IIRC, which is along my train of thought: high-end GPUs don't need it, IMO.

Never followed the thread too closely, but he was generally negative towards G-Sync from what I remember.

Gibbo hadn't actually seen Gsync though at the point he was talking about it. I'm not sure if he has now or not.

Not sure, for me personally, if this is something I'm going to be interested in either way, to be honest. I need to see it, and I'm not going to pay for a kit for my existing monitor, nor am I going to get rid of my Dell monitor for it. Monitors are like five-year investments; people don't chop and change them. It may be something I look at when I next buy a monitor, but I shan't be getting rid of either of mine.

Gibbo didn't more or less say it was a con, though. He will be stocking G-Sync monitors after all, which wouldn't be in line with a con. :p I think he's being negative about it initially purely because he isn't getting any Asus G-Sync monitors from day one. Sales people will generally be like that. :)

Guilty as well, but sometimes you read stuff on here that makes you want to bang your head against the screen.

Ha ha. Too true.

Likewise, Anandtech did NOT say G-Sync was better; it said Nvidia had a better demo, nothing more or less. They also said specifically that the biggest downside to G-Sync was lack of monitor support, and FreeSync appears to be on the path to fixing that.

Well that's kind of what people have to go on at the moment. I don't believe for a second the kind of crap that gets briefed at conferences. I like details and real things which I can perceive or touch. Of course that comes later down the line :).
 
It's all about the traffic flowing your way baby, play ball and go with the flow...

Gullible?

Some, yes; the astute ones, however, know how the game is played. ;):D

So no chance they actually thought it was impressive? I think it's great that AMD are embracing this monitor refresh, but I don't for one minute think they would have jumped on this if nVidia hadn't already done it.
 
So no chance they actually thought it was impressive? I think it's great that AMD are embracing this monitor refresh, but I don't for one minute think they would have jumped on this if nVidia hadn't already done it.

So they've had it in drivers waiting on monitor makers... but were never going to do it till Nvidia launched it? Nvidia launched a way to do variable refresh using an industry-standard method that screens haven't adopted yet, but Nvidia was the only one that was ever going to use it? Even though it's going to be absolutely bog standard in mobile devices for power saving, AMD and Intel were going to utterly ignore it?

Tosh, it's fairly clear that this was coming. Let's say everyone in the industry knew this was arriving in June as standard on all new monitors launched after that date; Nvidia decided: let's do it in an FPGA (quicker, vastly more expensive) and launch early.

If they hadn't done this, AMD, Nvidia and Intel would all have supported it in June anyway. As also shown during the G-Sync threads, AMD people, and many people generally, have talked about the need for something better than v-sync for seemingly years.
 
Well Anand said it didn't look as good as what Nvidia have shown so far, and you can bet your ass this was an absolute current best-case scenario from AMD. Which is the better bet: a more expensive but better solution, or a cheaper but worse one? You pay your money and make your choice. Much like 3DVision mentioned in the OP. Was it more expensive and locked down by Nvidia? Most certainly. Did it have better software support, and was it a more user-friendly solution? Yes. I'll certainly take this from AMD, but I just hope it gets proper support and isn't just a 'me too' attempted spoiler.

I expect this to be at least 10 pages by tomorrow morning (given the Fox News-style 'fair and balanced' OP), with the usual manchildren having the same tired old arguments over yet another thread.

Where in the link do they say this? Because I can't find where they say it's not as good.

Do you mean this? "AMD’s demo isn’t quite as nice as NVIDIA’s swinging pendulum"
Because that isn't aimed at the sync performance; it's more that the pendulum is a higher-quality demo.
 
@DM

Many, many words, none of which respond to what I actually said. Now that's impressive.

I'll bite though. You rant about the hardware being the same but having to pay the Nvidia tax... Well yes, that's how many things work. How much hardware is out there that is feature-limited by the software? Limited by what features you pay for. Like I said, if you paid the extra for 3DVision you got a superior solution. For some reason this causes you much anger.

I'll leave this thread now and follow the sensible chaps on B3D. You can bet it will have a far better signal-to-noise ratio, minus all the tedious point scoring and brand cheerleaders.
 
http://translate.googleusercontent....t.html&usg=ALkJrhglN8k7pgIe-Tmm9MZJ-iVFlrnkag

In our conversations with AMD, we of course discussed G-SYNC, and whether AMD is ready to propose an alternative. With a big smile, AMD ran a demonstration on two laptops, one equipped with a variable refresh rate, just as G-SYNC provides.

To our surprise, AMD then explained that this display technique had already been in development for some time, but mainly for other reasons: to reduce display power consumption on mobile systems. AMD also pointed out that a VESA standard for this already exists, and that the consortium was very surprised by Nvidia's announcement of G-SYNC.

Many panels would therefore already support variable refresh rates via a variable-length VBLANK period (exactly what G-SYNC does). If the display controller supports it, which is not a problem according to AMD, then it is enough for the GPU's display engine and drivers to do the same. AMD says its recent GPUs are capable of this and that software support is also ready, even if the interface to access it is not yet in place.

AMD has put together a small demo that shows it all in action, and it appears to work just as well as G-SYNC: no extra hardware, no additional cost! AMD drives the point home against Nvidia, stating that it does not intend to add an extra cost to use a technology that is already a standard. Hence why it is now calling it FreeSync.

AMD does not yet know when or how this display technology will be offered to Radeon users, or which screens on the market actually incorporate this capability. AMD seems to have been surprised by G-SYNC, which according to its leadership was a way for Nvidia to gain a few months on the arrival of an expected standard technology.

It is not impossible that a firmware update could enable this support on some screens already on sale, and it should then come at no extra cost on future screens. Moreover, this approach lends itself to fairly simple use in the mobile world, which could benefit from it more quickly.

This is news to us and of course a big surprise. We are trying to learn more, and our current assumption is that Nvidia's GPUs may not directly support this capability, which would explain the use of the G-SYNC module.

It seems the VESA consortium were surprised by what Nvidia did!
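For anyone wondering what "a variable-length VBLANK period" means in practice, here's a rough Python sketch of the idea. The panel limits and timings are invented for illustration, and this is nowhere near real display-controller code: the point is just that the display sits in its blanking interval until the next frame is ready, within the panel's minimum and maximum refresh periods, instead of refreshing on a fixed clock.

```python
# Toy model of variable VBLANK: the display stays in its blanking interval
# until the GPU has the next frame ready, clamped to the panel's limits.
# All numbers are invented for illustration; times are in milliseconds.

def next_scanout(frame_ready_ms, last_scanout_ms,
                 min_period=6.94, max_period=33.3):
    """Return when the panel scans out the next frame.

    min_period ~ a 144 Hz panel's fastest refresh; max_period ~ 30 Hz,
    the longest the panel can hold VBLANK before it must refresh anyway.
    """
    earliest = last_scanout_ms + min_period  # panel can't refresh faster
    latest = last_scanout_ms + max_period    # panel must refresh by here
    # Scan out as soon as the frame is ready, clamped to the panel limits.
    return min(max(frame_ready_ms, earliest), latest)

# A frame rendered 20 ms after the previous scanout is shown at 20 ms:
print(next_scanout(frame_ready_ms=20.0, last_scanout_ms=0.0))  # 20.0
# A frame ready after only 3 ms waits for the panel's minimum period:
print(next_scanout(frame_ready_ms=3.0, last_scanout_ms=0.0))   # 6.94
# If no frame arrives in time, the panel refreshes at its maximum period:
print(next_scanout(frame_ready_ms=50.0, last_scanout_ms=0.0))  # 33.3
```

If the hardware.fr description is right, this clamping is essentially what both G-SYNC and FreeSync do; the difference is whether the logic lives in a dedicated module in the monitor or in the GPU's display engine and a standard panel controller.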
 
So no chance they actually thought it was impressive?

They very well may have thought so, but when you are on the 'good guy' list and get invited to the Nvidia tech show that unveiled G-Sync, you aren't going to say 'what a lot of *****', are you, unless you want to be put on Nvidia's naughty list. ;)

I think it's great that AMD are embracing this monitor refresh, but I don't for one minute think they would have jumped on this if nVidia hadn't already done it.

They are only jumping on it for the simple reason that, in their view, paying for it is a bump.

The underlying tech has been available at the driver level via software; it probably wasn't a priority until you could show up your competitor for taking their customers for a ride.
 

Yes, they were; I wasn't. There was a standard coming that was going to be supported by everyone on every monitor, so Nvidia came along, branded their own version and decided to massively increase the costs, only for their own users... to get it a couple of months early, while also pretending they came up with it.

VESA would be surprised that someone would take what was a standard feature and pretend it was their own; anyone who has ever read anything about Nvidia should absolutely not be surprised, though. It's what they do.

Layte, it's okay: you seemingly purposefully misinterpreted what Anandtech said, then mentioned 3DVision hardware as a reason the software was different, and decided that my responding directly to both these points wasn't relevant... a sure sign that you have no argument. I'll miss you.
 
I was going to quit reading GC for good, because the same old posters cannot present a reasoned and factual post on ANYTHING to do with either AMD or nVidia without throwing in a truckload of flame bait, opinion and, worst of all, their raging hard-on for bashing the opposing side.

Rather than quit reading, my ignore list just expanded, including the OP. Think of all that saved GPU power not having to render all those words! Excellent.
 