A Week With NVIDIA's G-SYNC Monitor

Just pointing out again - DM says you can't patent the idea of a GPU dynamically updating a monitor and telling it what the refresh rate should be.

I had a few spare minutes and managed to find this:

http://patft.uspto.gov/netacgi/nph-...h&OS=nvidia+AND+refresh&RS=nvidia+AND+refresh

Basically, it's a patent on watching a display buffer, using the condition that it is updated by a command source (GPU or CPU), and then adjusting the monitor's refresh rate to match that command source.
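For illustration only, here is a minimal sketch of the mechanism that description suggests: a display controller that refreshes when the command source signals a buffer update, rather than on a fixed clock, clamped to the panel's supported range. The class, method names and the example refresh limits are my own invention, not taken from the patent text:

```python
import time

# Hypothetical sketch (not from the patent text): instead of refreshing on a
# fixed clock, the controller waits for the command source (GPU or CPU) to
# signal that the display buffer has been updated, then drives a refresh.
class VariableRefreshController:
    def __init__(self, min_hz=30, max_hz=144):
        self.min_interval = 1.0 / max_hz   # can't refresh faster than this
        self.max_interval = 1.0 / min_hz   # must refresh at least this often
        self.last_refresh = time.monotonic()

    def on_buffer_updated(self):
        """Called by the command source when it finishes writing a new frame."""
        elapsed = time.monotonic() - self.last_refresh
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)   # panel not ready yet
        self._refresh()

    def poll(self):
        """Called periodically; repeats the last frame if no update arrived."""
        if time.monotonic() - self.last_refresh >= self.max_interval:
            self._refresh()

    def _refresh(self):
        self.last_refresh = time.monotonic()
        print(f"refresh at {self.last_refresh:.3f}s")

# Simulated GPU presenting frames at an irregular rate (roughly 40-70 fps):
if __name__ == "__main__":
    import random
    ctrl = VariableRefreshController()
    for _ in range(5):
        time.sleep(random.uniform(1 / 70, 1 / 40))   # variable render time
        ctrl.on_buffer_updated()                      # refresh tracks the GPU
```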

If you search for Nvidia and refresh there are quite a lot of hits; this was about the 3rd or 4th on the list and I couldn't be bothered to look at the others, so there might be more that also relate to G-Sync.

So much for Nvidia not being able to patent the idea of a graphics card telling the monitor when to refresh, though holding the patent also obliges them to license it if it is deemed essential to the industry.

USPTO says otherwise
 
Why would Nvidia do this? Surely it will result in a loss of money as they won't be selling the more powerful cards; most people will now get mid-tier as it will be smooth regardless.

On some games and at some resolutions even top-tier cards dip below whatever ideal fps you've set for yourself... G-Sync means leaving those settings turned up and still getting smooth frames.

Just look at Metro: Last Light - it's a killer on pretty much any graphics card.
 
Why would Nvidia do this? Surely it will result in a loss of money as they won't be selling the more powerful cards; most people will now get mid-tier as it will be smooth regardless.

This is true and something that Jen mentioned at the Montreal conference. No need to splash out on the top cards if you have a G-Sync monitor and a Kepler card or above :)

Would I go mid-range? Probably not, as I love benching, but it was mentioned that it can make 45 fps feel 50% faster. That is not to be sniffed at.
 
No, the R&D cost on this is minuscule. Nvidia have decided to make what is effectively their own, exceptionally expensive monitor controller chip and pass that cost on to their customers. Monitor makers WILL integrate the incredibly basic tech into their future controllers at essentially no additional cost.

In the future this means all monitors support it and there's no need to license anything; AMD just send out frames as they are ready with a tiny bit of frame pacing. This is trivial stuff.
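To be concrete about what "a tiny bit of frame pacing" could mean, here is a rough sketch - purely my own illustration, not anything AMD has published, with made-up names and thresholds. The idea is to hold back a frame that arrives much earlier than the recent average, so the display sees an even cadence rather than a fast frame landing right after a slow one:

```python
import time

# Hypothetical frame-pacing sketch: delay unusually early frames so that
# frame-to-frame presentation intervals stay close to a smoothed average.
class FramePacer:
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.avg_interval = None
        self.last_present = None

    def present(self):
        now = time.monotonic()
        if self.last_present is not None:
            interval = now - self.last_present
            if self.avg_interval is None:
                self.avg_interval = interval
            else:
                # exponential moving average of recent frame times
                self.avg_interval = (self.smoothing * self.avg_interval
                                     + (1 - self.smoothing) * interval)
            # if this frame arrived much earlier than the running average,
            # hold it back a little to keep the cadence even
            if interval < 0.8 * self.avg_interval:
                time.sleep(0.8 * self.avg_interval - interval)
        self.last_present = time.monotonic()
        return self.last_present

if __name__ == "__main__":
    import random
    pacer = FramePacer()
    for _ in range(10):
        time.sleep(random.uniform(0.010, 0.025))   # simulated render time
        print(f"presented at {pacer.present():.3f}s")
```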

The only issue is the wait for monitor makers to get their new ASICs and launch a new model. Designing and releasing an ASIC isn't a three-month process (but we have no idea when they started).

The only reason AMD would want to license G-Sync would be to have access to the monitors containing Nvidia's insanely expensive FPGA chip. For the future integrated monitors, Nvidia can't do **** to stop AMD or monitor makers enabling this mode under any other non-trademarked name (and if they get together and call it free-sync I may just die laughing).

Nvidia users will likely continue to be charged extra for G-Sync branded screens. Compare 3DVision: the screens do nothing fancy or different from any other 3D screens, Nvidia don't do anything different with 3D and Nvidia can't patent 3D, yet they trademark 3DVision, lock their own customers out of free choice of screen (all with identical technology) and only allow those screens which pay to be 3DVision branded to work with the driver.

Nvidia likely have a short-term time-to-market advantage due to their FPGAs - good for them. Long term, AMD users WILL have the same thing for free on ANY screen, including future G-Sync screens (the same way I can use a 3DVision branded screen for 3D with an AMD card), because those future monitors will no longer have the FPGA made and controlled by Nvidia, but the normal standard controller ASIC.

So long term I see it as almost identical to 3D/3DVision: it works on any 3D screen for AMD, while Nvidia will lock out non-paying screens. I can't see a single reason this won't be the case longer term. The short-term FPGA solution is a clever one, but it is expensive and can't possibly be a long-term solution for any sensible monitor maker. No monitor maker wants to design a screen and then have a version which bypasses built-in pieces and uses an extra, overly expensive external chip.

Hi Charlie ;)

Why has AMD not implemented this 'trivial stuff' up to now?
 
Hi Charlie ;)

Why has AMD not implemented this 'trivial stuff' up to now?

Why hasn't Nvidia done it till now? Why did AMD only just draw up a 4K standard and hand it to VESA, when people have been making 4K screens for a while now without a standard?

It's amazing what the industry just won't bother doing when someone doesn't grab everyone by the balls and drag them kicking and screaming to the next innovation.

There was nothing remotely difficult about a VESA standard for 4K; it could and should have been done five years ago, but that usually counts for little.

As for the Charlie remarks, it's unsurprising that they pop up whenever someone doesn't kiss Nvidia's ass but instead uses knowledge, history and Nvidia's own track record to say what they think will happen.

Is it untrue what I said about 3DVision? Is it not something every 3D screen can do, and do Nvidia not lock their own users out of using it unless the screen maker pays to certify it as 3DVision, despite there being precisely no hardware difference at all? Did they not do this for years with SLI until they got shoved out of the chipset game? Do Nvidia not have a history, whenever something becomes standard, of finding a way to charge their own users to use a feature they advertise? If you can honestly say Nvidia doesn't do these things, good for you.

If those things are in fact true, and this screams of the same situation coming up in the future with G-Sync, then how exactly am I "being Charlie" by pointing it out? Exactly what did I say that was negative, inaccurate or completely unlikely to happen?
 
It's pure coincidence that DM's posts on here took a massive anti-Nvidia swing at the same time Charlie's website did, and that DM started linking to it.

It amazes me that this tech is written off so readily, with claims that AMD can knock it up free of charge for its customers, yet the question which PGI asked earlier just got ignored and we have another rant instead.

This could be rubbish and expensive, and AMD could do a better job with Mantle; however, I won't write it off until I see it for myself. I don't feel all the hype surrounding G-Sync from those who have seen and used it would be so high if it wasn't good.

I just see this as another "Charlie hates Nvidia" rant
 
It's pure coincidence that DM's posts on here took a massive anti-Nvidia swing at the same time Charlie's website did, and that DM started linking to it.

Which thing did I link to on Charlie's website precisely, and which anti-Nvidia rant did he just do? Also, be honest: which part of my post took a massive anti-Nvidia swing?

I said what I think will happen, nothing more, nothing less. Nvidia has a history of doing what I think they will do, hence why I think they will do it. I asked in that post - and none of you answered - has Nvidia not done precisely what I said with 3DVision and SLI, and is it therefore insane to believe they will do it with another technology?

Don't answer a simple question; just post "he's being anti-Nvidia because he thinks something I don't like will happen".

If Nvidia did exactly what I said with 3DVision, and based on this I think they will do exactly the same thing long term with G-Sync, where is that unreasonable?

More to the point, do you really think monitor makers will continue to use a $100+ FPGA - produced in low volume, costing huge amounts and using more power - when, given time, they can make a single ASIC like every current monitor has: one that does the same thing, costs a fraction of the price, uses a fraction of the power, and doesn't require building a screen whose controller circuitry, chips and cost all get bypassed? At some point in the future the FPGA will go and the functionality will be added to existing ASICs. When this happens (and I can't believe a single sensible person thinks it wouldn't) Nvidia won't have a hardware lock any more. Again, look back at their past: how do they control their brand when there is no physical hardware lock?
 