I did watch that video but it doesn't look any smoother than my monitor when it hits high frames.
That's the point, it will be smooth when it's not hitting high frames
I had a spare few minutes and managed to find this:
http://patft.uspto.gov/netacgi/nph-...h&OS=nvidia+AND+refresh&RS=nvidia+AND+refresh
Basically, it's a patent on monitoring a display buffer, using the condition that it has been updated by a command source (GPU or CPU), and then adjusting the monitor's refresh rate to match that command source.
If you search for Nvidia and refresh there are quite a lot of hits; this was about the 3rd or 4th on the list and I couldn't be bothered to look at the others. There might be more that also relate to G-Sync.
So much for Nvidia not being able to patent the idea of a graphics card telling the monitor when to refresh, though holding the patent also obliges them to licence it if it is deemed essential to the industry.
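For illustration, the mechanism the patent describes (refresh when the display buffer is updated, within the panel's supported range) could be sketched roughly like this. This is a hypothetical toy model, not Nvidia's actual implementation; the function and parameter names are made up:

```python
def adaptive_refresh(frame_ready_times, min_interval=1.0 / 144):
    """Toy model of buffer-driven refresh: the monitor refreshes when
    the GPU signals a buffer update, but never faster than the panel's
    minimum refresh interval. Real hardware also has a maximum interval
    (the panel self-refreshes if no frame arrives); omitted for brevity."""
    refreshes = []
    last_refresh = float("-inf")
    for t in frame_ready_times:
        # refresh as soon as the new frame is ready, unless the panel
        # is still inside its minimum refresh interval
        refresh_at = max(t, last_refresh + min_interval)
        refreshes.append(refresh_at)
        last_refresh = refresh_at
    return refreshes
```

The point of the toy is that refresh timing follows frame completion instead of a fixed clock, which is why judder disappears at frame rates below the panel's maximum.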
http://international.download.nvidia.com/geforce-com/international/videos/NVIDIA_GSYNC_Product_Video.mp4 (set monitor to 60hz)
Sold! Well, as long as they get it working with Surround anyway.
Why would Nvidia do this? Surely it will result in loss of money as they won't be selling the more powerful cards, most people will now get mid tier as it will be smooth regardless.
Definitely something I'd have to see with my own eyes to judge whether it would be worth it over my current setup.
No, the R&D cost on this is minuscule. Nvidia have decided to make what is effectively their own, exceptionally expensive monitor controller chip and pass that cost on to their customers. Monitor makers WILL integrate this incredibly basic tech into their future controllers at essentially no additional cost.
In the future this means all monitors will support it, with no need to licence anything; AMD just send out frames as they are ready, with a tiny bit of frame pacing. This is trivial stuff.
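As a rough sketch of what "a tiny bit of frame pacing" could mean in practice (a hypothetical toy, not AMD's driver logic), one simple scheme delays each presentation so that frame-to-frame intervals track a smoothed average rather than jittering:

```python
def pace_presents(completion_times, smoothing=0.25):
    """Toy frame pacing: hold each frame so its present time is at least
    the previous present plus a smoothed estimate of the recent frame
    interval, evening out small spikes in GPU render time."""
    presents = []
    est_interval = 0.0
    last_present = None
    for t in completion_times:
        if last_present is None:
            # first frame goes out as soon as it completes
            presents.append(t)
            last_present = t
            continue
        raw_interval = t - last_present
        # exponential moving average of the observed frame interval
        est_interval += smoothing * (raw_interval - est_interval)
        present_at = max(t, last_present + est_interval)
        presents.append(present_at)
        last_present = present_at
    return presents
```

A frame is never shown before it finishes rendering; it is only held back slightly when it arrives much earlier than the recent average would suggest.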
The only issue is the wait for monitor makers to get their new ASICs and launch a new model. Designing and releasing an ASIC isn't a three-month process (but we have no idea when they started).
The only reason AMD would want to licence G-Sync would be to get access to the monitors containing Nvidia's insanely expensive FPGA. For the future integrated monitors, Nvidia can't do **** to stop AMD or the monitor makers enabling this mode under any other, non-trademarked name (and if they get together and call it free-sync I may just die laughing).
Nvidia users will likely continue to be charged extra for G-Sync branded screens. Compare 3D Vision: the screens do nothing fancy or different from any other 3D screens, Nvidia don't do anything different with 3D, and Nvidia can't patent 3D, yet they trademark 3D Vision and lock their own customers out of free screen choice (all with identical technology), only allowing those screens which pay for the 3D Vision branding to work with the driver.
Nvidia likely have a short-term time-to-market advantage due to their FPGAs, good for them. Long term, AMD users WILL have the same thing for free on ANY screen, including future G-Sync screens (the same way I can use a 3D Vision branded screen for 3D with an AMD card), because those future monitors will no longer have the FPGA made and controlled by Nvidia, but the normal standard controller ASIC.
So long term I see it as almost identical to 3D/3D Vision: it works on any 3D screen for AMD, while Nvidia will lock out non-paying screens. I can't see a single reason this won't be the case longer term. The short-term FPGA solution is a clever one, but it is expensive and can't possibly be a long-term solution for any sensible monitor maker. No monitor maker wants to design a screen and then ship a version which bypasses built-in parts and uses an external, overly expensive extra chip.
Hi Charlie
Why has AMD not implemented this 'trivial stuff' up to now?
It's pure coincidence that DM's posts on here took a massive anti-Nvidia swing at the same time Charlie's website did, and that DM started linking to it.