Intel plans to support FreeSync.

Nope, as it still means people wouldn't have the freedom to switch to using AMD cards and adaptive sync without having to sell off their existing monitor.

However, your comment did make me realise something... rather than expecting Nvidia to adopt adaptive sync, the other possibility is for monitor manufacturers to release monitors that support both adaptive sync and G-Sync as standard... but the key thing is that the price premium G-Sync adds to the cost of the monitor must be low. As it stands, £150 for the extra G-Sync module is simply too much... if it was only around ~£50, then it might work.

Unfortunately I can't see nvidia agreeing to this :(
 
Wow, I love how the nVidia damage control comes rolling in after my post.


https://youtu.be/EUNMrU8uU5M?t=13m31s

Can't find where I saw their statement saying Vulkan wouldn't have been possible without AMD's help, but here is the video where I saw them give a special thanks to AMD after thanking all the other guys who contributed in other ways. Valve got a big thanks too. Guess what: nVidia gets no big mention, which to me says they didn't contribute that much. So once again, from the horse's mouth!

THANKS AMD :D because we wouldn't be where we are today without Mantle!

So in other words, no, you can't at the moment. That is all you had to say, instead of using the nVidia diversionary tactics.

I would be shocked if they did not thank AMD for handing over the code. It is good they did acknowledge it.
 
So in other words, no, you can't at the moment. That is all you had to say, instead of using the nVidia diversionary tactics.

I would be shocked if they did not thank AMD for handing over the code. It is good they did acknowledge it.

Diversion tactics? LOL, it was in response to Lamb choppy's comment about nVidia developing Vulkan... they helped, not developed. If you want to consider nVidia as developing Vulkan, then I guess AMD developed it more!

And well, at least I'm showing how AMD have contributed more than what others are saying, which insinuates nVidia played a big part in the development, or pretty much developed Vulkan, just because the president of Khronos works for nVidia.

I don't see big thanks or mentions for nVidia. As you see in the video, they say we would not be where we are today without Mantle!
 
LOL people still trying to use the false equivalence lock-in talking point. They really ought to replace that in the advocacy program's literature.

NV should be using async, the open standard. End of.
 
LOL people still trying to use the false equivalence lock-in talking point.

NV should be using async, the open standard. End of.

Yea this!

nVidia show no signs of being an open-standards business; rather, they lock you down to their hardware. AMD, however, are certainly more for open standards, as can clearly be seen. However, you have the usual people defending nVidia trying to make out that nVidia don't lock you down and are open. It just ain't how nVidia do business lol.
 
Nvidia use all the tactics they can to keep people locked in.
They have essentially made PC gaming into a console-style ecosystem with two different players.

The only reason they even opened the CPU PhysX code was to stop devs using other tech such as Havok or Bullet.
If only AMD developed such tech for the consoles. They would dominate console physics, since all devs would use it, and consequently it would be ported to PC GPUs too.
 
Yea this!

nVidia show no signs of being an open-standards business; rather, they lock you down to their hardware. AMD, however, are certainly more for open standards, as can clearly be seen. However, you have the usual people defending nVidia trying to make out that nVidia don't lock you down and are open. It just ain't how nVidia do business lol.

I am sure nvidia don't really give a crap about open standards; they're not number 1 in the discrete graphics market because they gave away their ideas for free.

I am also sure that if AMD were number 1 they would be charging more for their cards (like the Furys), which, after all is said and done, we all choose to buy so we are able to use all this "tech".
 
I am sure nvidia don't really give a crap about open standards; they're not number 1 in the discrete graphics market because they gave away their ideas for free.

I am also sure that if AMD were number 1 they would be charging more for their cards (like the Furys), which, after all is said and done, we all choose to buy so we are able to use all this "tech".

Yea, they are number one because they make good, or rather the best, discrete graphics cards, irrelevant of their approach to open standards, because they make the majority of their money from GPU sales. All their approach to open standards does is make life for gamers more painful! However, it does help them make that little bit more money by squeezing that bit more out of gamers. Good old business plan, nVidia!
 
NV should be using async, the open standard. End of.

I'll agree with this statement. Not only because it then eliminates the whole G-Sync vs FreeSync lock-in on monitors, but with AMD, Intel AND nVidia using the same technology it will progress quicker, eliminating the "but g-sync is better wah wah" argument. The technology as a whole would be vastly superior to g-sync with 3 entities on it.
But that's not within nVidia's usual business tactics unfortunately -- that would be the ideal though imho.
 
I'll agree with this statement. Not only because it then eliminates the whole G-Sync vs FreeSync lock-in on monitors, but with AMD, Intel AND nVidia using the same technology it will progress quicker, eliminating the "but g-sync is better wah wah" argument. The technology as a whole would be vastly superior to g-sync with 3 entities on it.
But that's not within nVidia's usual business tactics unfortunately -- that would be the ideal though imho.

G-Sync gives Nvidia end-to-end control of the experience. Adaptive sync users are at the mercy of whatever bargain-basement scaler the manufacturer had lying around. We see this today with displays that have substandard working framerate ranges (144Hz panels where it only works up to 90fps) or issues with overdrive and overshoot, as the scaler is a generic part.

Nvidia innovated and brought this technology to market before anyone had even thought of using the eDP technology for discrete displays; just because they didn't want to help out their competition does not make them the bad guys.
 
Anyone seen this? @ 11:40.

It seems on input lag, Nvidia do better with V-Sync while AMD do better without V-Sync.
 
Man, I wish some people would read up before posting silly statements like "Nvidia should be using A-Sync, end of"... when G-Sync was being developed there was no way to do it on the GPU alone, so they had to do it via a monitor module. They were a year ahead of the competition, but now some feel they should just switch to A-Sync to appease a tiny minority of the market???

Not going to happen!!
 
A-Sync is a VESA standard though, so when monitor makers start adding it as standard, hopefully all monitors will have this feature eventually, thus benefiting everyone rather than just a few rich gamers. Nvidia may or may not support it, but with Intel on board it has a bright future.
 
A-Sync is a VESA standard though, so when monitor makers start adding it as standard, hopefully all monitors will have this feature eventually, thus benefiting everyone rather than just a few rich gamers. Nvidia may or may not support it, but with Intel on board it has a bright future.

I see no reason why Nvidia won't support it in the future if it is a standard built into all new monitors. I would see it as good business sense as well.
 
Anyone seen this? @ 11:40.

It seems on input lag, Nvidia do better with V-Sync while AMD do better without V-Sync.


Erm, neither gsync nor freesync work at 200fps; that isn't a test of either, it's a test of everything off.

Second, TFTCentral tested the ROG Swift as having a 5ms input lag, so...
 
They will probably just end up adopting A-Sync and charging extra for the G-Sync module in the NVidia screens, and still selling them, so they would not be tied to a GPU, but you would still need a G-Sync screen to use NVidia's G-Sync.

People would still buy it, as NVidia have the majority GPU share, and people will need to pay to use it with an NVidia GPU.
 
Erm, neither gsync nor freesync work at 200fps; that isn't a test of either, it's a test of everything off.

Second, TFTCentral tested the ROG Swift as having a 5ms input lag, so...


No, G-Sync doesn't work at 200 FPS as the screen is not 200 Hz; by the looks of it you need to have V-Sync on for G-Sync to work, whereas FreeSync will work at any framerate as it works with V-Sync off.
 
No, G-Sync doesn't work at 200 FPS as the screen is not 200 Hz; by the looks of it you need to have V-Sync on for G-Sync to work, whereas FreeSync will work at any framerate as it works with V-Sync off.

G-Sync works with vsync off now as well, and FreeSync also only takes effect within the range the monitor supports, up to the max refresh of the monitor, so no, FreeSync doesn't work at 200fps.

Both G-Sync and FreeSync turn themselves off when the fps goes above the refresh rate of the monitor; think about it, they have to.
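
To make the point concrete, here's a toy sketch of that behaviour (Python, purely illustrative; the 30-144 range and the function name are made up for the example, not any vendor's actual logic):

Code:
# Toy model of variable refresh: the panel can only hold a frame for a
# bounded interval, so VRR can only track framerates inside its range.
def vrr_active(fps, vrr_min=30, vrr_max=144):
    """True if the panel can match this framerate directly."""
    return vrr_min <= fps <= vrr_max

for fps in (45, 90, 144, 200):
    state = ("VRR tracks the frame" if vrr_active(fps)
             else "VRR off -> plain vsync on/off behaviour")
    print(fps, "fps:", state)

At 200fps the panel simply can't refresh that fast, so a 200fps test is really measuring plain vsync-off behaviour, exactly as above.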

To run G-Sync you have to turn vsync off in game; you can then also set vsync on or off in the 3D settings in NVCP. It is quite a few settings to fiddle with, and Linus doesn't show how and where he was turning vsync on and off, but looking at his results compared with what TFTCentral got in their vsync and G-Sync off testing, I am guessing he possibly had a wrong setting on somewhere.

Just went through the video: just after 13:30 he mentions using G-Sync and "turning on vsync in the game"... if you turn vsync on in game it disables G-Sync, so I seriously question his results. Basically his 45fps "vsync off" results are actually using G-Sync, but his 45fps "vsync on" tests are using traditional vsync, not G-Sync, which explains why his results flick back and forth from good to bad.
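
Purely as an illustration of the interaction being claimed here (a toy model of this post's reading of the settings, not NVIDIA's documented behaviour; the setting names are hypothetical):

Code:
# Toy model of the claimed interaction between in-game vsync and G-Sync.
# Names are made up for illustration only.
def effective_sync(gsync_on, ingame_vsync):
    if gsync_on and ingame_vsync:
        # the claim: turning vsync on in game disables G-Sync
        return "traditional vsync (G-Sync disabled)"
    if gsync_on:
        return "G-Sync active"
    return "vsync on" if ingame_vsync else "no sync"

# Linus's two 45fps runs, read through this model:
print(effective_sync(gsync_on=True, ingame_vsync=False))  # "vsync off" run -> actually G-Sync
print(effective_sync(gsync_on=True, ingame_vsync=True))   # "vsync on" run -> traditional vsync

On that reading, the two runs were testing different things, which would explain why his numbers flip between good and bad.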
 