
Have AMD stopped competing?

So AMD would have to buy Nvidia GPUs to beat Nvidia - see, that's how bad their own designs are :p Glad you had the sense to admit this.

Firstly, I don't think AMD designs are bad... they actually push tech in the right direction. They just haven't got the money or the nous to get the Devs onside. Let's be honest here, if Devs actually supported both cards equally for their respective features and got the utmost out of both sides' technologies, then I believe a lot of these gaming FPS graphs would be much closer than they are now. But Devs are not going to put the effort into extra work for less than 30% of the market... even at the expense of pushing graphics tech further and using the better APIs (even Nvidia improved FPS in DOOM on Vulkan). The issue has never necessarily been the competing hardware... but boy is it more difficult when the software isn't there to use the hardware to its fullest. Nvidia's masterstroke was, and still is, Gameworks and the fact they got so many Devs tied into it. When Gameworks does its magic, it's obviously a complete fluke that it affects AMD cards much more than their own. ;):p

Lastly, I am not admitting anything. I am not from AMD so have nothing to admit. I am just saying it like it is. I also own both Nvidia and AMD cards so I do not religiously follow either side. I do want AMD to bring out a corker of a card and do to Nvidia what they did to Intel with Ryzen. Why? Simply to put a bomb under the graphics card market and shake things up in what should be a more competitive space, which would be good for all of us.:D
 
Let's get one thing straight right now: Nvidia will never adopt FreeSync, though they may very well adopt Adaptive-Sync
It's the same thing, AMD just use a gimmicky name for it (you can thank Drunkenmaster/AMDMatt for that lol), similar to how SEGA referred to direct memory access as "Blast Processing" or how Apple refer to IEEE 1394 as "FireWire". When people talk about Nvidia adopting FreeSync, they are referring to the tech itself, not the AMD name specifically; it's just that the AMD name is what everyone knows it by and refers to it as (just like with FireWire).
 
You mean, apart from Greed. :p

Joking aside, surely it would be more lucrative for Nvidia to support Adaptive Sync.

I am sure this would make more money for them, as AMD users with a FreeSync screen could then buy an Nvidia card (if they so wished) and keep their current monitor.

Currently, if you have an AMD graphics card and a FreeSync screen, you would need to change both of them to get an Nvidia-based VRR setup. If Nvidia supported Adaptive Sync, you would only need to change the card, which is obviously cheaper to do but would make the user jump from one ship to the other. That has to be more lucrative for Nvidia and earn them more than just the £150 - £200 they get from each G-Sync module (I have no idea how much Nvidia actually charge for the module; I am just going on differences in monitor pricing. If what Nvidia charge for the G-Sync module is less than the overall price difference, that gives my argument even more weight). :)
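To put some very rough numbers on that (every price below is an assumption for illustration, not a real quote):

```python
# Illustrative switching-cost comparison for a FreeSync owner moving to Nvidia.
# All prices are assumptions, not real market data.
nvidia_card = 450        # assumed price of the Nvidia card
gsync_monitor = 550      # assumed price of a comparable G-Sync monitor
freesync_monitor = 380   # assumed price of an equivalent FreeSync monitor

cost_today = nvidia_card + gsync_monitor   # must replace card AND monitor
cost_if_adaptive = nvidia_card             # card only, keep the monitor

print(f"Switch cost today:                 £{cost_today}")
print(f"Switch cost with Adaptive Sync:    £{cost_if_adaptive}")
print(f"Monitor premium (module + markup): £{gsync_monitor - freesync_monitor}")
```

Even on made-up numbers like these, the barrier to switching drops by far more than the module premium Nvidia would be giving up.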
I could be mistaken, but Nvidia do support Adaptive Sync when it suits them: the "G-Sync" they offer on gaming laptops with GeForce GPUs is actually running Adaptive Sync rather than G-Sync with an actual module?

But of course they won't do it for actual gaming PCs and monitors, purely for the monetising opportunities (or should I say the lack of them?).
 
As is/does the GP100/GP102

The GP102 has very poor FP16 and FP64 compute in comparison. It's an FP32-focussed card, i.e. geared towards what most consumer-orientated operations like gaming use.

The GP100, on the other hand, does have significant FP16 and FP64 performance:

https://www.anandtech.com/show/11102/nvidia-announces-quadro-gp100
https://www.anandtech.com/show/1136...v100-gpu-and-tesla-v100-accelerator-announced

The GP100 is 610mm², as opposed to the 473mm² of the GP102. It has the same number of shaders, but uses HBM2 and runs FP16 at double rate; its Volta successor, the GV100, adds tensor cores built specifically for FP16 work:

https://devblogs.nvidia.com/programming-tensor-cores-cuda-9/

Tensor Cores operate on FP16 input data with FP32 accumulation.
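To see why the FP32 accumulation matters, here is a toy sketch (plain numpy, not actual tensor core code) comparing a long FP16 dot product accumulated in FP16 versus FP32:

```python
import numpy as np

# Toy illustration of "FP16 input with FP32 accumulation": summing many small
# FP16 products loses precision once the running total grows, because FP16
# only carries an 11-bit mantissa (~3 decimal digits).
rng = np.random.default_rng(0)
a = rng.random(4096).astype(np.float16)
b = rng.random(4096).astype(np.float16)

# FP16 inputs, FP16 accumulator: rounding error builds up in the running sum.
acc16 = np.float16(0)
for x, y in zip(a, b):
    acc16 = np.float16(acc16 + x * y)

# FP16 inputs, FP32 accumulator: the tensor-core-style arrangement.
acc32 = np.float32(0)
for x, y in zip(a, b):
    acc32 = acc32 + np.float32(x) * np.float32(y)

reference = np.dot(a.astype(np.float64), b.astype(np.float64))
print(f"FP16 accumulate: {acc16:.2f}  (error {abs(acc16 - reference):.2f})")
print(f"FP32 accumulate: {acc32:.2f}  (error {abs(acc32 - reference):.4f})")
```

The FP16 running total stalls once it grows large enough that each small product rounds away, which is exactly what accumulating in FP32 avoids.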

The GP100's core clocks are also down compared to the consumer version, and it never shipped in a version with all its shaders enabled.

The GP102 is basically the GP100 stripped of its FP64 units and double-rate FP16, using GDDR5X instead of HBM2, and it shipped with more of its cores usable. Look at how much smaller it is.

So it's much better optimised for gaming.

However, for things like deep learning training, FP16 performance is important.
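That's because mixed-precision training does the heavy matrix maths in FP16 while keeping the bookkeeping in FP32. A minimal sketch of the idea (a toy numpy example; the shapes and the stand-in "gradient" are made up):

```python
import numpy as np

# Toy sketch of the mixed-precision training recipe: run the expensive matmuls
# in FP16 (where GP100/Vega-style double-rate FP16 pays off), but keep an FP32
# "master" copy of the weights, because a small update like 1e-4 simply
# disappears when added to an FP16 weight of magnitude ~1.
rng = np.random.default_rng(1)
master_w = rng.normal(size=(8, 8)).astype(np.float32)   # FP32 master weights
x = rng.normal(size=(4, 8)).astype(np.float16)          # FP16 activations
lr = np.float32(1e-4)

for step in range(1000):
    w16 = master_w.astype(np.float16)   # FP16 working copy for the forward pass
    out = x @ w16                       # FP16 matmul - the part fast FP16 speeds up
    grad = np.sign(out.astype(np.float32))            # stand-in for a real gradient
    master_w -= lr * (x.astype(np.float32).T @ grad)  # update the FP32 master copy

# The same tiny update applied directly to an FP16 weight rounds to nothing:
w = np.float16(1.0)
print(w + np.float16(1e-4) == w)   # True - the update vanishes in FP16
```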

The first big Vega-based product announced was aimed primarily at deep learning training operations:

https://videocardz.com/64677/amd-an...erator-radeon-instinct-mi25-for-deep-learning

Vega has a massive increase in FP16 throughput over AMD's previous GPUs.

That is the main issue - Nvidia has so much R&D money that they can effectively split their lines and sell dedicated GPUs for deep learning, etc., alongside their gaming ones.

Unless AMD can do that at the high end, I think they won't be able to compete effectively.
 
I could be mistaken, but Nvidia do support Adaptive Sync when it suits them: the "G-Sync" they offer on gaming laptops with GeForce GPUs is actually running Adaptive Sync rather than G-Sync with an actual module?

But of course they won't do it for actual gaming PCs and monitors, purely for the monetising opportunities (or should I say the lack of them?).

Yes - the reason we need all this faffing around on desktop is down to the display standards used. Mobile FreeSync and G-Sync are realistically doing the same thing. AMD showed off FreeSync on a laptop first.
 
Yes - the reason we need all this faffing around on desktop is down to the display standards used. Mobile FreeSync and G-Sync are realistically doing the same thing. AMD showed off FreeSync on a laptop first.
Yeah, but Nvidia calling the sync on laptops "G-Sync" is kind of deliberately misleading, considering it doesn't have any of the added benefits/features of the G-Sync module on monitors? It's essentially giving two different things the same name (well, not surprising I guess, considering how they misled people into thinking the 1060 3GB and 1060 6GB are just the same card with different amounts of memory when they're not).
 
Firstly, I don't think AMD designs are bad... they actually push tech in the right direction. They just haven't got the money or the nous to get the Devs onside. Let's be honest here, if Devs actually supported both cards equally for their respective features and got the utmost out of both sides' technologies, then I believe a lot of these gaming FPS graphs would be much closer than they are now. But Devs are not going to put the effort into extra work for less than 30% of the market... even at the expense of pushing graphics tech further and using the better APIs (even Nvidia improved FPS in DOOM on Vulkan). The issue has never necessarily been the competing hardware... but boy is it more difficult when the software isn't there to use the hardware to its fullest. Nvidia's masterstroke was, and still is, Gameworks and the fact they got so many Devs tied into it. When Gameworks does its magic, it's obviously a complete fluke that it affects AMD cards much more than their own. ;):p

Lastly, I am not admitting anything. I am not from AMD so have nothing to admit. I am just saying it like it is. I also own both Nvidia and AMD cards so I do not religiously follow either side. I do want AMD to bring out a corker of a card and do to Nvidia what they did to Intel with Ryzen. Why? Simply to put a bomb under the graphics card market and shake things up in what should be a more competitive space, which would be good for all of us.:D

I was just commenting on when you said, "I still think AMD would have been better off putting a Dual Pascal card out". Pascal is an Nvidia GPU.
 
Yeah, but Nvidia calling the sync on laptops "G-Sync" is kind of deliberately misleading, considering it doesn't have any of the added benefits/features of the G-Sync module on monitors? It's essentially giving two different things the same name (well, not surprising I guess, considering how they misled people into thinking the 1060 3GB and 1060 6GB are just the same card with different amounts of memory when they're not).

It also shows you how many people don't read proper reviews anymore, where these things tend to be mentioned. They are more swayed by what their favourite streamer or some YT personality (with all their branded gear, which is not at all product placement) says.
 
No!

570 from £229 and 580 from £299 at OcUK, the cheapest in the UK by miles; just use the voucher codes which are active. :)
 
Yeah, but Nvidia calling the sync on laptops "G-Sync" is kind of deliberately misleading, considering it doesn't have any of the added benefits/features of the G-Sync module on monitors?

Are they not free to call it what they like then, except FreeSync of course, that being a trademarked name owned by AMD?

Also, what added benefits/features does the module bring that are not available without one? Remember that anything you list here will not be available with AMD's FreeSync, which of course doesn't use a module. ;)
 
What does "stopped competing" mean?

"Stopped competing" means that AMD has nothing in the pipeline that will be released soon, and they are no longer working on designs that will bring the performance crown back.

Have there even been any rumours of AMD bringing out anything this year?

No, the last thing I saw was some information about 7nm Vega sampling by the end of this year.
 
It's the same thing, AMD just use a gimmicky name for it (you can thank Drunkenmaster/AMDMatt for that lol), similar to how SEGA referred to direct memory access as "Blast Processing" or how Apple refer to IEEE 1394 as "FireWire". When people talk about Nvidia adopting FreeSync, they are referring to the tech itself, not the AMD name specifically; it's just that the AMD name is what everyone knows it by and refers to it as (just like with FireWire).
The problem is that you cannot buy an "Adaptive Sync" monitor; every monitor on the net quotes the FreeSync name in big bold letters.
 
The problem is that you cannot buy an "Adaptive Sync" monitor; every monitor on the net quotes the FreeSync name in big bold letters.

Exactly... pure marketing genius from AMD, one of the few things they have done right lately.
 
Are they not free to call it what they like then, except FreeSync of course, that being a trademarked name owned by AMD?

Also, what added benefits/features does the module bring that are not available without one? Remember that anything you list here will not be available with AMD's FreeSync, which of course doesn't use a module. ;)
No, what I meant was that Nvidia already have "G-Sync", which I thought, by general opinion, is the superior form of adaptive sync, due to the presence of the module allowing a wider sync range and higher refresh rates on the monitor, is it not?

By running standard Adaptive Sync with GeForce GPUs on laptops, without the module present, and calling it "G-Sync", is it really the case that Nvidia can call it whatever they wish, when they already have something different under the same name?
 
The problem is that you cannot buy an "Adaptive Sync" monitor; every monitor on the net quotes the FreeSync name in big bold letters.
I'm pretty sure my monitor actually calls the feature Adaptive Sync rather than officially calling it "FreeSync". In fact, I will go and correct my signature right now :p

But yeah, FreeSync is pretty much just Adaptive Sync with an AMD name attached. I don't think it is right for Nvidia to call their laptop Adaptive Sync counterpart G-Sync as well, as it is not the same as the G-Sync employed on monitors.

If Nvidia were to employ Adaptive Sync, I think they really should use a different name to avoid confusion and to distinguish it from G-Sync.

As G-Sync took the "G" from "GeForce", maybe Nvidia could do Adaptive Sync, take the "N" from Nvidia, and call it NSync... OK, maybe that's not a good idea :D
 
Are they not free to call it what they like then, except FreeSync of course, that being a trademarked name owned by AMD?

Also, what added benefits/features does the module bring that are not available without one? Remember that anything you list here will not be available with AMD's FreeSync, which of course doesn't use a module. ;)

AFAIK G-Sync on laptops uses eDP adaptive sync, which existed before DP Adaptive-Sync and G-Sync; because of the different way laptop displays work with the scaler, things can be handled a bit differently again.

The G-Sync module can be used to integrate overdrive-type functionality more tightly with the variable refresh for enhanced pixel response. The extra buffer(s) made possible at the monitor end make it easier to implement advanced handling of windowed/desktop variable refresh and potentially other advanced features, and they seem to facilitate more advanced techniques for dealing with low framerates.
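As a rough illustration of the low-framerate side, this is the basic idea behind low-framerate compensation: repeat frames so the panel always refreshes inside its variable-refresh window (the 40-144Hz window and the numbers below are assumptions for illustration):

```python
import math

# Toy sketch of low-framerate compensation (LFC). When the game's frame rate
# drops below the panel's minimum refresh, the driver/module can resend the
# last frame N times so the panel still refreshes inside its VRR window.
MIN_HZ, MAX_HZ = 40, 144   # assumed VRR window of the panel

def refresh_for(fps: float) -> float:
    """Pick an effective refresh rate for a given game frame rate."""
    if fps >= MIN_HZ:
        return min(fps, MAX_HZ)          # in range: refresh tracks the frame rate
    repeats = math.ceil(MIN_HZ / fps)    # below range: repeat each frame N times
    return fps * repeats                 # e.g. 25 fps -> 2 repeats -> 50 Hz

for fps in (144, 90, 40, 30, 25, 15):
    print(f"{fps:>3} fps -> panel refreshes at {refresh_for(fps):.0f} Hz")
```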
 
"Stopped competing" means that AMD has nothing in the pipeline that will be released soon, and they are no longer working on designs that will bring the performance crown back.



No, the last thing I saw was some information about 7nm Vega sampling by the end of this year.

Disappointing
 