• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Nvidia, stop being a **** please

I'm sure they are helping :) As long as they are making a profit from each one, which they are, then it increases their market share, something AMD desperately need.

Trouble is, Dave, it shows how Nvidia dominate the market and AMD just lap up seconds. We all want a healthy, competitive market but only the odd few choose AMD for some reason.
 
We all want a healthy, competitive market but only the odd few choose AMD for some reason

Marcin Momot

CD PROJEKT RED:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.

Hope this information helps.

Main reason why I dumped AMD: GameWorks on AMD is a joke. Some of us believe we are not getting a healthy, competitive market. :(
 
Is that because AMD is bad?

It's because AMD couldn't get working mGPU profiles out, usually for GameWorks titles.

There was the jibe 'Just turn it off' from the usuals, but since they run Nvidia they aren't affected, which makes it OK, because AMDmatt tells you to do it.

It's as if it's an AMD plot to make Nvidia look evil.

Those with half a brain, however, will realise that most PC gamers with no vendor preference who run AMD GPUs and just want to game will be ditching AMD and going Nvidia, because GameWorks is in most AAA titles.


as per title? :P (given the quote from Momot :p )

Yes, if they weren't ****s, I'd probably still be there.

Moral of the story: winners are ****s.
 
It's because AMD couldn't get working mGPU profiles out, usually for GameWorks titles.

MGPU affects a very small minority of gamers. The vast majority don't even care about high end, just low/mid, and unfortunately for AMD they were not competing there with Nvidia until the R9 390 was released. Nvidia had the mid-range and high end GPU markets uncontested from Sept 2014 to June 2015, when the R9 390 and Fiji were released. The 290 and 290X cards were seen as old tech in comparison. Nvidia gained so much market share in that period because AMD had nothing competitive for the best part of a year.

There was the jibe 'Just turn it off' from the usuals, but since they run Nvidia they aren't affected, which makes it OK, because AMDmatt tells you to do it.

Again, the vast majority do not run MGPU and would not give two hoots about anyone stupid enough to use AMD MGPU (or even SLI), given the myriad of issues it brings. Even Nvidia took a fair while to get an SLI profile out for FO4 and had plenty of issues with W3 drivers.

It's as if it's an AMD plot to make Nvidia look evil.

Both are just corporations, the problem is that AMD are nowhere near as efficient in their execution and marketing.

Those with half a brain, however, will realise that most PC gamers with no vendor preference who run AMD GPUs and just want to game will be ditching AMD and going Nvidia, because GameWorks is in most AAA titles.

Most AMD users who just want to game will avoid MGPU and get on just fine even with GameWorks. In fact, W3 GameWorks effects were totally unusable on my 980 and 980 Ti because they absolutely killed performance.

If AMD want to keep market share they need to remain competitive at all times. They just hadn't been doing that for the best part of a year, and now they do at least offer viable alternatives at the mid to high end, so their market share will rise, albeit slowly.
 
This has been stated multiple times before: the reason Nvidia need the G-Sync controller is that the display controllers in the current crop of Nvidia cards do not support adaptive refresh. They can still send some custom commands to allow G-Sync to work with certain hardware, but the silicon was never designed to work with the original VESA eDP standard for adaptive refresh.

But AMD has supported VESA eDP on the majority of its chips since the 5000 series, just not in the way required for the current iteration of FreeSync to work. This is why some older cards support FreeSync, but only under certain conditions, and why it was demonstrated early on using laptop hardware.
Yes, and as already pointed out, NVIDIA knew about adaptive sync for desktop long before it came out, being part of the group that came up with it. Yet they chose to keep support out of their recent desktop cards. Possibly this was a result of being too far down the design path on them already, so changing would have been too expensive. Hence it'll be interesting to see if they do the same with Pascal and rely on their lead in dGPU to keep G-Sync alive, or if they choose to support the open standard they were part of creating.
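For anyone curious what 'supporting adaptive refresh' looks like from the software side: on current Linux kernels the capability is exposed as a DRM connector property you can query directly. A minimal sketch of my own (not from any vendor docs), assuming libdrm is installed and the GPU sits at /dev/dri/card0:

    // Query the DRM "vrr_capable" connector property, which the kernel
    // sets when the GPU's display controller and the attached monitor
    // both support VESA adaptive sync. Build with:
    //   g++ vrr.cpp $(pkg-config --cflags --libs libdrm)
    #include <cstdint>
    #include <cstdio>
    #include <cstring>
    #include <fcntl.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main() {
        int fd = open("/dev/dri/card0", O_RDWR); // device path is an assumption
        if (fd < 0) { perror("open"); return 1; }
        drmModeRes* res = drmModeGetResources(fd);
        if (!res) { close(fd); return 1; }
        for (int i = 0; i < res->count_connectors; ++i) {
            drmModeObjectProperties* props = drmModeObjectGetProperties(
                fd, res->connectors[i], DRM_MODE_OBJECT_CONNECTOR);
            if (!props) continue;
            for (uint32_t j = 0; j < props->count_props; ++j) {
                drmModePropertyRes* p = drmModeGetProperty(fd, props->props[j]);
                if (p && strcmp(p->name, "vrr_capable") == 0)
                    printf("connector %u: vrr_capable = %llu\n",
                           res->connectors[i],
                           (unsigned long long)props->prop_values[j]);
                if (p) drmModeFreeProperty(p);
            }
            drmModeFreeObjectProperties(props);
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }

The point being: adaptive sync is a capability flag plus standard signalling, which is exactly why AMD could light it up on existing eDP-capable silicon while Nvidia's desktop display controllers needed the external G-Sync module instead.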
 
The little powerhouse that is the 970 is getting GameWorks switched off at 1080p on GFE now:

[image: GeForce Experience recommended settings screenshot]

:rolleyes:


@ICDP,

W3 GameWorks works on my 970; before GameWorks, mGPU worked fine on my 6970/7970/7950/290X setups with Nvidia-sponsored non-GameWorks titles.
 
The little powerhouse that is the 970 is getting GameWorks switched off at 1080p on GFE now:

[image: GeForce Experience recommended settings screenshot]

:rolleyes:


@ICDP,

W3 GameWorks works on my 970; before GameWorks, mGPU worked fine on my 6970/7970/7950/290X setups with Nvidia-sponsored non-GameWorks titles.

Prior to GameWorks, MGPU was problematic for both AMD and Nvidia, even more so for AMD, with frame pacing issues etc. The fact that MGPU really only ever works out of the box for AAA titles renders it a waste of money for anyone who plays non-mainstream games.

I have used MGPU plenty of times in the past and know I won't do it ever again. GameWorks has almost nothing to do with the drop in AMD market share and the majority of people jumping ship. The major factor was that if someone was in the market for a new mid to high range GPU from Sep 2014 to July 2015, the only viable options were Nvidia.

970 > 290/290X (mid)
980 > 290X (high)

To counter these new Nvidia GPUs, AMD simply kept on keeping on with the 290 and 290X, and they weren't perceived as competitive. Step forward to mid-2015 and AMD finally release a competitive GPU in Fiji. Yes, overall the 980 Ti is a better option at the high end, but at least AMD have fairly viable alternatives. Same with 970 vs 390 and 980 vs 390X. At least AMD have GPUs that compete out of the box and are actually getting decent reviews.

People now see AMD GPUs being fairly competitive and start to think of them as viable alternatives. That is where market share is won or lost IMHO.
 
Just watched it, don't think I've ever watched such a whiny ****ing crying video in all my days lol. It's right up there with the famous 'leave Britney alone' video.

I'm going to make a similar video and give it to my partner in work. I don't think it's fair he makes a high six figure salary and doesn't share it with me; it is not right and very anti-sharing. While we're at it, I want a six bedroom house. It's not fair ('I'm qq'ing right now') that I only have a four; it's not right, it's not fair, and it should be given to all of us. Also, I'm sure there was a £100m+ EuroMillions winner not long ago. He/she should def share that with us, it's so unfair. Sharing is giving, how dare they keep that all to themselves.
 
I liked this comment on YouTube.

Paranoid nonsense. I can't believe people seriously imply that Nvidia goes in and forces developers to tessellate random geometry to make AMD look bad, or cripples their own performance in order to make AMD look marginally worse. Developers are responsible for optimization, not Nvidia, and developers have full control over which GameWorks features they use or don't. It's optional middleware that they can choose not to use entirely. There is no boogeyman.

The main reason AMD releases everything open source isn't because they're do-gooders who want the best for gamers; it's because they're cloning features that Nvidia made first and AMD is playing catch-up. The only way they can compete is by giving developers some incentive, and open-sourcing it is that incentive.
 
NVIDIA are a company. Their primary goal is to gain value for shareholders. That doesn't mean there is only one route they can take, and we can certainly judge them for their business practices and deride them for it. While anti-competitive practices remain accepted in the US as part of a free market, and the EU is decades too slow reacting to them, we can expect them to continue to happen from many parties (not just NVIDIA!).
While fans of a brand or company continue to back them regardless, they will never have the primary incentive to change: losing money. It's not even necessarily a bad thing at times; short-termism from consumers can also force businesses to adapt to demand, helps drive the pace of change in many areas, and so on. Companies racing to gain a position that they can then abuse is in itself a driver for change, so the flawed system has advantages too. Overall, in an area like this it's a negative, and the video touches on some of the reasons why, regardless of the motives.

Honestly, given the choice between reforming anti-competitive laws or patent law I'd pick patent law every time as the bigger impediment to development and easiest to misuse.
 
I liked this comment on YouTube.

Is that your post by any chance? Just your disclaimer below kinda implies it is lol.

How do we know how nVidia operate anyway? For all we know the default tessellation in GameWorks features may be way higher than needed, and to drop it down the dev would either have to request it from nVidia or somehow get hold of the source by signing an NDA or whatever.

For all we know, when signing the NDA devs end up with certain restrictions on what they are able to do with the source and what they can modify! Who has seen the agreement, NDA or source, for that matter? You're looking at things from the wrong perspective.

All game devs have control over is whether to use certain features or not, rather than having control over how to optimise those features. And I'm certain there is a degree of struggle to optimise certain nVidia GameWorks features from a dev end of things. Probably why devs just recommend you turn them off. At least they give AMD that much lol. Just turn them off.
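On the tessellation point, it's worth remembering that a tessellation factor is just a number the hull shader (or the driver underneath it) emits per patch, so whoever controls that layer can cap it. Here is a toy C++ sketch of the idea being argued about; the distance falloff and the 64x/16x figures are purely illustrative, not taken from any GameWorks source:

    // Hypothetical distance-based tessellation with a driver/dev-style cap.
    // Shows how a ceiling on the factor cuts geometry work at close range
    // while leaving distant patches untouched.
    #include <algorithm>
    #include <cstdio>

    // Factor that halves with every doubling of distance, floored at 1x.
    float tessFactor(float metres, float maxFactor, float cap) {
        float f = maxFactor / std::max(metres, 1.0f);
        return std::min(std::max(f, 1.0f), cap);
    }

    int main() {
        const float dists[] = {1.0f, 2.0f, 8.0f, 32.0f};
        for (float d : dists)
            printf("distance %5.1fm: default %4.1fx, capped %4.1fx\n",
                   d, tessFactor(d, 64.0f, 64.0f), tessFactor(d, 64.0f, 16.0f));
        return 0;
    }

A patch two metres away asks for 32x under a 64x default but only 16x under the cap, roughly a quarter of the triangles for little visual difference. This is essentially what AMD's driver-level tessellation override does globally, and what a dev with source access could tune per asset.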
 
Except Nvidia didn't make any of it first. Some of it is inherited IP and most of it is copied from existing open source.
 
Except Nvidia didn't make any of it first. Some of it is inherited IP and most of it is copied from existing open source.

There is some truth to that, but often that is the work of very clever people who've put the theory or a proof of concept into the public domain, and there is still a ton of work required to actually use it feasibly in a game. nVidia have taken that and wrapped it in something a developer can plug into their game without having to fully understand the concepts behind it.
 