Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

If AMD, for whatever reason, are denied access to optimise, then yes.

The only way a GPU like that would ever keep up is if it's not running efficiently on the 7950.

Isn't that the point here?

No, I was just meaning the normal difference between TWIMTBP and GE titles, as in they're optimised better from the off on their respective cards. That, coupled with a scenario where the memory bandwidth on the 660 doesn't hamper it, would bring them close. I don't see the difference being purely down to GW, although I'm not ruling it out entirely.
 

I don't believe that for a second.

I have never seen another TWIMTBP title, other than Batman AO, where the 660 performs similarly to or faster than a 7950 Boost.

Have you?
 
Joel Hruska (replying to Gregster):
AMD did not "set me up" for this article. They made me aware of their concerns. I investigated the issue and believe the concerns warranted. I also spoke to Nvidia and attempted to speak to WBM.

The benchmarks, data collection, and research for this story are entirely my own. Vendors regularly communicate with journalists regarding product performance of both their own hardware and that of their competition.

So AMD did set him up :D
 
Sorry, forgot the ';)'

It's just that you're taking what this guy says and making it mean what you want it to mean in order to support your point of view, while the general Nvidia crowd trot out the usual conspiracy-theory and agenda accusations whenever something negative is said about Nvidia's practices, and demand absolute proof.
Take the Origin PC thing, for example. How blatant was that? Yet the green defenders wanted nothing short of an official statement from Nvidia saying 'Yeah, we did it', or it must all be conspiracy-theory BS from AMD fanboys...

All I'm saying is that bias is on both sides of the argument here, and people are ignoring anything that doesn't support their pre-defined POV, while trying their hardest to put down anyone who thinks otherwise.

Can you explain how a 770 is faster than a 290X with FXAA in BAO? Whatever the NV advantage is with FXAA, we're still talking about something that only costs a couple of fps with AMD cards.
 

The guy has admitted that AMD put him onto it. Even you must admit that this whole thing is a little 'blown out of proportion' and much ado about nothing? We had the same thing with Batman AA way back in 2009, with AMD screaming "it's so unfair, yada yada yada" and "it will ruin PC gaming", but 5 years on I don't see PC gaming ruined; in fact, it looks damned good to me. :)
 
Even if they did put him onto it, why is that such a bad thing? He reckons the concerns are valid, and if they are...?
Have Nvidia never put anyone onto anything? How about the FCAT frametimes thing for example?

With BAO, the only reason AMD are running level or even slightly ahead is that they're better at MSAA (according to them).
Sure, these are just benchmark scores and most decent cards can run the game well enough because it really isn't that hugely demanding tbh.
What happens when this is being applied to a game that does require more GPU beef?
Again, a 770 shouldn't be beating a 290X.

As it stands, yes, I agree that it's a little overblown right now. However, that doesn't mean it definitely isn't an ominous sign of things to come.

I know you wouldn't want a situation like that just as much as me or anyone that doesn't sport a GPU brand on their forehead :)
 
They made him aware of their concerns lol. Love it. It's one site, obviously there was motive behind it. Anyone can see that.

And Humbug, you asked for any scenario, and both Far Cry 3 and BF3 perform similarly with 4x MSAA. It says so in the link you've provided. So why is it such a stretch for AO to show similar results?

It's not, at all. There is currently no way of knowing why the drop at the top is so extensive, but it is not by any means a stretch to assume it could be down to premature drivers on Hawaii's part. It's a common trend with AMD on new hardware.

Just take a look at some results from the 7990 at launch if you're not sure. AMD have thrown their toys out of the pram, and frankly I very much doubt Joel was the first person they went to with this. Strangely, though, nobody else has printed anything.
 
My mistake, I'm getting confused with the Titan launch, prior to the Tomb Raider launch. Regardless, Nvidia were able to optimise for TressFX, the way it should be. Something AMD are unable to do for GameWorks, so it changes nothing. If AMD wanted to hurt Nvidia performance they would not use a DX11 API standard like DirectCompute, which is open and can be optimised for.



I will say it again. There is no way a 660 should be faster than a 7950 at any setting. See the post below.
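For context on the DirectCompute point above: a DirectCompute effect is just an HLSL compute shader compiled with Microsoft's public toolchain (fxc, target cs_5_0), so the entire workload is visible to both vendors' driver teams. A minimal illustrative kernel, embedded as it might appear in C++ source; this is a sketch, not TressFX's actual code:

```cpp
// Minimal DirectCompute (DX11 compute shader) kernel, illustrative only.
// HLSL like this is compiled with Microsoft's public compiler
// (fxc /T cs_5_0), so nothing about the workload is hidden from any
// vendor's driver team.
static const char* kHairSimHlsl = R"(
RWStructuredBuffer<float4> positions : register(u0); // hypothetical vertex data

[numthreads(64, 1, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    // Trivial stand-in for a physics step: nudge each vertex downward.
    float4 p = positions[id.x];
    p.y -= 0.001f;
    positions[id.x] = p;
}
)";
```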

So it is totally OK for AMD to develop extra IQ features, knowing full well that it would hurt Nvidia users, because "it's open", even when it makes Nvidia's top card look like an AMD mid-range card, but it isn't OK for Nvidia to develop extra IQ features even when multiple benches across a range of games and settings show that the 290X still beats a Titan.

And now that Nvidia's top cards can run TressFX without a massive performance hit it hasn't been seen in any other game. Funny, that.

If you really can't see the bias in your own statements then there really is no point in having any type of discussion.

The hypocrisy is off the scale. I don't mind people having a brand preference, I prefer Nvidia, but to dress it up as "concern for gamers" is laughable.

Both sides work with developers, and a range of games end up with a slight bias towards one range of cards or the other; it is not surprising. The claim that GW totally prevents AMD from any optimisation, and that as a result all AMD cards are at a disadvantage, really isn't backed up by benches. You end up having to run very specific settings and only compare specific cards for the data to support the accusation; with a wider data set no pattern occurs.

But no, you are totally right: Nvidia should be banned from doing any work that benefits their customers, whereas AMD should be given awards for developing features and tools that block out or hurt performance for Nvidia customers. That sounds completely fair.
 

I thought that was very comical too :P
 

Very well said Petey, but the brand defenders are here in force and don't listen to reason or common sense. They're far too dug in, so it's a case of ignore, deflect, ignore, deflect, etc.

Andy, Andy, Andy. The extra IQ features, as you call them, are part of DirectX. I don't see how you can compare that to GameWorks, which is a closed library that AMD are unable to optimise for. It's pretty clear you don't understand. I've tried to explain it for you several times, but you just ignore what I'm saying and then go off on a tangent about AMD only using DirectCompute because mid-range Kepler cards had that feature gimped at Nvidia's request. It's madness.

Even funnier that you now speculate that, because Nvidia have optimised their drivers for TressFX, AMD have decided to stop using it. Actually, AMD are working on a new and improved version of TressFX which is less demanding. It will be appearing in Star Citizen. If AMD had used a closed library like GameWorks, and not an open API like DirectCompute, Nvidia would never have been able to optimise their drivers for it.

As for the claims about GameWorks, they're backed up perfectly via the article, via TechSpot and via user benches. It's only when AA is applied that AMD are able to overpower the GameWorks advantage in this particular title and pull ahead through brute force. It's not normal for a 660 to beat a 7950 Boost.
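To make the 'closed library vs. open API' distinction concrete: with GameWorks-style middleware the game calls into a pre-built binary whose shaders nobody outside Nvidia can inspect or re-tune, whereas an effect shipped as HLSL source is open to the developer and both vendors. A hypothetical sketch; vendor_fx.h and VendorFX_RenderAO are invented names for illustration, not the real GameWorks API:

```cpp
#include <d3d11.h>
#include "vendor_fx.h"  // hypothetical closed middleware header

// Closed-middleware model: the game links a pre-compiled .lib/.dll. The
// shaders and render passes it issues live inside that binary, so a
// third-party driver team can only observe the D3D calls coming out of it.
void RenderAmbientOcclusion(ID3D11DeviceContext* ctx)
{
    VendorFX_RenderAO(ctx);  // opaque call: no source, no tuning hooks
}

// Open-effect model: the effect ships as HLSL source in the game's data,
// so the developer (or either GPU vendor) can read, profile and rework it.
void RenderHair(ID3D11DeviceContext* ctx, UINT numHairVertices)
{
    // ...bind the compute shader compiled from the shipped HLSL, then:
    ctx->Dispatch((numHairVertices + 63) / 64, 1, 1);
}
```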
 
So it is totally OK for AMD to develop extra IQ features, knowing full well that it would hurt Nvidia users, because "it's open"...

Very well said.
 
This concerns me tbh, but no one makes a thread stating that Mantle could make Nvidia 780 Ti cards look like a 6950 in a lot of future games.

Quote: "Of course, here's the thing we are not sure about. Mantle was clearly designed with GCN in mind, so when AMD talks about other vendors being able to utilize Mantle, does that mean that Mantle will work on their current architecture? Or will the actual architecture of rival vendors (Nvidia) need to be modified to support Mantle?"

Read more: http://wccftech.com/amd-mantle-api-require-gcn-work-nvidia-graphic-cards/#ixzz2pERWrXZ8

If Nvidia's architecture has to be 'modified', I don't think this will help the gaming community on Nvidia cards at all; at what cost to the consumer will this happen, if it is the case?
 
Andy, Andy, Andy. The extra IQ features, as you call them, are part of DirectX...

TressFX is not part of DX11; it is a specific feature that was added to a game by AMD. The article you linked to even goes into detail about how AMD hurt their own users' performance, and that the upshot of that was that it hurt Nvidia users more (sound familiar?).

Nvidia's performance on the 780 vs. the 680 has nothing to do with driver optimisation. The 680 lacked hardware (at developers' and gamers' request) that could run that code efficiently. The TressFX release was specifically targeted at making Nvidia's cards look poor value and AMD cards look better value to potential customers: a marketing tool.

It isn't normal for a 7870 to beat a 680, yet that is exactly what TressFX enabled in TR.

Being able to optimise your drivers for a feature is irrelevant if your hardware lacks the capability to run it. Same with TressFX at the time, and the same with Mantle for god knows how long.

Mantle being "open" is only a valid point once AMD add support for Intel and Nvidia GPUs.

As others have pointed out, there are lots of "closed libraries" in use by developers other than Nvidia's, and yet both sides have no problem optimising drivers for those.

This entire thread is purely a case of your rabid support for anything AMD and your inherent need to have a pop at Nvidia, but you can't have it both ways.

I bought Nvidia because I see the value in their extra features; you bought AMD because you didn't. You can't then complain when game devs leverage Nvidia's tools to make their games and then, on specific settings in specific circumstances, the cards that you own don't perform quite as well as they do in some other games. AMD do the exact same thing, and it is only your rabid support of all things rose-tinted that prevents you from accepting that.
 

TressFX uses DirectCompute, which is part of DirectX 11. So yes, AMD used part of DirectX 11. That's completely different from what you're saying. GameWorks is not a DirectX 11 API; it's a closed library which neither the dev nor AMD can optimise for. It couldn't be more different. Even if the developer wanted to help AMD with optimisations, it cannot. That is vendor lock-in right there. Not even in the same ballpark as AMD using part of DX11 (the most open API, accessible to everyone, devs and Nvidia's drivers alike) to create hair physics.

The problem with TressFX was actually Nvidia's drivers. The creators of the game worked closely with Nvidia to optimise drivers shortly after the game was released. The fault lies with Nvidia for lacking DX11 DirectCompute support, not with AMD for using the most open standard API going.

http://www.ausgamers.com/news/read/...ressfx-memory-costing-plus-new-nvidia-drivers

"We've been working closely with NVIDIA to address the issues experienced by some Tomb Raider players. In conjunction with this patch, NVIDIA will be releasing updated drivers that help to improve stability and performance of Tomb Raider on NVIDIA GeForce GPUs. We are continuing to work together to resolve any remaining outstanding issues. We recommend that GeForce users update to the latest GeForce 314.21 drivers (posting today) for the best experience in Tomb Raider."

So there we have it: the dev was able to work with Nvidia to optimise drivers after launch, like it always is. Only this never happens with GameWorks, as most of it is locked down so only Nvidia can optimise for it. See the problem now? Of course you don't. You still think AMD used DX11 to harm Nvidia.

AMD do not do the same thing, and only your inability to understand what's going on prevents you from thinking otherwise. You're so entrenched in your beliefs that you cannot listen to reason, instead preferring to ignore and deflect everything I say, only to then try and turn it around by saying "look here, AMD do the same thing" when it's clearly different. If you think using DX11 is similar to using GameWorks, which blocks dev and AMD driver optimisation, then you're beyond debating with; if you don't get it by now, you never will.
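To ground that in code: DirectCompute work is created and dispatched through the standard, fully documented DX11 interfaces shown below, which is why Nvidia could profile the workload and recover performance in a driver update. A minimal sketch (error handling omitted; csBytecode is assumed to be a cs_5_0 shader compiled with fxc):

```cpp
#include <d3d11.h>

// Dispatch a compute shader through the documented DX11 API. Everything
// here is a standard Microsoft interface, not vendor middleware, so any
// IHV's driver sees (and can optimise) exactly this workload.
void DispatchComputeWork(ID3D11Device* device,
                         ID3D11DeviceContext* ctx,
                         const void* csBytecode, SIZE_T csSize,
                         ID3D11UnorderedAccessView* dataUav,
                         UINT threadGroups)
{
    ID3D11ComputeShader* cs = nullptr;
    device->CreateComputeShader(csBytecode, csSize, nullptr, &cs);

    ctx->CSSetShader(cs, nullptr, 0);                        // bind the kernel
    ctx->CSSetUnorderedAccessViews(0, 1, &dataUav, nullptr); // bind the data
    ctx->Dispatch(threadGroups, 1, 1);                       // run it

    // Unbind the UAV and release the shader.
    ID3D11UnorderedAccessView* nullUav = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);
    cs->Release();
}
```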
 
This concerns me tbh, but no one makes a thread stating that Mantle could make Nvidia 780 Ti cards look like a 6950 in a lot of future games...

People are STILL comparing GameWorks to Mantle purely because of performance? How about actually comparing the approach of the features instead?

In race-car terms it would be like:
Mantle: AMD fine-tuning their own engine to increase its efficiency and squeeze more performance out of it, hoping to widen the performance gap between them and their rival.

GameWorks: locking the bonnet of their rival's vehicle (which only they have the key to open) and denying them access to the core parts and components.

It's fine because Nvidia is the one doing it, so people want to turn a blind eye to it; if it were the other way round, people would curse AMD to infinity and beyond :p

Take the Splinter Cell Blacklist performance, for example, with the 7970 GHz being slower than the GTX 760: most wouldn't want to talk about it. Imagine if it was in BF4, or other EA titles, that the GTX 780 was slower than the 7950... people would be bashing AMD into the ground for nerfing Nvidia's performance.

In complete honesty, I think people are... naive to say AMD can entrust their performance or optimisation to Nvidia, considering it is the very same company that pulled the stunt of disabling PhysX when an AMD card is detected, despite people having actually PAID for the Nvidia card that they have/had. Any minute now, someone will come in and try to justify Nvidia blocking PhysX and make it seem justified :p
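On the PhysX point: detecting which vendors' GPUs are present in a system is trivial through standard DXGI adapter enumeration (PCI vendor IDs: 0x10DE for Nvidia, 0x1002 for AMD/ATI). The sketch below shows only that detection mechanism; how Nvidia's driver actually implemented the PhysX lock-out is not public, so none of this is their code:

```cpp
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

// Enumerate adapters via standard DXGI and report whether an AMD GPU is
// present. This demonstrates only how easy vendor detection is; it is not
// how any real driver gates PhysX.
bool AmdGpuPresent()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return false;

    bool found = false;
    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x1002)   // AMD/ATI vendor ID
            found = true;
        adapter->Release();
    }
    factory->Release();
    return found;
}
```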
 