
Gameworks, Mantle and a pot calling a kettle black

NVidia are the good guys if you are a customer; like all of the reputable brands/manufacturers that people pay higher-than-average prices for, they do everything they can to improve their customers' experience.
Right... so people who switch from Nvidia to AMD on their upgrade don't count as customers, because they never spent money on an Nvidia card? Nvidia punishes the "traitors" who dare to leave the gang by disabling PhysX, even though their existing Nvidia card CAN work as a dedicated PhysX card with an AMD card as primary, if not for the block they set up.
 
I don't think anyone actually agrees that Nvidia's lock-out of PhysX when an AMD card is present is a good thing. So using it as a rebuttal is moot, because they're in agreement anyway :p
But mmj's comment there is far too blanket lol.
 
But mmj's comment there is far too blanket lol.
Same as his remark that Nvidia is more engaged with developers than AMD. A few years ago I would have totally agreed, but at present I wouldn't make such a blanket statement unless I had solid proof that it's the case.

His claim that "NVidia charge more, release slightly less well-specced cards but invest and put the work in behind the scenes to ensure that the money their customers have spent is rewarded with top compatibility/performance" is ironically contradicted by the performance and reliability of the games from Ubisoft, their biggest partner...

Just to make it clear, I never claimed that AMD partners' games are reliable and perfect; it was he who brought up that point as being Nvidia's strength...
 
Just to get back on topic, I forgot to mention what John McDonald had said and I can understand his frustration.

It is extremely frustrating to see an article criticizing work you did at a former employer and not being able to comment that the person who you are quoting from was just completely full of unsubstantiated [expletive]. Thanks, Forbes. . . . [A]nd while I never did, and certainly do not now, speak for nvidia, let me say that in the six years I was in devtech I *never*, not a single time, asked a developer to deny title access to AMD or to remove things that were beneficial to AMD.

And John's Bio:

John McDonald is a senior software engineer at NVIDIA Corporation. For the past six years, he has worked in developer technology, where he works with game developers on performance and functional issues in their titles. Prior to joining devtech, John helped design the 8xxx, 9xxx and gt2xx series of GPUs. Before coming to NVIDIA, John worked in the game industry on several titles, including the huge hit Command & Conquer: Generals. When not at work, John can be found spending time with his family or sculling his way down Lady Bird Lake in Austin, TX.
 
Right, I don't know how much the 4130 affects this, though looking at GPU usage, not a lot if at all (mostly 99% usage).

Older 13.12 drivers, and overclocked to 1100/1350 (according to Lord Humbug, Titan-beating performance)
AA - Off
gLtBbMU.jpg

FXAA- High
IC2zrua.jpg

8xMSAA
99MEosa.jpg
 
Just to get back on topic, I forgot to mention what John McDonald had said and I can understand his frustration.



And John's Bio:

John McDonald is a senior software engineer at NVIDIA Corporation. For the past six years, he has worked in developer technology, where he works with game developers on performance and functional issues in their titles. Prior to joining devtech, John helped design the 8xxx, 9xxx and gt2xx series of GPUs. Before coming to NVIDIA, John worked in the game industry on several titles, including the huge hit Command & Conquer: Generals. When not at work, John can be found spending time with his family or sculling his way down Lady Bird Lake in Austin, TX.

Honestly Greg, it's depressing reading the utter tosh thrown about in this thread. Tin-pot theories and outright nonsense. Why always "Vs"? Why does something have to fight something else in some kind of GPU Super Smash Bros :confused:

edit - that's not aimed at you Greg, just linked your post because I liked it :)
 
Did the same test as Tonester and nVidia are gimping the game for nVidia SLI users at 8XMSAA :D

no AA
ed855bef17df200f19591cb1adf2c2cd.jpg

FXAA-high
70ddd0eb3629f5e72814462cb2d481c9.jpg

8XMSAA
4edecd926073279000efa86478175053.jpg

Gulp!
 
So then don't we have proof that it's not FXAA killing AMD? FXAA being one of the GW libraries.
Isn't that just proving it is not "features" that can be enabled/disabled that are killing AMD's performance?

I don't exactly recall people specifically blaming the "features" of GameWorks for killing the performance; all it proves is that the "performance hit" could have already existed before FXAA even came into play.

I know some people insist GameWorks does nothing more than offer the specific features, but who knows the truth about how exactly it is entangled with the core game itself?
 
Isn't that just proving it is not "features" that can be enabled/disabled that are killing AMD's performance?

I don't exactly recall people specifically blaming the "features" of GameWorks for killing the performance; all it proves is that the "performance hit" could have already existed before FXAA even came into play.

I know some people insist GameWorks does nothing more than offer the specific features, but who knows the truth about how exactly it is entangled with the core game itself?

GW is a set of feature libraries. They can be turned on and off. If they are not used there will be no impact on performance (unless proven otherwise), so any issues need to fall at the feet of AMD's driver team.
 
Tonester's results:

AA - Off
gLtBbMU.jpg


FXAA- High
IC2zrua.jpg


8xMSAA
99MEosa.jpg

Rusty's results:
AA - Off
HYhdDRb.png


FXAA - High
EYPkWpR.png


8xMSAA
snGgp3R.png

So let us compare the relevant drop offs:

Moving from no AA to FXAA results in a 3.3% drop-off in performance for Tonester's 290.

Moving from no AA to FXAA results in a 3.9% drop-off in performance for Rusty's 780.

(a statistically insignificant difference)

The MSAA results are there too if anybody is interested, but I wanted to keep this to the issue of GW, and FXAA isn't causing more of a drop-off for AMD users than for nVidia users. So you can only conclude that the reviews from back in the day which showed a huge comparative drop-off for AMD users did so because AMD had not optimised properly (be it for GW or just the game in general).
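For anyone wanting to run the same comparison on their own screenshots, the drop-off figures above are just a relative percentage change. A minimal sketch in Python; the FPS values below are made up purely to illustrate the calculation (the real numbers are in the screenshots), chosen so the output lands near the 3.3%/3.9% figures quoted:

```python
# Sketch of the no-AA vs FXAA drop-off comparison.
# The FPS inputs here are hypothetical stand-ins, not the actual
# benchmark results from the screenshots.

def pct_drop(fps_no_aa: float, fps_fxaa: float) -> float:
    """Percentage of performance lost when moving from no AA to FXAA."""
    return (fps_no_aa - fps_fxaa) / fps_no_aa * 100

# Hypothetical example values:
r290_drop = pct_drop(90.0, 87.0)    # "Tonester's 290"
gtx780_drop = pct_drop(77.0, 74.0)  # "Rusty's 780"

print(f"290 drop-off: {r290_drop:.1f}%")  # ~3.3%
print(f"780 drop-off: {gtx780_drop:.1f}%")  # ~3.9%
```

If the two percentages come out roughly equal, the AA feature is costing both vendors a similar slice of their baseline performance, which is the point being made above.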
 
Give or take, FXAA gives you the same performance hit Greg.
Bloody GW!! Always stealing FPS equally across both vendors; I won't stand for it.

This whole debacle started because of Batman: Arkham Origins, and the guy at ExtremeTech was insistent that nVidia were using "underhand tactics" to gimp performance on AMD hardware, claiming a 770 was beating a 290. Clearly, if that is happening, it isn't the GameWorks libraries at play, or nVidia gimping, or the devs refusing to allow AMD code; it seems more a case of AMD and others wanting to see something that isn't there.

I had some Shreddies this morning and I'm sure I saw Jesus in my milk, so it must be true :D
 