• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Huddy on Gameworks round 3

I find this "AMD vs Nvidia" rivalry so crazy - the HairWorks vs TressFX stuff - why do we care so much? These companies don't care about us; they couldn't give a crap! Yet others will defend them to the death, it seems. We pay them bonkers amounts of cash to play GAMES; they should be defending themselves.

With better drivers, more optimisations for features that are brand-dependent, more bloody stock, etc.

PC gaming and benchmarking bring us all together; were a console fanboy to post that his system is the best, we'd all join hands, regardless of GPU, in knocking some sense into him.

Yet we argue over GPUs endlessly, when we should just be talking about the pros and cons of both brands, because they both have them.

What's that... you have a 980 Ti? Awesome card and a beast overclocker! Hey, you have a Fury X, don't you? Awesome card, and you have that new HBM memory! Etc etc....
 
Unfortunately for the majority who do act like this, it's the vocal minority that ruins this forum. The solution? Like myself and others have done, just find another forum to post on.
 
Reading the main article again, I genuinely see this as poor form. I quite liked Huddy and didn't care about his mishaps, but this just stinks of desperation, and even the staunchest AMD fans must be holding their heads in their hands reading it. I never bother getting hooked into the politics of ethics, as you would need to be massively squeaky clean to start a smear campaign, and Huddy and AMD are not that squeaky clean. Lichdom is the last example I know of where AMD could use TressFX and Nvidia users couldn't, and AMD were the ones bleating on about open standards. Cringeworthy stuff, really. They release new GPUs that basically suck, and suck the money from your bank with the overpricing, but then feel it is right to whinge and whine at Nvidia. Shameless actions and desperation.

AMD need to sort their own **** out before they start pointing the finger.
 
Doesn't anyone else think it's also tragic that he felt the need to bring up Crysis 2 anyway? A game that came out in 2011, whose alleged overuse of tessellated objects went through the conventional DX11 pipeline and was implemented by the developers. So basically AMD are indirectly (yet really very much directly) accusing Crytek of foul play.


All we hear is negative press about GameWorks when it's the only real consistency in driving PC graphics out of the console spectrum. Where is the alternative from AMD? The supposed open library pool that Richard threw about on air still hasn't seen any real traction. So not only has that yet to exist (at least in the public eye), but we're still at the same crossroads we were at two years ago, whereby it is everyone else's fault.

The core SDK offers developers these tools and effects as an abstraction layer so that they're easy to manipulate and implement - effects that otherwise wouldn't have been included in the game. Some performance detriment is to be expected. Overuse of tessellation is something AMD users have been able to work around for some time now, so I'm not sure why the song and dance needs to continue the way it has. Tessellation has been around for years and the groundwork goes back even further, a lot of it derived from AMD themselves. AMD's tessellation engine just isn't as powerful as NVIDIA's. If these effects aren't game-critical, turn them off.

In the short term AMD need to stop crying about this, and in the long term, if it's hurting them, improve their own technologies and resources. If you are an NVIDIA user (and there is an ever-increasing chance that you are, given the current market), these effects are more often than not not too performance-detrimental, and they offer greater quality than what would be there without them. A lot of developers don't see the incentive to push PC graphical fidelity beyond what they are building for the lowest-common-denominator platform; NVIDIA are offering this off the shelf. As a consumer, this isn't damaging. Is this damaging to AMD? Tessellation might well be, but that isn't anything we didn't know already, and there isn't a shred of proof that other effects within the core SDK are. AMD need to give people answers instead of trying to get them outraged. It's difficult to say who they are trying to reach with these comments when a lot of users have heard them already - as if they expect their customer base to do something about it. DirectX 12 may very well, given enough time, dissolve the need for GameWorks anyway, so AMD could wait for that to happen, or wait for NVIDIA to stop injecting money into GW - which is another eventuality.

Nobody wants AMD's performance or their products to do badly. They just want them to do something! I went to buy three Fury X cards on launch day. Stock was gone within the hour, and still to this day there is nothing. I managed to get one, which had considerable pump noise that I decided to overlook anyway, as I did like the card. It's pleasing on the eye, and the performance is there, especially in XDMA. XDMA is simply better than SLI at the moment - this is just a fact - at least when it does work and there is a profile to be used. Sadly there's still nothing on the shelves, and I ended up with three EK blocks I've still not used. How then can AMD publicly make these farcical statements and expect anyone to take them seriously? Please prioritise and stop this nonsense.
 
Uhhhh, people still bring up Crysis 2 because NV have never answered for it, just like they never have to answer for anything; people just keep mindlessly buying whatever new cards they fart out (GM204).
 
Should really have a poll with options on whether people like GameWorks or not.

1. Yes (AMD GPU)
2. No (AMD GPU)
3. Yes (Nvidia GPU)
4. No (Nvidia GPU)

People should then post their reasons why they think that.

Not sure it's a good idea judging how this thread is going. :p
 
Uhhhh, people still bring up Crysis 2 because NV have never answered for it, just like they never have to answer for anything; people just keep mindlessly buying whatever new cards they fart out (GM204).

Why not blame the dev rather than Nvidia, and why do they have to answer to AMD?

Wouldn't it be better for AMD to focus on moving forwards than looking back?
 


Shame Huddy can't try hairworks :(


;);)
 
The Kepler cards such as the 780 Ti were faster than a 290X back in the early days, but somehow they are 10-15% slower now in the latest games. Why is that, when the 290X still manages to stay within 30-40% of a TitanX?

Let's hope we don't end up with just Nvidia cards to compare against in future, otherwise a year-old card will suddenly drop off in performance quickly when the next-gen card is released... if you know what I mean..

From Kaapstad's Firestrike table we can see that the GPU score for the Kepler Titan (780 Ti equivalent), which was faster than a 290X at launch, is less than half that of the TitanX, while the 290X is about 30% slower than a TX. Can anyone explain why, especially since the Nvidia cards are better at tessellation?

4 GPU Scores.


  1. Score 16755, GPU TitanX @1429/1977, GFX Score 18526, Physics Score 21519, Combined Score 8177, CPU 5960X @4.5, Kaapstad - Link Drivers 353.06
  2. Score 10980, GPU 290X @1220/1500, GFX Score 12217, Physics Score 21707, Combined Score 4392, CPU 5960X @4.5, Kaapstad - Link Drivers 14.12
  3. Score 10832, GPU 290X @1235/1500, GFX Score 12187, Physics Score 18143, Combined Score 4444, CPU 4930k @4.8, Kaapstad - Link Drivers 14.9
  4. Score 7335, GPU nvTitan @837/1502, GFX Score 7856, Physics Score 14488, Combined Score 3278, CPU 3930k @4.0, Kaapstad - Link Drivers 344.11

Don't forget he's ragging the 290Xs with 1200+ overclocks while the original Titan is only clocked at 837. Was that stock?
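The ratios being argued about are easy to sanity-check from the GFX scores in the list above (a quick sketch; the card labels are just shorthand for the quoted entries, not official product names):

```python
# Relative graphics (GFX) scores copied from Kaapstad's Firestrike list above.
gfx_scores = {
    "TitanX @1429": 18526,
    "290X @1220": 12217,
    "nvTitan @837": 7856,
}

baseline = gfx_scores["TitanX @1429"]
for card, gfx in gfx_scores.items():
    # Express each card's GFX score as a fraction of the TitanX result.
    print(f"{card}: {gfx} GFX, {gfx / baseline:.0%} of the TitanX")
```

That works out to the original Titan at roughly 42% of the TitanX (less than half, as claimed) and the 290X at about 66% (around a third slower, in the same ballpark as the "30% slower" figure quoted).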
 
Uhhhh, people still bring up Crysis 2 because NV have never answered for it, just like they never have to answer for anything; people just keep mindlessly buying whatever new cards they fart out (GM204).

That's rich coming from you.... How is your Titan X btw?
 
Spot on.

You can always tell an AMD fanboy when they refuse to buy Nvidia GPUs for "moral" reasons, yet happily run their Fury X with an Intel CPU on Windows, drink coffee from Starbucks, eat Nestlé products, wear clothes produced with child labour, drive a VW/Ford/GM/Mercedes car, eat food chemically treated by Monsanto, etc.
At the same time they are perfectly happy to buy from AMD, who have been caught cheating in drivers (ATI were the originators of dirty driver tricks back in the 'quack2' days), lied to investors, lied to customers about the Fury X pump whine only affecting reviewer cards and then lied again saying the problem was fixed in retail, and purposely withheld driver performance from the 290 series for months at a time so AMD could further peddle their lie about the 390 not being a rebrand.


AMD and Nvidia are both businesses that try to make money. AMD's marketing is woeful, always trying to play the victim and blame evil Nvidia for their own shortcomings. There is a very simple solution if AMD doesn't like GameWorks: they can go to the game developers and help them implement the same features with TressFX or whatever. But that costs resources; it is much easier just to shout garbage over the internet and rile up the fanboys.

It was quack3, but I do remember those days, and the later "shimmering" AF issues both parties had when trying to eke out a few more FPS.
 
Wasn't the original issue/argument with GameWorks that it was closed DLL files that couldn't be tweaked/optimised and integrated into the games better by the developers, whereas the AMD equivalent is open source and therefore accessible by the devs? I could be wrong, and don't particularly care, but this is why Nvidia get a hard time about GameWorks: it's not helping PC gaming in general, it just screws games over for everyone, even their own users on occasion.
 
A quick google on GameWorks:
https://developer.nvidia.com/what-is-gameworks
Many NVIDIA GameWorks components including tools, samples and binaries are freely available to all developers. For other binary or source code access please contact us at Game Works Licensing: [email protected]

Also to quote Tim Sweeney from Epic and the Unreal 4 Engine.
Ultimately, GameWorks is game middleware that’s written in code, just as Havok Physics, SpeedTree, Unreal Engine 4, Unity, and WWise are game middleware that’s written in code. Often middleware providers share their code with hardware vendors and invite them to provide feedback on optimization for their hardware or to write actual code for inclusion in the middleware. At Epic, we have often done this with key hardware partners including both NVIDIA and AMD.

However, there’s not a general expectation and certainly not an obligation that a middleware providers shares their code with hardware vendors, or accept optimizations back. There are legitimate reasons why they may choose not to, for example to protect trade secrets.

Nowadays, some game middleware packages are owned or supported by hardware companies, such as Intel owning Havok Physics, Nvidia owning PhysX and GameWorks, and AMD’s past funding and contributing to the development of Bullet Physics. Here IHVs are investing millions of dollars into creating middleware that is often provided to developers for free. It’s not necessarily realistic to expect that a hardware company that owns middleware to share their code with competing hardware companies for the purpose of optimizing it for their hardware, especially when hardware vendors are often involved in competing or overlapping middleware offerings.

Game and engine developers who choose middleware for their projects go in with full awareness of the situation. With Unreal Engine 4, Epic Games is a very happy user of Nvidia’s GameWorks including PhysX and other components.
 