
Huddy on Gameworks round 3


The Witcher 3 Dev says Nvidia HairWorks Unoptimizable for AMD GPUs

Read more: http://wccftech.com/witcher-3-dev-nvidia-hairworks-unoptimizable-amd-gpus/#ixzz3hkbAOqUb

It's not straightforward when it comes to optimising for AMD. AMD do seem to get there in the end, but by the time the reviews and performance charts are released the damage is already done, as AMD are usually well behind Nvidia. This is the big advantage GameWorks gives to Nvidia.
 
The Witcher 3 Dev says Nvidia HairWorks Unoptimizable for AMD GPUs

Read more: http://wccftech.com/witcher-3-dev-nvidia-hairworks-unoptimizable-amd-gpus/#ixzz3hkbAOqUb

It's not straightforward when it comes to optimising for AMD. AMD do seem to get there in the end, but by the time the reviews and performance charts are released the damage is already done, as AMD are usually well behind Nvidia. This is the big advantage GameWorks gives to Nvidia.

Yet now there are HairWorks options for AMD users in the game, and AMD's own driver updates have since significantly reduced the performance loss.
It's a bit hard to optimise for AMD when it's the hardware that's lacking (tessellation performance).

Two Fury Xs can match and beat 980 Ti SLI with HairWorks maxed in The Witcher 3 at 1440p in places.
Let's also not forget that AMD's tessellation processing is already lacklustre compared to NVIDIA's.
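To put a rough number on why the tessellation factor matters so much: the geometry a tessellator emits grows very quickly with the factor. A back-of-envelope sketch (this assumes a uniform integer factor on a triangle patch, where the sub-triangle count is on the order of the factor squared; hair rendering uses isoline tessellation, which scales differently, so treat this as illustrative only):

```python
# Rough cost model: a triangle patch with a uniform integer
# tessellation factor n is subdivided into on the order of n**2
# sub-triangles (illustrative approximation, not an exact count).
def triangles_per_patch(factor):
    return factor ** 2

for factor in (8, 16, 64):
    print(f"factor {factor:2d}: ~{triangles_per_patch(factor)} triangles per patch")

# Capping a title's x64 factor to x16 in the driver cuts the
# geometry by, under this model:
print(triangles_per_patch(64) // triangles_per_patch(16))  # 16
```

Which is why a driver-side tessellation cap (the CCC setting mentioned below) recovers so much performance on hardware with weaker tessellation throughput.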

Even without adjusting tessellation amounts in CCC, AMD cards do much better with current AMD drivers and Witcher 3 patches.

AMD are behind because in parts their hardware is lacking and in others they're not working with developers.

In the case of Project Cars they stopped responding to the developers last year, despite the devs trying to get in contact with them. As such AMD "couldn't" optimise their drivers, and the developers couldn't do the same for the engine and their hardware.


http://wccftech.com/slightly-mad-st...working-fix-performance-issues/#ixzz3hkeNTUMB

http://www.kitguru.net/components/g...-dev-responds-to-amd-performance-accusations/
They were invited to work with us for years, looking through company mails the last I can see they (AMD) talked to us was October of last year.

Magically AMD suddenly had drivers to fix everyone's woes. Turned out they withheld these drivers to make the 3xx series look better and sell. Many people modified the drivers to run on the 200 series and got amazing performance.
 
Uhhhh people still bring up Crysis 2 because NV have never answered for it, just like they never have to answer for anything, people just keep mindlessly buying whatever new cards they fart out (GM204).

Ask Roy Taylor, he worked at NVIDIA at the time. I tried and he ignored the question. Highly tessellated water under the level map isn't something NVIDIA would have done by themselves, nor are any of the tess factors within the game. It's all a direct result of source code written by Crytek.


Also, yes, people tend to buy cards no matter how they're 'farted out' if they're quicker than what they have by a considerable margin. It's not NVIDIA's problem if some people sit in their chairs and snigger purely because someone gave them a glimpse of a bigger die on the horizon. It doesn't make you any less of an idiot just because you've seen someone link a shipping manifest for a product not due to launch for another five months.
 
I find it funny how typical Nvidia shills start attacking Huddy's character instead of his claims. Is what he's saying wrong? Clearly not, as has been evident over the past year with multiple Nvidia screw-ups (the 970 debacle, Kepler issues, GameWorks game issues, etc.), which the brown-nosers in the first reply nonetheless so vehemently defend, even though you can bet they don't get paid for it. It's contrary to their own self-interest, even, because they're helping the company get away with making poor decisions without being questioned. But what can you do, eh?
 
I find it funny how typical Nvidia shills start attacking Huddy's character instead of his claims. Is what he's saying wrong? Clearly not, as has been evident over the past year with multiple Nvidia screw-ups (the 970 debacle, Kepler issues, GameWorks game issues, etc.), which the brown-nosers in the first reply nonetheless so vehemently defend, even though you can bet they don't get paid for it. It's contrary to their own self-interest, even, because they're helping the company get away with making poor decisions without being questioned. But what can you do, eh?

:D
 
The Kepler cards such as the 780 Ti were faster than a 290X back in the early days, but somehow they are 10-15% slower now in the latest games. Why is that, when the 290X still manages to stay within 30-40% of a Titan X?

Let's hope we don't end up with just Nvidia cards to compare against in future, otherwise a year-old card will suddenly drop off in performance when the next-gen card is released... if you know what I mean.

From Kaapstad's Fire Strike table we can see that the GPU score for the Kepler Titan (780 Ti equivalent), which was faster than a 290X at launch, is less than half the performance of the Titan X, but the 290X is only about 30% slower than a TX. Can anyone explain why, especially since the Nvidia cards are better at tessellation?

4 GPU Scores.


  1. Score 16755, GPU TitanX @1429/1977, GFX Score 18526, Physics Score 21519, Combined Score 8177, CPU 5960X @4.5, Kaapstad - Link Drivers 353.06
  2. Score 10980, GPU 290X @1220/1500, GFX Score 12217, Physics Score 21707, Combined Score 4392, CPU 5960X @4.5, Kaapstad - Link Drivers 14.12
  3. Score 10832, GPU 290X @1235/1500, GFX Score 12187, Physics Score 18143, Combined Score 4444, CPU 4930k @4.8, Kaapstad - Link Drivers 14.9
  4. Score 7335, GPU nvTitan @837/1502, GFX Score 7856, Physics Score 14488, Combined Score 3278, CPU 3930k @4.0, Kaapstad - Link Drivers 344.11
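Just to sanity-check the percentages quoted above, here is a quick calculation using the GFX sub-scores copied from the list (Fire Strike GFX scores, with the Titan X as the baseline):

```python
# GFX sub-scores taken from the Fire Strike table above.
gfx = {
    "TitanX": 18526,
    "290X": 12217,
    "Kepler Titan": 7856,
}

baseline = gfx["TitanX"]
for card, score in gfx.items():
    print(f"{card}: {score / baseline:.0%} of TitanX")
# 290X comes out around 66% of the Titan X (about 30% slower),
# while the Kepler Titan is around 42%, i.e. less than half.
```

So both claims check out against the table: the 290X sits roughly 30% behind, while the original Titan is under half the Titan X's GFX score.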



I wouldn't read too far into 3DMark scores, but the Titan X at those clocks nowadays scores a lot higher than it did then. Likewise, the 290X has been brought up over time with drivers; the 780 Ti, not so much.

I would guess that because AMD don't have the money to churn out as many cards as Nvidia do, they put more resources into trying to eke out more performance from what they had. Just compare original 290X benchmarks to the 390X reviews (minus a bit for the clock speeds) to get an idea.

For Nvidia, the 780 Ti is now old news. Look at the 970 scores; that's where it should be at.
 
I find it funny how typical Nvidia shills start attacking Huddy's character instead of his claims. Is what he's saying wrong? Clearly not, as has been evident over the past year with multiple Nvidia screw-ups (the 970 debacle, Kepler issues, GameWorks game issues, etc.), which the brown-nosers in the first reply nonetheless so vehemently defend, even though you can bet they don't get paid for it. It's contrary to their own self-interest, even, because they're helping the company get away with making poor decisions without being questioned. But what can you do, eh?

Provide proof. You're fitting a stereotype of saying 'GW issues' without really understanding what you're saying. What issues does GW bring? Break it down for me. No vague media statements either...I would wager you can't come up with anything that is a direct result of GW other than poor tessellation performance.

I don't think anyone is attacking his character, either.
 
Who's attacked his character?
He was asked a question about GameWorks and his answer was something about Crysis 2, which was released before the suite of GameWorks tools even existed.
There's no need to attack his character when he can't even form a coherent argument.

If AMD wanted to do positive advertising, he could have just talked about supporting open standards; there was no need to say the things he said, and it just looks unprofessional.

NVIDIA have 80%+ market share, you don't win people back by whinging about the competition, you win them back by talking up the good points of your own products and showing real results.
 
from the last time there was a thread about this, I don't think many people really care about GameWorks anyway

but I guess it only takes one good game
what's the next big GameWorks game? ^^;
 
Some people will defend nvidia to the hilt Gerard. :)

They offer the best gaming experience, that's why. If AMD had their way, NVidia would be as lazy as they are, and we'd all lose rather than just AMD owners.

Huddy claiming that GameWorks is hurting NVidia's older cards, well, that's just the price of progress, something AMD haven't made in about four years when it comes to their primary bugbear, tessellation; it's reached the point where they are now cheating in drivers to obtain satisfactory performance. The GTX 970/980 are considerably better architectures than Kepler, despite what most people seem to think, so it's understandable that Kepler is struggling more in recent games.

The state of play is pretty much this:

- Developers develop games to the DirectX spec; sometimes they put in time optimising for different vendors' GPUs, sometimes they don't. Their responsibility is and always has been to make DirectX-compatible games, nothing more.

- NVidia go out of their way to ensure the majority of games are 100% optimised for their GPUs, and some even include middleware they have developed.

- AMD don't do any of that except on a tiny number of sponsored games or AAA titles (GTA V); instead they sit inside their HQ portraying what NVidia is doing as a bad thing, because they don't have the same level of commitment to their customers.

- AMD attempted to offload the responsibility of optimising for their GPUs onto developers by developing Mantle; developers who weren't paid to promote it were understandably cold towards it, because it just meant more work for them with zero benefit.
 
I personally don't like GameWorks!

From my experience, the majority of games it's been in have run poorly, usually noticeably so on release. That goes for both AMD and Nvidia, but usually more so for AMD, and that comes down to AMD cards not being as strong as Nvidia's when it comes to tessellation.

The reason I don't like GameWorks is that it's not optimised for AMD cards, and Nvidia have said as much: it's up to AMD to optimise. That's difficult for AMD to do, as they are running blind in games which incorporate GameWorks, and it takes them a vast amount of time (let's be honest, AMD are not quick with driver updates). Really it's up to the dev to optimise GameWorks for AMD cards, which usually never happens, as devs leave most of the optimisation to the driver teams. The devs are the ones with the most control and the most ability to optimise GameWorks for either AMD or Nvidia.

GameWorks usually comes already somewhat optimised for Nvidia cards and gets further optimisation from the devs, usually due to their contract to use GameWorks. That still doesn't explain why their games end up running poorly, nor why the games are usually over-tessellated, which hurts older Nvidia cards' performance as well as AMD's.

When I see games with GameWorks on and off, it does add that little bit of extra detail/graphical candy, whether it's realistic or not, but usually I don't think it's worth the performance cost. A lot of devs like GameWorks, though, as it's less work than developing their own graphical features, so they use it.

The end result is usually a game that looks a little better to the eye at a great cost to performance, anywhere from 10% FPS lost and up, usually more on AMD cards. And the majority of the time the games don't really run that well on release. That's my experience with GameWorks and the reason I don't like it. And I see a lot of people not actually fussed about GameWorks.

Sleepy sorta rant over.
 
personally I think it will just turn more people off AAA titles
well, until the effects are used well & are worth the FPS hit
which hasn't happened yet for me...
maybe Black Flag? not sure

AMD shouldn't care about these games so much; support the smaller games that often become crazy popular anyway
they're just banging their head against a wall
 
I find it funny how typical Nvidia shills start attacking Huddy's character instead of his claims. Is what he's saying wrong? Clearly not, as has been evident over the past year with multiple Nvidia screw-ups (the 970 debacle, Kepler issues, GameWorks game issues, etc.), which the brown-nosers in the first reply nonetheless so vehemently defend, even though you can bet they don't get paid for it. It's contrary to their own self-interest, even, because they're helping the company get away with making poor decisions without being questioned. But what can you do, eh?

What, is English not your native language? Huddy's rants have been disproven countless times, by the very game developers no less.

E.g. Huddy's previous rant about Project CARS, which the fanboys lap up like a preacher offering an eternity in heaven.
http://wccftech.com/slightly-mad-studios-issues-official-statement-project-cars-amd-graphics-cards/
 
The Witcher 3 Dev says Nvidia HairWorks Unoptimizable for AMD GPUs

Read more: http://wccftech.com/witcher-3-dev-nvidia-hairworks-unoptimizable-amd-gpus/#ixzz3hkbAOqUb

It's not straightforward when it comes to optimising for AMD. AMD do seem to get there in the end, but by the time the reviews and performance charts are released the damage is already done, as AMD are usually well behind Nvidia. This is the big advantage GameWorks gives to Nvidia.



That is because no one can optimise away AMD's poor tessellation performance. That is a hardware issue that AMD will need to resolve in the future.



Source code makes absolutely no difference. For the vast majority of games, Nvidia and AMD don't get any source code, full stop, and they don't need it. Game developers will send a pre-release binary to Nvidia/AMD, where they can reverse-engineer the draw calls, because guess what: every single draw call emanating from the game has to go through the friggin' driver that Nvidia/AMD wrote. It makes absolutely no difference whether that draw call was a result of GameWorks, TressFX or something the developer wrote.



If AMD are too incompetent to optimise their drivers without seeing source code, then they simply should not be in the business of selling GPUs, and they should do the honourable thing and hand over the reins to a company that has more than a brain-dead capuchin monkey at a keyboard.
The fact is, that is pure lies. AMD are very definitely capable of optimising GameWorks titles without GameWorks source code, and they can do that long before the game is ever launched by working with the game developer. But here's the thing: AMD don't. They simply don't spend the resources and don't manage developer relationships to anywhere near the extent that NVidia does. Instead they find it much cheaper to shove Huddy on stage to spout PR nonsense that the AMD diehards can regurgitate on forums.
 
personally I think it will just turn more people off AAA titles
well, until the effects are used well & are worth the FPS hit
which hasn't happened yet for me...
maybe Black Flag? not sure

AMD shouldn't care about these games so much; support the smaller games that often become crazy popular anyway
they're just banging their head against a wall

It is the AAA games that are going to be less likely to use GameWorks features in the future, because they have the resources to develop those technologies themselves.

Middleware systems like GameWorks will become more and more popular with smaller indie devs who don't have the resources to implement such features and who would gain the most from a big helping hand.


This is going to be even more the case with DX12: a low-level API may be desirable to the top few engine developers, but it is a big step backwards for many of the smaller outfits, because it will increase development costs. Middleware platforms will become ever more important for indies.

Even without DX12, this is the natural direction to counter the ever-increasing costs of game development.
 
some will, no doubt
but a lot of indie developers have more pride in their work lol
they don't have the hype to sell, so their games need to be good, or at least work & hopefully be fun!

if they got free advertising for it, I'm sure some would tho ^^;
 
some will, no doubt
but a lot of indie developers have more pride in their work lol
they don't have the hype to sell, so their games need to be good, or at least work & hopefully be fun!

if they got free advertising for it, I'm sure some would tho ^^;

Some also don't give a **** at all; just look at the vast majority of them. Look up Slaughtering Grounds and its developer Digital Homicide for a prime example.

Also, one indie game I know of that is using GameWorks is Ark: Survival Evolved.
 
some will, no doubt
but a lot of indie developers have more pride in their work lol
they don't have the hype to sell, so their games need to be good, or at least work & hopefully be fun!

if they got free advertising for it, I'm sure some would tho ^^;

It's got nothing to do with pride, and if it did, they absolutely would be using GameWorks or other middleware to improve their games.


GameWorks effects are actually highly regarded by developers, which is why they are being used: the effects are beyond the capabilities of the developers under their resource and time constraints. You can't just whip up a decent physics engine or facial-animation code on a Friday afternoon before heading to the pub. A lot of solid engineering has gone into GameWorks, and it shows.
 