
CDPR W3 on HairWorks (Nvidia GameWorks) - 'the code of this feature cannot be optimized for AMD'

Can of worms opening time:

http://arstechnica.co.uk/gaming/201...s-completely-sabotaged-witcher-3-performance/

Supposedly, earlier builds of the game worked fine until, two months before release, the HairWorks libraries were included, which led to the current controversy. Apparently, AMD did offer TressFX, but by then it was supposedly too late.

Apparently? Offered? What's that, like a passing gesture?

Simply disable HairWorks, problem solved. Even NVIDIA users are resorting to doing this on some cards. Out of the numerous games that could have benefited from TressFX, we have only one. You can't possibly argue that this is anyone's fault other than AMD's. I actually think TressFX IS better than HairWorks, so why have we seen nothing of it?

Ultimately, though, there's an additional amount of time and cost attached to including two very different types of technology to produce largely the same effect. According to AMD's Huddy, the company "specifically asked" CD Projekt Red if it wanted to put TressFX in the game following the revelation that HairWorks was causing such a large drop in performance, but apparently the developer said 'it was too late'.

Why is it always someone else's fault? The only real argument is whether developers should actively choose to use these libraries. This is the silent part of the argument, and why it's really a choice between the lesser of two evils. On the one hand you've got TressFX: the source is completely open, and anyone can optimise the code. But these things take time. Time is money, and then you have these libraries that Nvidia can offer in conjunction with a financial partnership, with toolsets that make these technologies far easier to implement.


So yes, as the article states, these things do take time - and judging by the uptake of TressFX, and the general lack of public shader libraries produced by third parties, it's not financially viable in general for companies to implement these effects.
 
Ultimately, though, there's an additional amount of time and cost attached to including two very different types of technology to produce largely the same effect. According to AMD's Huddy, the company "specifically asked" CD Projekt Red if it wanted to put TressFX in the game following the revelation that HairWorks was causing such a large drop in performance, but apparently the developer said 'it was too late'.

I bet the developers have absolutely no recollection of that discussion, but obviously they aren't going to respond like the Project Cars developer did to his tight-knit community. You could write a book on excuses with the stuff coming out of AMD after every game release.
 
You can blame Nvidia all you want; AMD clearly want to shift the blame over to them.

Let's face it, if they had played Nvidia at their own game this whole BS would have ended long ago.

Good guys come last.

They should do what's best for their end users, not play some stupid morality card.
 
I bet the developers have absolutely no recollection of that discussion, but obviously they aren't going to respond like the Project Cars developer did to his tight-knit community. You could write a book on excuses with the stuff coming out of AMD after every game release.

I literally just read this.

http://www.tweaktown.com/news/45296...ds-amd-performance-sabotage-claims/index.html

This is what we need: developers coming out of the woodwork and saying their piece.

Project Cars isn't part of the GW program, so it's a different case. But either way, we need less finger-pointing and more doing.
 
Thought the CDPR dev clearing up any confusion/NDF denial, and confirming the zero possibility of GameWorks optimization for AMD GPUs, was clarity enough, but here we are. :D
 
I don't get all the bellyaching. Performance seems about the same with the NV stuff turned off. Why should AMD expect to have equal performance if NV put the effort in and work with the developers to implement their IP? At least it's not as polarising as Mantle; GameWorks offers something to everyone.

I see Huddy is running his mouth off again; perhaps they should have offered TressFX a bit sooner than a few months before release.
 
Funny how when Huddy was at Nvidia he was a hero to a number of people here, but funny how times change.

But funny how some here are still not saying anything about the terrible performance on Kepler cards, which until late last year were the main cards Nvidia were selling at the high end, and which until the GTX960 was released were the volume cards this year. I'd still expect more people to be on Kepler-based cards than Maxwell ones, even with plenty of new purchases in the last 9 months, especially of the GTX970/GTX980 SKUs. Even the anaemic GTX960 is making cards like the GTX770 and GTX780 look dumb.

Yet those older cards are still supposed to have better tessellation performance than the equivalent AMD ones too. It's shown in many of the synthetic benchmarks.

Maybe Nvidia needs to offer a tessellation reduction option for Kepler??

:p

"Expects circular responses and extreme deflection from now onwards."
 
I don't get all the bellyaching. Performance seems about the same with the NV stuff turned off. Why should AMD expect to have equal performance if NV put the effort in and work with the developers to implement their IP? At least it's not as polarising as Mantle; GameWorks offers something to everyone.

I see Huddy is running his mouth off again; perhaps they should have offered TressFX a bit sooner than a few months before release.

Because AMD are ******* spongers and expect everything for nothing, then when it does go pear shaped, blame everyone else for their incompetence.
 
I look forward to Batman: Arkham Knight big time and I am sure this debate will be extended to that thread. A shame really; it seems Nvidia pushing on to get new effects that do look very good in games doesn't suit some people. I feel at times it's bad enough playing second fiddle to the consoles. If GameWorks effects don't run so well on your system, then just turn them off, as recommended by the AMD rep here.
 
Funny how when Huddy was at Nvidia he was a hero to a number of people here, but funny how times change.

But funny how some here are still not saying anything about the terrible performance on Kepler cards, which until late last year were the main cards Nvidia were selling at the high end, and which until the GTX960 was released were the volume cards this year. I'd still expect more people to be on Kepler-based cards than Maxwell ones, even with plenty of new purchases in the last 9 months, especially of the GTX970/GTX980 SKUs. Even the anaemic GTX960 is making cards like the GTX770 and GTX780 look dumb.

Yet those older cards are still supposed to have better tessellation performance than the equivalent AMD ones too. It's shown in many of the synthetic benchmarks.

Maybe Nvidia needs to offer a tessellation reduction option for Kepler??

:p

"Expects circular responses and extreme deflection from now onwards."

Kepler performance is a completely separate issue though. Although you're doing some deflecting yourself there.


Thought the CDPR dev clearing up any confusion/NDF denial, and confirming the zero possibility of GameWorks optimization for AMD GPUs, was clarity enough, but here we are. :D

Well, what can one say? If you're happy repeating failsafe corporate responses instead of discussing the issue properly, there's no real reply to that comment, Tommy. Strange how that seems to have been a trend since the new rules kicked in. Perhaps you have nothing to add, then.
 
Kepler performance is a completely separate issue though. Although you're doing some deflecting yourself there.

Yet no one was ripping into him when he was at Nvidia or Intel doing his standard PR stuff at those companies, were they?? It's not like Nvidia weren't doing the same with their PR people. I remember a couple of years ago Nvidia and AMD PR were having a ****ging match in one thread on Hexus, which was bloody hilarious.

I only posted that stuff with "supposedly" since it's not a solid claim, and he is known for being somewhat colourful at times.

But claims were made that AMD's crappy tessellation, tessellation cheats, rubbish drivers and rubbish dev relations are to blame for this mess with W3. Yet the Kepler cards have similar issues despite better tessellation, better dev relations and better drivers. Yet it's all been hidden away.

However, I do believe AMD need to get their arse in gear with hardware and software, since they are still stuck on an old product stack, especially for mobile, and I fear more rebrands for the midrange, which would be lame if we're still stuck with DX11 cards that are three years old. At least we can mock AMD for it instead of Nvidia for the G92 now, LOL.
 
I saw this gif on the Nvidia forums showing the difference between AMD and Nvidia lol (the trolling is strong in those forums :D):
[image: wVHJusH.gif]
 
I don't understand why the game devs haven't put in a slider to control tessellation, unless a) they never thought of it or b) were convinced by a third party not to.
 
I don't understand why the game devs haven't put in a slider to control tessellation, unless a) they never thought of it or b) were convinced by a third party not to.

I don't think it's possible to work it that way, because further down the pipeline (I'm referring to the graphics pipeline) it will cause problems.
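For what it's worth, a cap like that doesn't have to touch the pipeline itself - AMD's own driver control panel already exposes a maximum tessellation level override, and a game could do the same by clamping the factor it feeds the hull shader. A minimal C++ sketch of the idea (the names g_maxTessFactor and ClampTessFactor are made up for illustration, not from any real engine):

    #include <algorithm>
    #include <cstdio>

    // Hypothetical value driven by an in-game slider. 64 is the D3D11
    // hardware maximum tessellation factor, which is reportedly what
    // HairWorks in W3 requests by default.
    static float g_maxTessFactor = 16.0f;

    // Clamp whatever factor the effect asks for before it is written into
    // the constant buffer that the hull shader reads each frame.
    float ClampTessFactor(float requestedFactor)
    {
        return std::min(requestedFactor, g_maxTessFactor);
    }

    int main()
    {
        float requested = 64.0f;  // e.g. an asset asking for the full 64x
        std::printf("requested %.0fx, submitting %.0fx\n",
                    requested, ClampTessFactor(requested));
        return 0;
    }

That's essentially all the Catalyst tessellation override does on AMD's side, which is why capping it at 8x or 16x was being suggested as the workaround for HairWorks performance.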
 
That is a pre- and post-Nvidia-involvement shot of a game: the awesome-looking one is the dev making the game on their own, and the latter is what happens after Nvidia get involved.


What people really seem to be failing to grasp are two fundamental things. Beyond a certain amount of tessellation there is absolutely no IQ increase. We're talking about very small pieces being drawn on screen; subdividing small pieces into two even smaller pieces has diminishing returns, and extremely quickly you literally cannot tell the difference. One company seeing a way to win in synthetic benchmarks by adding extra tessellation hardware, to deal with levels of tessellation that can't be seen but CAN hurt performance, is absurd and completely pointless. However, it enables Nvidia to win some synthetic benchmarks, which makes certain ignorant people believe it's magically going to be miles faster.
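To put some rough, illustrative numbers on that diminishing-returns point (the 256-pixel patch size below is my own assumption, just to make the arithmetic concrete):

    #include <cstdio>

    int main()
    {
        // Suppose one tessellated patch ends up covering ~256 pixels on screen.
        const double patchPixels = 256.0;

        // An edge factor F splits the patch into roughly F*F triangles, so each
        // triangle covers about patchPixels / (F*F) pixels.
        for (int f = 4; f <= 64; f *= 2)
        {
            const double tris = static_cast<double>(f) * f;
            std::printf("factor %2d: ~%4.0f triangles, ~%.3f px per triangle\n",
                        f, tris, patchPixels / tris);
        }
        return 0;
    }

By factor 16 the triangles in this example are already down to about a pixel each; pushing on to 64x multiplies the geometry work sixteen-fold while the triangles shrink to a sixteenth of a pixel - exactly the kind of work you can't see.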

Let's not forget, Nvidia used this tessellation advantage to improve game IQ massively.... wait, no, they used this perceived advantage to pay devs to over-tessellate things such as flat bollard surfaces, and an ocean that is rendered UNDER the map where no user can see it. This is what massive amounts of over-the-top tessellation performance gets you: instead of offering genuinely better IQ options, Nvidia paid devs to use massively higher settings than required, which no user could actually see the benefit of.

Again, Nvidia users got lower performance in Crysis 2 (amongst other games) so that hidden objects and flat surfaces could be over-tessellated. To win a stupid contest, Nvidia hurt their own owners' performance just to try and win one over on AMD.
 