
CDPR W3 on HairWorks (Nvidia GameWorks) - 'the code of this feature cannot be optimized for AMD'

Let's recap a little: what exactly is it you AMD guys are not happy about?

Because HairWorks uses line tessellation / geometry that isn't ideal for AMD cards in the first place: tessellation is increased and the potential size of those fragments (polygons) gets smaller. This is where AMD cards fall off a cliff, because their pipeline is crap.
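To put a rough number on the "fragments get smaller" point, here's a back-of-envelope sketch (my own illustration, not HairWorks code): the fixed on-screen strand length and the one-segment-per-tess-factor mapping are assumptions purely for demonstration.

```cpp
// Sketch: if a hair strand covers a roughly fixed number of pixels on screen,
// raising the tessellation factor just splits that same coverage into more,
// smaller primitives. Values below are assumed, not taken from HairWorks.
#include <cstdio>

int main() {
    const float strand_pixels = 128.0f;          // assumed on-screen length of one strand
    for (int tess_factor : {8, 16, 32, 64}) {    // DX11 caps the factor at 64
        float segments = static_cast<float>(tess_factor); // assume ~factor segments per isoline
        float pixels_per_segment = strand_pixels / segments;
        std::printf("tess factor %2d -> ~%4.0f segments, ~%5.2f px per segment\n",
                    tess_factor, segments, pixels_per_segment);
    }
    return 0;
}
```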

So the defence presented to you is basically along the lines of "it's too much, please stop, you're doing it on purpose. Mummy, tell him."

And as you can see from above, he doesn't like having his religious beliefs crushed.
 

Honestly Silent Scone, if that's all it is, that's ****ing pathetic.
 
That, and seemingly people think there would be some huge gains in performance by giving AMD access to the GW source code, when in actual fact, once the geometry is processed it's ultimately massively down to the hardware, and unfortunately over to you, AMD...
 
It's quite bullyish behaviour. It's like the kid at school being jealous of someone else having something they don't have, then bullying them to get it.
 
I wouldn't mind running an AMD R9 right now. I would have control over this "issue", and as long as tessellation is the only thing affecting performance I wouldn't care one bit, imho. I've got my tess slider and I could give Nvidia the finger IF this was really their plan to gimp performance through overuse of tessellation.

I just can't believe that no one at Nvidia was informed of the tess slider in CCC. I would think they keep pretty good tabs on each other. So here's my personal theory of what has happened.

CDPR had been working on The Witcher 3 for a long time, including an implementation of what is now known as HairWorks, but on an older build. That build was not running the same amount of tessellation and/or the same feature set as the new build they received two months before release. The newest build was only properly play-tested on the GTX 900 series cards. This is all based on what everyone (AMD, CDPR and Nvidia) has said so far, and assuming the dates they claim are true.

So if I were to play the stupid blame game, I would say the game dev (CDPR) has the main responsibility for what goes into their game. They made the choice to include GameWorks, and it doesn't matter whether or not Nvidia upped the tessellation factor with the intent to gimp the performance.

But I'm also surprised that Nvidia has not included a control for the amount of tessellation in their control panel yet, considering AMD has one. I thought they pride themselves on offering "more", or at least the same amount of features as AMD.

And for the last bit: if Nvidia really did up the performance requirements for the sole purpose of gimping everything that isn't Maxwell-based, well, sure, that would be classified as a d*ck move. One could even spin it to make some sense, seeing how gullible a lot of current Nvidia users are, as they would happily upgrade to the newest GeForce even after being told, and shown, that their beloved company's own tech is the reason behind the major performance drop.

In the end it's just a game; that's the easiest way to look at it without getting heart failure from stress. This will continue to happen, on and off, between Nvidia and AMD and whoever else makes video cards in the future.

TL;DR: Don't preorder, wait for independent PC reviews, vote with your wallet. Live a more worry-free life this way.
 
The kid at school just wants a fair situation where games have an equal opportunity to run well on hardware from ALL vendors.
 


There is a tessfactor limitation - NVIDIA can't surpass this as it's controlled by DirectX. The maximum amount of tessellation that can be applied is 64x. This would have been applied in the geometry source data.

So: Developer > Hull Shader input > Tess factor (1-64x) > Tessellator > Output.

Developer's problem. In fact, the changes in 1.04 might reduce tessellation.
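For what it's worth, the 64x cap being referred to is exposed in the Windows SDK headers as D3D11_TESSELLATOR_MAX_TESSELLATION_FACTOR. A minimal illustrative snippet (host-side arithmetic only; the real clamp happens in the fixed-function tessellator, not in app code):

```cpp
// Illustration of the DX11 cap: whatever factor the hull shader asks for is
// limited to D3D11_TESSELLATOR_MAX_TESSELLATION_FACTOR (64) before the
// tessellator runs. Requires the Windows SDK for d3d11.h.
#include <d3d11.h>
#include <algorithm>
#include <cstdio>

int main() {
    const float factors_requested[] = {16.0f, 32.0f, 64.0f, 128.0f}; // 128 exceeds the API limit
    for (float requested : factors_requested) {
        float applied = std::min(requested,
                                 static_cast<float>(D3D11_TESSELLATOR_MAX_TESSELLATION_FACTOR));
        std::printf("requested %6.1f -> tessellator uses %5.1f\n", requested, applied);
    }
    return 0;
}
```

Anything above 64 simply gets clamped, which is why the factor itself isn't something a vendor can push past the API limit.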
 

I will admit, in this instance I have no idea how DirectX works on the inside. My thoughts are based on my own logic, and of course from time to time I'm wrong :) no shame in admitting such things. That said, patch 1.04 did nothing for my performance, and again, if AMD is able to control the tessellation factor through their control panel, so should Nvidia, imho.
 
Neither did I, which is why I decided to read up on the DX11 tessellation process and ask someone who's an actual DX programmer. That's half the battle, wading through people going backwards and forwards. And TBH I've nothing against what you've contributed since this started - for example, you were one of the first people I read who actually rolled back their drivers to test performance. Something a lot of people didn't even bother trying before screaming bloody murder :)
 

There's a lot of people just screaming and having a pop who don't even have the game, lol. Like moaning about something that is clearly a deflection of AMD's responsibilities, having slap-head Huddy come out with ****ing drivel, then the minions going along with it as they have **** all better to do with their time. Their cards are ****e, the company is ****e, but don't accept that, try and pin it on someone else.
 

I don't agree with you saying the cards are bad. I think the Radeon R9s are fantastic cards and, GameWorks aside, they actually give the 980 a run for its money at only 60% of the price, especially the higher you turn the resolution. Yes, the cards have one weak point in the performance area, and that is tessellation, but the cards are also 2½ years old. I'm pretty sure, now that tessellation is going to be a bigger part of games coming out, the company is going to correct that weakness.
 
They're not bad at all, but the spin surrounding it is massive. It's been blown way out of the water, and I don't condone the way AMD have let people speak out on their behalf in a manner that makes them seem desperate.

"Look, look what NV are doing - this isn't right, it's not fair on ourconsumers".

Last year Huddy made the argument that using line tessellation is 'extremely inefficient', but what he didn't hasten to add were the words "for our GPUs, as our pipeline does not cope well when the polygon size reaches that point. NVIDIA handle this process better than we do".

If the people hosting the No BS podcast were quick enough, what they should have said is "well what is new?"

AMD/ATI used to even issue a warning on Northern Islands and other chipsets that high tessellation often resulted in smaller polygons, and there came a point where finely detailed polygons of 'n' pixels in size resulted in a significant loss in performance.
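Rough illustration of why those tiny polygons hurt: rasterizers shade pixels in 2x2 quads, so a triangle covering only a pixel or so still pays for at least a full quad. The "one quad minimum per triangle" model and the coverage numbers below are crude assumptions for illustration, not measurements of any particular GPU.

```cpp
// Crude model: as triangles shrink toward (and below) pixel size, the number
// of pixels actually shaded grows well past the mesh's screen coverage,
// because each triangle launches at least one 2x2 quad (4 shaded pixels).
#include <algorithm>
#include <cstdio>

int main() {
    const float mesh_pixels = 10000.0f;  // assumed total on-screen coverage of one mesh
    for (float tri_pixels : {64.0f, 16.0f, 4.0f, 1.0f, 0.25f}) {
        float triangles = mesh_pixels / tri_pixels;
        float shaded = triangles * std::max(tri_pixels, 4.0f); // at least one quad per triangle
        std::printf("%6.2f px/tri -> %8.0f tris -> ~%4.1fx pixel-shading work\n",
                    tri_pixels, triangles, shaded / mesh_pixels);
    }
    return 0;
}
```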

Only now, they're seemingly (through a corporate body completely separate to their own) finding a way to pass the blame.

So it's a toss-up of which is worse: closed-source graphics libraries, which are nothing new at all in this industry (and which even then is arguable, as developers can gain access to the source if they feel they need to) - or passing the buck for your own product's shortfalls.

Blergh.
 

I completely agree with what you just said. But some are turning things around and blaming Nvidia. I think they get out of bed in the morning just to do this. Can't be healthy.
 
Trust you to take a synthetic score at face value, more like. Not to mention, is that the only thing you took out of that...

wow.

Graph trumps, aw YEAH. My favourite pastime.


High stakes?

up to and / or more efficient. Better?
 