
Nvidia Gameworks at Gamescom 2015

Planning to and actually doing so are two different things. When I tested Crysis 2, which spawned this feature, there was no performance difference between AMD Optimized and 64x. At the end of the day, if there is no visual loss, then it's a gain in efficiency, not a negative.

I do not consider culling unseen geometry to be cheating; it's efficiency.

And that's the end of that.
Oh, I thought you had some actual proof. Silly me.
 
You don't need to use Libraries, the grass you see in my screenshot

Says you don't need to use libraries.

Uses middleware libraries to prove his point.
 
Oh, I thought you had some actual proof. Silly me.

My proof is more than anything you have; I tested all the factors.

If there is no visual difference and no performance benefit, then there would be no point in it being optimized. The fact that 32x, 16x and 8x showed performance benefits over 64x, while AMD Optimized did not, means there is no profile for Crysis 2.

Now that I have explained it, I will agree to disagree with your opinion of proof.
 
So you claim that one game has no profile, therefore no games do, even though AMD have explicitly stated that their intended use of the feature was per game tessellation profiles?

Not that I'm liable to take your word for it anyway, given your obvious bias.

Edit: Thinking about it, either AMD do have per game tessellation profiles to reduce the tessellation factor, or they have an option in their drivers which does absolutely nothing. Which is it?
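Purely to illustrate the distinction (a per-game profile versus a slider that does nothing): a driver-side tessellation override conceptually boils down to clamping whatever factor the game requests. The sketch below is hypothetical C++, not AMD's actual driver code; the game names, cap values and function names are all made up.

[CODE]
// Hypothetical sketch only: roughly what a driver-side tessellation override
// amounts to. Game names, cap values and function names are invented.
#include <algorithm>
#include <cstdio>
#include <string>
#include <unordered_map>

// "AMD Optimized" would imply a per-game table like this; the plain 8x/16x/32x
// slider is just a single global cap with no table at all.
static const std::unordered_map<std::string, float> kPerGameTessCaps = {
    {"SomeGame.exe",  16.0f},   // illustrative entry only
    {"OtherGame.exe", 32.0f},   // illustrative entry only
};

float ApplyTessOverride(const std::string& exeName, float requestedFactor)
{
    float cap = 64.0f;  // DX11 tessellation factors run from 1 to 64
    auto it = kPerGameTessCaps.find(exeName);
    if (it != kPerGameTessCaps.end())
        cap = it->second;  // per-game profile path
    // If no profile exists and the cap stays at 64, the override changes
    // nothing -- the "button that does nothing" scenario.
    return std::min(std::max(requestedFactor, 1.0f), cap);
}

int main()
{
    std::printf("SomeGame @64x -> %.0fx\n", ApplyTessOverride("SomeGame.exe", 64.0f));
    std::printf("Unknown  @64x -> %.0fx\n", ApplyTessOverride("Unknown.exe", 64.0f));
    return 0;
}
[/CODE]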
 
There is one thing that I will concede: these middleware tools have been around for ages, and yet few developers have bothered to pull their fingers out of their asses and use them.

Blender Bullet Physics



Those are some seriously impressive effects, and arguably better looking than PhysX. If devs are not using these open-source libraries but are flocking towards GameWorks, it can only mean one thing... they are getting paid to use GameWorks.
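For context on what 'using these open source libraries' actually involves, here is a minimal rigid-body scene using Bullet's C++ API (the physics engine behind the Blender demo linked above). It's only a sketch to show the scale of the integration work, not production code, and the build command is just one example.

[CODE]
// Minimal Bullet rigid-body demo: a sphere falling onto a static ground plane.
// Example build against the open-source Bullet SDK:
//   g++ demo.cpp -I/usr/include/bullet -lBulletDynamics -lBulletCollision -lLinearMath
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main()
{
    // Boilerplate world setup: collision config, dispatcher, broadphase, solver.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // Static ground plane at y = 0 (mass 0 means immovable).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundState(btTransform(btQuaternion::getIdentity(), btVector3(0, 0, 0)));
    btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(0, &groundState, &groundShape));
    world.addRigidBody(&ground);

    // Dynamic sphere dropped from 10 units up.
    btSphereShape ballShape(0.5f);
    btVector3 inertia(0, 0, 0);
    ballShape.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState ballState(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody ball(btRigidBody::btRigidBodyConstructionInfo(1.0f, &ballState, &ballShape, inertia));
    world.addRigidBody(&ball);

    // Step the simulation at 60 Hz and print the sphere's height.
    for (int i = 0; i < 120; ++i) {
        world.stepSimulation(1.0f / 60.0f);
        btTransform t;
        ball.getMotionState()->getWorldTransform(t);
        std::printf("t=%.2fs  y=%.2f\n", i / 60.0f, t.getOrigin().getY());
    }

    world.removeRigidBody(&ball);
    world.removeRigidBody(&ground);
    return 0;
}
[/CODE]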
 
So you claim that one game has no profile, therefore no games do, even though AMD have explicitly stated that their intended use of the feature was per game tessellation profiles?

Not that I'm liable to take your word for it anyway, given your obvious bias.

Well, seeing as you are just as biased, what's the point in talking at all? If there was ever a game to demonstrate AMD Optimized, it would have been the game that brought about the tessellation option in the first place, and they didn't bother to optimize it then. Nothing I have tested thus far has shown any benefit when choosing AMD Optimized over 64x, which makes the point moot. Even if they ever do, if there is no visual difference then again the point is moot; if it degrades quality, then that's what users care about. Until AMD Optimized shows a degradation in quality to improve performance, it's a non-issue, and I'm done discussing it.
 

Those are some seriously impressive effects, and arguably better looking than PhysX. If devs are not using these open-source libraries but are flocking towards GameWorks, it can only mean one thing... they are getting paid to use GameWorks.

Agree on all points.
 
Well, seeing as you are just as biased, what's the point in talking at all? If there was ever a game to demonstrate AMD Optimized, it would have been the game that brought about the tessellation option in the first place, and they didn't bother to optimize it then. Nothing I have tested thus far has shown any benefit when choosing AMD Optimized over 64x, which makes the point moot. Even if they ever do, if there is no visual difference then again the point is moot; if it degrades quality, then that's what users care about. Until AMD Optimized shows a degradation in quality to improve performance, it's a non-issue, and I'm done discussing it.

I quoted what AMD have said and intended to do; they have even gone as far as to put the option in their drivers. So either they do have per game tessellation settings that reduce the rate from what the developers intended, or they have a button that does nothing.

You have loaded up one game and gone, 'well it kinda looks the same bruv'.

Now, to go off on a tangent from all this: whatever happened to AMD's own GameWorks-style programme that they made a big song and dance about a while back?
 
I quoted what AMD have said and intended to do; they have even gone as far as to put the option in their drivers.

And no, I didn't just load it up and go 'well it kinda looks the same bruv'; don't even try to tell me what I did.

What you quoted is moot, as I read it years ago and was fully aware of their intentions. What they intended did not materialise, which is no different from the Vsync setting in CCC that now does nothing for DX games where it once did. The only one going off on a tangent is you, because it's all beside the point, and a moot one at that. You are screaming for proof, and I'm not interested in joining your tangent or carrying on with your merry-go-round. ;)
 
Finally watched the OP's video. It all looks pretty nice... but... too much fluffy hair. A bear's hair isn't that wavy; it's matted and slick. It doesn't look natural. I know this is new tech and it's showing it off, though. It just feels very much like 3D films that went out of their way to make the 3D in-your-face rather than keeping it natural, like it was in Avatar.

Anywho, the point humbug was trying to make, I'm sure, is that all this could be achieved without GameWorks. The difference is that GameWorks locks in proprietary tech, which is not necessarily a good thing, especially for competition. The PC market needs to be unified on GPU acceleration, not separated, making it more difficult for developers.

Really liked the physics demo and the lighting stuff is pretty cool.
 
I'm just fed up with the entire nVidia/AMD carp. I've found the whole Fury X launch a farce; my plan to replace my 2x 290s is now on indefinite hold. Every time I think about how much I paid for my 980 Tis, I swallow a bitter pill, and yes, I was one of many plagued by the God-awful driver issues with Witcher 3; I ended up playing it on my CrossFire rig, which worked fine once AMD crawled out of their hole and released some drivers for it.

GameWorks I find completely underwhelming for the most part. As someone so aptly put it earlier, it either doesn't look right or doesn't justify the performance hit; in fact, the only effect I have ever liked and thought looked decent was TressFX in Tomb Raider.

Bottom line - I'd burn my PC and dance naked around it if I could return in 12 months to find a competitive 3rd player in the GPU market because it desperately needs it.
 

Those are some seriously impressive effects, and arguably better looking than PhysX. If devs are not using these open-source libraries but are flocking towards GameWorks, it can only mean one thing... they are getting paid to use GameWorks.

Oh of course that is the only possible explanation. :rolleyes:
 
My proof is more than anything you have; I tested all the factors.

If there is no visual difference and no performance benefit, then there would be no point in it being optimized. The fact that 32x, 16x and 8x showed performance benefits over 64x, while AMD Optimized did not, means there is no profile for Crysis 2.

Now that I have explained it, I will agree to disagree with your opinion of proof.

Get over it. Nobody cares about increased tess factors. You can adjust them from the control panel now regardless. Crysis 2 was before GameWorks, meaning all tess factors were controlled entirely from source code written by Crytek.

Get...over it. lol

What may help you is to imagine putting all this speculative proof into a big box and labelling it "NVIDIA aren't bothered about AMD performance".
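On the point that Crysis 2's tessellation factors came entirely from Crytek's own code: one common D3D11 pattern is for the engine to feed its chosen factor to the hull shader through a constant buffer it owns. The snippet below is a hypothetical sketch of that application side (the struct and function names are invented), just to show where the number is decided before any driver override ever sees it.

[CODE]
// Hypothetical sketch of the engine-side half of DX11 tessellation:
// the application decides the factor and writes it into a constant buffer
// that its own hull shader reads. Struct and function names are invented.
#include <d3d11.h>
#include <cstring>

struct TessParams               // 16-byte aligned, as D3D11 constant buffers require
{
    float edgeTessFactor;       // e.g. 64.0f if the engine asks for the maximum
    float insideTessFactor;
    float padding[2];
};

// The buffer must have been created with D3D11_USAGE_DYNAMIC and CPU write access.
void SetTessellationFactor(ID3D11DeviceContext* ctx, ID3D11Buffer* cb, float factor)
{
    TessParams params = {};
    params.edgeTessFactor   = factor;   // value chosen in engine source code
    params.insideTessFactor = factor;

    D3D11_MAPPED_SUBRESOURCE mapped = {};
    if (SUCCEEDED(ctx->Map(cb, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
    {
        std::memcpy(mapped.pData, &params, sizeof(params));
        ctx->Unmap(cb, 0);
    }
    ctx->HSSetConstantBuffers(0, 1, &cb);   // bind to hull shader slot b0
}
[/CODE]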
 
There is one thing that I will concede: these middleware tools have been around for ages, and yet few developers have bothered to pull their fingers out of their asses and use them.

Blender Bullet Physics




Blender is built around Python and OpenGL, meaning all the bodies need to be exported and re-imported into DX11, which for physics simulations may be a great deal of work.


By all means google some more though.
 
And no, I didn't just load it up and go 'well it kinda looks the same bruv'; don't even try to tell me what I did.

What you quoted is moot, as I read it years ago and was fully aware of their intentions. What they intended did not materialise, which is no different from the Vsync setting in CCC that now does nothing for DX games where it once did. The only one going off on a tangent is you, because it's all beside the point, and a moot one at that. You are screaming for proof, and I'm not interested in joining your tangent or carrying on with your merry-go-round. ;)

Ah yes, the fingers in ears approach. Is anyone at all surprised?
 
I quoted what AMD have said and intended to do; they have even gone as far as to put the option in their drivers. So either they do have per game tessellation settings that reduce the rate from what the developers intended, or they have a button that does nothing.

You have loaded up one game and gone, 'well it kinda looks the same bruv'.

Now, to go off on a tangent from all this: whatever happened to AMD's own GameWorks-style programme that they made a big song and dance about a while back?

Maybe AMD wouldn't need to reduce tessellation settings if they had access to the source code for GameWorks? They could then optimize better at a low level rather than just telling the card to reduce tessellation levels.

Some fanboys are criticizing AMD for the tessellation option in CCC, which is absurd when they have no other choice due to Nvidia's policies.
 
I quoted what AMD have said and intended to do; they have even gone as far as to put the option in their drivers. So either they do have per game tessellation settings that reduce the rate from what the developers intended, or they have a button that does nothing.

You have loaded up one game and gone, 'well it kinda looks the same bruv'.

Now, to go off on a tangent from all this: whatever happened to AMD's own GameWorks-style programme that they made a big song and dance about a while back?

Fair point. I looked to see if AMD had made a retraction or said anything else, but found nothing. So are we to assume AMD are lowering tessellation via profiles to get better frame rates, or that AMD are not delivering on what they said they would do?
 
1) Get over it. Nobody cares about increased tess factors. You can adjust them from the control panel now regardless. Crysis 2 was before GameWorks, meaning all tess factors were controlled entirely from source code written by Crytek.

Get...over it. lol

2) What may help you is to imagine putting all this speculative proof into a big box and labelling it "NVIDIA aren't bothered about AMD performance".

1) Talk about a comprehension fail. Of course it can be controlled from the control panel, which I have already said many times, and we have all known that for years. There is nothing to get over, because I'm not the one complaining about the feature, and I knew everything you said already, so logic fail on your part.

2) I don't care, that is the thing. I'm not the one moaning about tess control; my testing years ago was for myself, and so far AMD Optimized has not done a damn thing, so I set it to 32x or 16x depending on the game and it's all good. :)
 
Maybe AMD wouldn't need to reduce tessellation settings if they had access to the source code for GameWorks? They could then optimize better at a low level rather than just telling the card to reduce tessellation levels.

Some fanboys are criticizing AMD for the tessellation option in CCC, which is absurd when they have no other choice due to Nvidia's policies.

Lol, having access to NVIDIA's libraries isn't as big a deal as AMD want you to think it is. Not even the developers have access, depending on the licencing agreement. It's AMD's pipeline that lets them down when it comes to hardware tessellation. This is hardly a secret.

What some 'fanboys' need to realise is that using too little tessellation is probably more detrimental than using too much. Sometimes there isn't enough tessellation factor available within DX11 to represent the displacement you want on a particular object, just as overuse can sometimes be wasteful.

The other thing about tessellation is that it's still entirely down to the developer where and how it is applied, even with GameWorks. Take static objects and geometry: it's not NVIDIA applying tessellation to any particular mesh, it's the developers. There is so much misinformation surrounding GameWorks, and frankly it's only because AMD are making such a big deal of it.

Not giving access to source is NOT something new or exclusive to this particular case. It is common practice, and quite frankly suggesting that a company should do so is tyrannical. But they (AMD) are the good guys in all this... apparently.
 