AMD announces GPUOpen - Open Sourced Gaming Development

Hail nVidia!!! Sorry, I've been watching Agents of S.H.I.E.L.D. I like the concept GameWorks offers, but I dislike that it's mostly optimised for one side and not the other, which hurts performance. I also understand Nvidia don't have to optimise for the opposition, since they're a rival company, but in the end it hurts gamers more. That's why I don't like it. At least there are options to turn GW effects off.
 
But there's often no alternative, so if you don't want to use a GameWorks effect you have to go without. That's no good. We want effects everyone can use with minimal impact, not a choice between using them at low performance or going without altogether.
 
That's what I'm getting at. I like the idea of GameWorks, but from what I've seen the effects they offer never come with minimal impact on either side of the fence, and AMD usually takes the bigger hit.
Only on odd occasions have I seen GameWorks offer something that added genuinely good effects; other times it was just meh for the performance cost. Take the god rays in Fallout 4: they took a real hit and offered hardly anything visually.

But if this is the way developers are going to go... using Nvidia GameWorks instead of developing their own effects or tech, I'd expect them to have more time to develop and optimise their games. That often hasn't been the case.
 
Oh no, he's in a huff because someone said something against Nvidia. Never saw it coming.

Still, regardless of how many effects are in there, GameWorks has been attached to a few titles that were pretty poor on release, whether you want to admit it or not. I'm not even saying they're all bad, but there are a few clear examples.

That's the kind of response I expect, tbh, when the truth is told.

GW doesn't produce titles, developers do. The only thing I will say going forward is that Nvidia have to choose their devs/sponsored games more carefully.
 
lol, the truth, from the lamb.

Yes they do, but they have to integrate the code around the way Nvidia has written their effects, and as many are aware, developers often want the effects in as quickly and easily as possible, so it usually goes in straight from Nvidia's hands, and that's where the poor performance comes from. I'm guessing the games where it's not poor are the ones where they've put more time into making sure it actually works.

Yeah, because all those awful devs like CD Projekt Red, Crystal Dynamics, Rocksteady etc. Such awful developers; keep trumpeting that truth, my friend. I don't mind you liking GameWorks, it's an okay bit of software, but it's been rife with controversy from the get-go, and blaming plenty of good devs, assuming it was entirely them while covering your eyes, is a bit childish.
 
For every one of those there's an Iron Galaxy, a Ubisoft, and a Studio Wildcard: ones where the games ran badly and were buggy even without GameWorks features enabled, or with GameWorks not in there at all.

It's only recently that the likes of Ubisoft have even been trying to do better on the PC side, and that's with GameWorks in there. (Didn't see much outcry over The Division beta from AMD users, except for the late hacking problem.)

There's no denying it's up to the developers to actually get out a properly working, well-optimised game, whether they use any tech from AMD or NVIDIA. They're the ones that have to implement it, after all.
 
I agree. I'm not slapping my chops and trying to cover my eyes like Lambchop, so I'm willing to admit there are some devs who could be to blame and some instances where it might not be Nvidia, but it seems a bit odd to see their involvement cause performance hits quite so often and never connect the dots.

Yes, but isn't the whole point of GameWorks that developers pay Nvidia to get these effects in a well-implemented form? Blaming the devs for Nvidia tech is a bit silly; Nvidia made it, and they had the responsibility to make it reliable and worthwhile for gamers. I acknowledge devs play a huge role in how it's integrated into the game, but when it's switching on the Nvidia effects that causes the biggest damage to performance (like The Witcher), then that sounds more like Nvidia failed, whereas CD Projekt Red made the rest of the game pretty damn fine. There are two sides to the coin, so I'm not really going to debate this again as it's all he-said-she-said, but let's be honest: they paid Nvidia to get working effects (no different to paying for Unreal Engine and getting a shoddy engine), so Nvidia have some hand in it too.
 
Oh, NV certainly do, but let's not forget that HairWorks 1.0, while tanking performance, did look great, especially on animals and monsters. As an NVIDIA user I saw only a 10-15fps drop in the heaviest cases, and it actually lined up really well with the performance hit TressFX 1.0 had at launch; sadly TressFX still hasn't been used on animals (I wish CD had managed it in the new Tomb Raider).

[Image: 0VXYeuU.png]

The over-reliance on tessellation was entirely on NVIDIA, though. That x64 factor was bonkers, and both Kepler and Hawaii GPUs folded at that amount. Only Maxwell and Fiji could manage it.

Thank goodness CDPR eventually managed to patch in a slider to adjust the amount, although that's something they really should have done before launch. So both parties there carry some blame, but NV has the majority.

Now, considering HBAO+, which quite a few people regarded as a performance hog, it seems NV have finally managed not only better visual fidelity with it but also a much smaller performance impact. In fact, the figures from Rise of the Tomb Raider show the 980 Ti losing more from it than the Fury X.

We don't know, however, whether it was simply the off-the-shelf solution implemented by NVIDIA, or whether CD licensed the source code and improved it themselves. It is worth noting, though, that the latest Tomb Raider patch specifically mentioned improvements to HBAO+, so CD could very well have access to the source code.
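A quick aside on the patched-in slider mentioned above: conceptually it is just a cap sitting between the player's setting and the tessellation factor the effect requests. The sketch below is a purely hypothetical C++ illustration of that idea; the HairSettings type, its field names, and the defaults are made up, and only the x8/x64 values and the x64 hardware ceiling come from the discussion here.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical engine-side clamp between a user "hair detail" slider and the
// tessellation factor handed to the GPU. Not CDPR's or NVIDIA's actual code.
struct HairSettings {
    float requestedTessFactor = 64.0f; // what the effect asks for by default
    float userCap             = 8.0f;  // what the player's slider allows
};

float effectiveTessFactor(const HairSettings& s) {
    // D3D11 tessellation tops out at a factor of 64, so clamp to [1, min(cap, 64)].
    const float hwMax = 64.0f;
    return std::clamp(s.requestedTessFactor, 1.0f, std::min(s.userCap, hwMax));
}

int main() {
    HairSettings s;
    const float caps[] = {8.0f, 16.0f, 64.0f};
    for (float cap : caps) {
        s.userCap = cap;
        std::printf("slider cap x%.0f -> effective factor x%.0f\n",
                    cap, effectiveTessFactor(s));
    }
}
```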
 
What I've wondered about with The Witcher 3 is this: if Nvidia set the tessellation to x64, how did CDPR suddenly manage to add a slider? If the limit was technical or contractual, how did they override it? If HairWorks always allowed a slider, then it's not Nvidia's fault it was at x64.
If the slider required a change by Nvidia, then it shows that Nvidia will react to dev requests.
If Nvidia set it to x64 and then changed their mind, that's quite odd on their part, and you have to wonder where the x64 figure was plucked from if they later decided there was reason to change it.
 
There certainly are a lot of questions, and sadly I doubt we'll ever have answers unless a developer actually comes out and talks about the process of it all.

It wasn't even that long after launch that CDPR started patching HairWorks settings into the game. Even the AA for it was adjustable by then, so if they could add all that, why didn't it go in before launch?

It could simply have been the deadline to get the game out; they had already delayed it once before. Plenty more performance patches came out afterwards as well.

[Image: 4wUEnRJ.png]
 
The more likely scenario is that Nvidia was forced to allow the inclusion of the HairWorks slider after it was discovered that capping tessellation at x8 in Catalyst Control Center improved performance a lot for AMD cards.

Prior to this, Nvidia had been claiming that Kepler cards and AMD cards simply could not handle the tessellation, but once the AMD tessellation 'fix' was found, Kepler owners demanded such a feature too, and CDPR added it to the game.
I somehow doubt we would have seen the feature had AMD not included a tessellation override in their drivers after the Crysis 2 water tessellation scandal. No one would have realised that HairWorks was using x64 tessellation, and Nvidia would have claimed that only Maxwell cards can handle HairWorks properly, forcing many to upgrade.
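For a rough sense of why capping the factor at x8 in the driver recovered so much performance while, as the next post notes, the image quality barely changes, here is a generic back-of-the-envelope sketch. It uses the plain triangle-patch case, where the number of output triangles grows roughly with the square of the tessellation factor; hair rendering uses the tessellator differently (strand amplification rather than surface triangles), so treat this purely as an illustration of how quickly the cost climbs.

```cpp
#include <cstdio>

// Rough approximation only: a triangle patch with uniform integer tessellation
// factor N emits on the order of N*N small triangles. Exact counts depend on
// the partitioning mode; the quadratic growth is the point.
long long approxTriangles(int factor) {
    return 1LL * factor * factor;
}

int main() {
    const int factors[] = {4, 8, 16, 64};
    for (int f : factors) {
        std::printf("factor x%-2d -> ~%lld triangles per patch (%.2fx the x8 amount)\n",
                    f, approxTriangles(f),
                    static_cast<double>(approxTriangles(f)) / approxTriangles(8));
    }
}
```

On those numbers, x64 pushes roughly 64 times the geometry of x8 through the pipeline, for a visual difference most comparison shots struggled to show.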
 
To be honest, there is not that much of an IQ difference between x8 and x64 tessellation. I think a few sites ran a piece on this, and the pictures showed that below x8 the degradation became obvious; otherwise x8 to x64 showed little difference.
 
It was CDPR that added the sliders, though, not NVIDIA. (Just like Nixxes are the ones improving HBAO+ in Rise of the Tomb Raider.) That's what makes the whole situation so curious. Also, Kepler and Hawaii simply are not as good at tessellation as Fiji and Maxwell; that's a fact.

With everything maxed out, a Fury X and a 980 Ti are essentially equal in the game, but the older cards will struggle.

The entire thing is strange because CDPR obviously knew about the x64 tessellation; they'd used the GTX 780 Ti for a long time in development and knew it hurt those cards as well.

Sadly they didn't have it all finished at launch, and none of us know why, although considering they had to delay the game once before, it was probably the launch deadline. Prior to the first delay they had to scrap their entire rendering engine and their in-game hair physics system; they even use a third-party solution for asset streaming.
So they were obviously just looking for something that worked for them. There were musings about getting TressFX in as a second option too, but that sadly never happened. I wonder if it was because getting animal/monster fur working with it would have involved too much extra dev work.

To be honest, there is not that much of an IQ difference between x8 and x64 tessellation. I think a few sites ran a piece on this, and the pictures showed that below x8 the degradation became obvious; otherwise x8 to x64 showed little difference.

Yup, x4 tessellation looked horrific :/

[Images: pB0AWMi.png, Dfue9Ja.png]
 
Nvidia also supports that, before we get any mudslinging lol, so it can only benefit all.

I guess you're right, but what this does do is help AMD on PC by having the same GCN GPU architecture on both platforms.

Nvidia don't use GCN, so their close-to-metal performance might be different.
For example, if game X is built on the PS4's GCN and optimised to run on that architecture, that same optimisation can be applied to desktop GCN GPUs...

PS4 - GCN - PC GCN - AMD
PS4 - GCN - PC ??? - Nvidia

Where does that put Nvidia? Remains to be seen; we're only just touching on all this low-level API stuff, but I can see next year being a very interesting one for PC gaming. :D
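To make the PS4-to-PC mapping above a bit more concrete, here is a small, entirely hypothetical C++ sketch of the kind of per-architecture dispatch an engine could use. The enum values, the mapping, and the assumption that the PS4's GPU sits closest to the GCN 1.1 desktop parts are illustrative guesses, not anything from AMD's or any developer's actual code; the point is just that "optimised for the PS4's GCN" doesn't automatically mean optimal for every desktop GPU.

```cpp
#include <cstdio>

// Hypothetical per-architecture code-path selection. Console-derived,
// close-to-metal tweaks get gated behind the GCN revisions they were tuned on;
// everything else falls back to a vendor-neutral path.
enum class GpuArch { GCN1_0, GCN1_1, GCN1_2, Maxwell, Unknown };

const char* choosePath(GpuArch arch) {
    switch (arch) {
        case GpuArch::GCN1_1: // assumed closest match to the PS4's GPU
        case GpuArch::GCN1_2:
            return "console-derived GCN fast path";
        case GpuArch::GCN1_0:
            return "generic GCN path (older revision, different behaviour)";
        default:
            return "vendor-neutral fallback path";
    }
}

int main() {
    const GpuArch archs[] = {GpuArch::GCN1_0, GpuArch::GCN1_1, GpuArch::Maxwell};
    const char* names[]   = {"GCN 1.0", "GCN 1.1", "Maxwell"};
    for (int i = 0; i < 3; ++i)
        std::printf("%s -> %s\n", names[i], choosePath(archs[i]));
}
```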
 
Well, if Mantle had issues with different variations of GCN, then I'd imagine that something targeting the hardware at an even lower level will be even more problematic. Machine-code-level optimisations that work on the PS4's specific GPU may not translate to the different versions of GCN in use on the PC.
 
It was only BF4 that had issues though; all the other Mantle titles were fine. I'm also fairly sure the BF4 issues were fixed in the end (not 100% sure though). Humbug can answer that; I remember he had a GPU that did have the memory issue.

Tbh this might be something completely different anyway.
 
BF4 ran, but not very well, on my Tahiti LE (GCN 1.0);
on the 290 it was perfect (GCN 1.1).

I still have this video.

 