Huddy on GameWorks round 3

What I don't understand is how this is all Nvidia's fault in the first place. Surely it is down to the developer and how they use GameWorks.
I mean, I can't imagine that the code for the game goes something like this:

Witcher 3

code
code
code
code
do gameworks (out pops a fully tessellated furry wolf)
code
code
code
do gameworks (out pops Geralt's over-tessellated hair)
code
code

so on and so forth.

Surely the developer has to tell GameWorks what to do and then it does it, rather than it magically knowing what the developer wants. Otherwise, how would it know in the above example to make a wolf the first time around and the guy's hair the second time?
And don't try to tell me that every time a developer uses GameWorks an Nvidia engineer comes round and writes the code for them, as that is ridiculous.
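For illustration, here is a minimal sketch of what a HairWorks-style integration might look like. Every type and function name below is invented for the example rather than taken from the real GameWorks API, but it shows the shape of the thing: the developer loads the assets, chooses the parameters (including the tessellation density) and calls the library from their own render loop.

// Illustrative sketch only: invented names, not the real GameWorks/HairWorks API.
#include <memory>

struct HairAsset { /* authored hair/fur asset (hypothetical) */ };

struct HairParams {
    float density    = 1.0f;   // strands per follicle
    int   tessFactor = 16;     // developer-chosen; could just as easily be 64
    float width      = 0.1f;
};

class HairLibrary {            // hypothetical middleware facade
public:
    std::unique_ptr<HairAsset> Load(const char* /*path*/) {
        return std::make_unique<HairAsset>();   // stub loader
    }
    void Render(const HairAsset& /*asset*/, const HairParams& /*params*/) {
        // the middleware issues its own draw calls here
    }
};

void RenderFrame(HairLibrary& hair, const HairAsset& wolfFur,
                 const HairAsset& geraltHair) {
    // ... game's own draw calls ...
    hair.Render(wolfFur,    HairParams{2.0f, 64, 0.05f});   // "out pops the wolf"
    // ... more game code ...
    hair.Render(geraltHair, HairParams{1.0f, 32, 0.08f});   // "out pops Geralt's hair"
}

So which assets get the heavy treatment, and at what tessellation factor, is ultimately in the integrating developer's hands, subject to whatever defaults the middleware ships with.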
 
Some also don't give a **** at all; just look at the vast majority of them. Look up The Slaughtering Grounds and its developer Digital Homicide for a prime example.

Also, one indie game I know of that is using GameWorks is Ark: Survival Evolved.

Ark runs like cr*p on most cards except maybe the Titan X and 980/980 Ti. All thanks to GameWorks, of course.
 
Some also don't give a **** at all; just look at the vast majority of them. Look up The Slaughtering Grounds and its developer Digital Homicide for a prime example.

Also, one indie game I know of that is using GameWorks is Ark: Survival Evolved.

That game ate a month of my life. Not a great example: it does run like p00p, and I wouldn't say it's a good-looking game either, just a super addictive one.
And... well... dinos.

lol, yeah, I was just poking fun at pCars; there's a huge mountain of terrible ones.
I've not played Slaughtering Grounds yet.
Super bad zombie game?
 
Ark runs like cr*p on most cards except maybe the Titan X and 980/980 Ti. All thanks to GameWorks, of course.

Care to show some proof rather than guesses or circumstantial evidence? Maybe it's more to do with the game still being very much in development. Most games are only fully optimised when they are content and tech locked, for obvious reasons.
 
Ark runs like cr*p on most cards except maybe the Titan X and 980/980 Ti. All thanks to GameWorks, of course.

Of course. :rolleyes:

Unfinished Indie game using CryEngine runs mediocre at best. Gotta be NVIDIA.
Devs have also been improving performance with each patch as development continues.

http://steamcommunity.com/app/346110/discussions/0/594820473977643250/
Hardcore GPU optimization pass 1: expected gains ~10% on high-end, ~20% on mid-range, ~30%+ on low-end.

It still runs better than MechWarrior Online, which uses the same engine. :p

That game ate a month of my life. Not a great example: it does run like p00p, and I wouldn't say it's a good-looking game either, just a super addictive one.
And... well... dinos.

lol, yeah, I was just poking fun at pCars; there's a huge mountain of terrible ones.
I've not played Slaughtering Grounds yet.
Super bad zombie game?

Oh man, it's horrific. The dev has put out five games in under a year and yet claims they care about quality and gameplay.

I can't link anything because of the swearing, but simply Google "The Slaughtering Grounds Jim Sterling".

There are so many devs out there like that.

I'll always be a firm believer that bad developers put out bad games: it's their code, their choice of middleware, and their use of said code and work.

Lest we forget "Ubisoft Keep on Digging", where even before GameWorks their games ran like tat, and they stated they don't optimise for PC because we can just buy a better graphics card.
 
^^^ Yep, most developers are pretty bad. Actually, it mostly comes down to the restrictions and constraints of the publisher, e.g. hitting Christmas/summer deadlines, and the general constraints that small developers suffer, e.g. lack of engineers/artists/time to optimize. The game industry as a whole is notorious for poor working conditions, overworked programmers, and impossible yet inflexible deadlines.

GameWorks has nothing to do with this and doesn't make games worse. If a developer doesn't have time to implement a feature like FaceWorks themselves, then GameWorks is a tempting solution that is bound to be better than what an overworked, understaffed indie could throw together.
 
They offer the best gaming experience, that's why. If AMD had their way, Nvidia would be as lazy as they are and we'd all lose, rather than just AMD owners.

Huddy claims that GameWorks is hurting Nvidia's older cards; well, that's just the price of progress, something AMD haven't made in about four years when it comes to their primary bugbear, tessellation. It's reached the point where they are now cheating in drivers to obtain satisfactory performance. The GTX 970/980 are a considerably better architecture than Kepler, despite what most people seem to think, so it's understandable that Kepler is struggling more in recent games.

The state of play is pretty much this:

- Developers develop games to the DirectX spec; sometimes they put in time optimising for different vendors' GPUs, sometimes they don't. Their responsibility is, and always has been, to make DirectX-compatible games, nothing more.

- Nvidia go out of their way to ensure the majority of games are 100% optimised for their GPUs, and some games even include middleware Nvidia has developed.

- AMD don't do any of that except on a tiny number of sponsored games or AAA titles (GTA V). Instead they sit inside their HQ portraying what Nvidia is doing as a bad thing, because they don't have the same level of commitment to their customers.

- AMD attempted to offload the responsibility of optimising for their GPUs onto developers by developing Mantle; developers who weren't paid to promote it were understandably cold towards it, because it just meant more work for them with zero benefit.

I don't get why you keep blaming AMD for not doing the developers' jobs for them... If a developer creates a game for a console, is it up to AMD or Nvidia to release updates so that it runs well? It's the developer's responsibility to ensure that it runs as well as it can on the hardware it's intended to run on. Most of the time it's not the drivers at fault at all but poorly programmed games, which AMD is not at fault for. They are not the ones who write the shoddy code in the first place, and there's only so much they can do from the driver side, being unable to change the hardware or the source code of the games.

GPU manufacturers should not have to release new drivers for every game. It's not ideal for either the manufacturer or the game developer, since driver optimisations may change the way the game is rendered, making it different from how the developer originally intended it. E.g. AMD could create a driver which essentially turns tessellation to low when it detects a HairWorks game being run. This would obviously increase performance with HairWorks on AMD cards, but then they would be accused of "cheating" despite that change being an optimisation.
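For illustration, a minimal sketch of what such a per-game override amounts to. The names below are invented; the real mechanism lives inside the driver, but the idea is simply that the driver recognises the running game and clamps the tessellation factor the application requested:

// Illustrative sketch, not actual AMD driver code: a per-game profile that
// clamps the requested tessellation factor before it reaches the hardware.
#include <algorithm>
#include <cassert>
#include <string>

struct DriverProfile {
    std::string exeName;   // e.g. "witcher3.exe" (hypothetical profile entry)
    int maxTessFactor;     // e.g. clamp to 16 even if the game asks for 64
};

int ApplyTessOverride(const DriverProfile& profile,
                      const std::string& runningExe, int requestedFactor) {
    if (runningExe == profile.exeName)
        return std::min(requestedFactor, profile.maxTessFactor);
    return requestedFactor;  // no profile match: pass the request through
}

int main() {
    const DriverProfile profile{"witcher3.exe", 16};
    assert(ApplyTessOverride(profile, "witcher3.exe", 64) == 16);   // clamped
    assert(ApplyTessOverride(profile, "othergame.exe", 64) == 64);  // untouched
}

Whether you call that clamp a cheat or an optimisation is exactly the argument here.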

If you're slating AMD for creating Mantle, which allows developers lower-level access, then what do you think of DX12 or Vulkan? They are essentially the same idea as Mantle but more polished. With lower-level APIs, the blame will fall directly on developers if games are poorly coded, as it should. The developers and publishers have no excuse for poor, half-assed and lazy programming when they charge extortionate prices for games.
 
GPU manufacturers should not have to release new drivers for every game. It's not ideal for either the manufacturer or the game developer, since driver optimisations may change the way the game is rendered, making it different from how the developer originally intended it. E.g. AMD could create a driver which essentially turns tessellation to low when it detects a HairWorks game being run. This would obviously increase performance with HairWorks on AMD cards, but then they would be accused of "cheating" despite that change being an optimisation.

While I agree they shouldn't need to release a driver for every game, for some they should, especially for Crossfire profiles, which are sometimes horrendously late, and when they know their cards have issues well before launch day.

As was the case with Project Cars, where they simply stopped responding to the developers and didn't work with them until after the game came out.

Also, reducing tessellation at the driver level is not an optimisation. Otherwise you could call lowering graphical settings in a game "optimising" the game. It's not accurate.

What they should do, and should have done, is actually work with the developers when the devs tried to reach out to them, especially when the developers find they're having issues with the hardware running their game and they've done as much as they can on their own.

I'm sure that if AMD had actually responded to the devs and worked with them beforehand, the game wouldn't have had such issues at launch.
 
Lowering tessellation levels is an optimisation if they are set higher than required by default, giving no visual quality enhancement.

It is not that AMD's tessellator is sub-par, just that Nvidia's is stronger. But if 64x gives no visual improvement over 16x-32x, then it is a waste of GPU resources, and in some circumstances it is used to the detriment of older GPUs and competitors' GPUs.
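To put rough numbers on that: for a D3D11 triangle patch with uniform integer factors, output geometry grows roughly with the square of the tessellation factor, so 64x costs on the order of 16 times the triangles of 16x, often as sub-pixel triangles that add nothing visually. A back-of-envelope sketch (my arithmetic for illustration, not benchmark data):

// Rough arithmetic: ~F^2 output triangles per triangle patch at integer factor F.
#include <cstdio>
#include <initializer_list>

int main() {
    for (int factor : {8, 16, 32, 64}) {
        int tris = factor * factor;  // approximate triangles per patch
        std::printf("factor %2dx -> ~%4d triangles per patch\n", factor, tris);
    }
}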
 
That is because no-one can optimize away AMD's poor tessellation performance. That is a hardware issue that AMD will need to resolve in the future.



Source code makes absolutely no difference. For the vast majority of games, Nvidia and AMD don't get any source code, full stop, and they don't need it. Game developers will send a pre-release binary to Nvidia/AMD, who can reverse-engineer the draw calls because, guess what, every single draw call emanating from the game has to go through the friggin' driver that Nvidia/AMD wrote. It makes absolutely no difference whether that draw call was the result of GameWorks, TressFX or something the developer wrote.
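A minimal sketch of that boundary, with invented types rather than any real driver interface, just to make the point concrete:

// Whether a draw call originates from the engine's own code, GameWorks or
// TressFX, it reaches the vendor's driver as the same kind of submission.
struct DrawCall {
    unsigned indexCount;  // geometry size
    unsigned shaderId;    // drivers pattern-match shaders/state at this level
};

struct Driver {  // stand-in for the vendor-written D3D driver
    void Submit(const DrawCall&) {
        // Vendors capture traces of exactly this stream from pre-release
        // binaries and tune shader compilation and scheduling against it.
    }
};

void MiddlewareHair(Driver& d) { d.Submit({120000, 42}); }  // e.g. a HairWorks pass
void EngineTerrain(Driver& d)  { d.Submit({ 30000,  7}); }  // the engine's own pass

The driver cannot tell, and does not care, which of those two callers wrote the code that produced the submission.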



If AMD are too incompetent to optimize their drivers without seeing source code, then they simply should not be in the business of selling GPUs, and they should do the honorable thing and hand over the reins to a company that has more than a brain-dead capuchin monkey at a keyboard.
The fact is, that is pure lies: AMD are very definitely capable of optimizing GameWorks titles without the GameWorks source code, and they can do that long before the game is ever launched by working with the game developer. But here is the thing: AMD don't. They simply don't spend the resources and don't manage developer relationships to anywhere near the extent that Nvidia does. Instead they find it much cheaper to shove Huddy on stage to spout PR nonsense that the AMD diehards can regurgitate on forums.

As you seem to believe it's your sacred duty to reply to every post in the GPU forum with an all-knowing attitude, you should already know that AMD have been working on improving their architecture's tessellation performance:

[Image: tessellation benchmark chart (IxApVVl.png) comparing Hawaii and Fiji]


Here we can see concrete evidence that AMD made a good improvement from Hawaii to Fiji - that's a solid 25% improvement in one generation.

I expect their next architecture to match NVIDIA's tessellation performance.
 
Of course I know AMD have improved their tessellation performance; I have pointed that out multiple times, and that is exactly why The Witcher 3 shows such a good performance increase when HairWorks is switched on.

That just validates the fact that it was an issue with AMD drivers and hardware that made the Hawaii cards underperform in TW3, and not some deliberate sabotage from Nvidia, as AMD tried to make out.

Benching The Witcher 3 with HairWorks on, a Fury X against a 290X, is proof that AMD just gush verbal diarrhea in their marketing spin.
 
Care to show some proof rather than guesses or circumstantial evidence? Maybe it's more to do with the game still being very much in development. Most games are only fully optimised when they are content and tech locked, for obvious reasons.


Apparently Ark: Survival Evolved was developed on a 980, so here is the first video I found of a 980 running at medium, just over a month old:

https://www.youtube.com/watch?v=ddgKFMiMbns

About 30-45 fps, so unless you have a higher-end system you'll be looking at 30 fps or less on low/medium settings. It will no doubt improve, but it sure doesn't look like the trailer just yet unless you have SLI Titan Xs or 980s.


A newer video showing a Titan X running at various detail levels:
https://www.youtube.com/watch?v=RA26IFO5CYs


It only gets 25 fps on Epic and 35 fps on High.
What the hell does this game need to run properly? :eek:
 
Quite true, actually. As horrible as it seems, I can see Nvidia being the only GPU manufacturer in the future.

AMD need a card out at the same time as Nvidia release their top card that blasts it off its feet. Not matches it, but tramples it to death, then goes back the next day and takes a pee on it.

AMD need a solid winner if they are to succeed.
It is starting to look that way. People are not only becoming fanboys but blind fanboys who don't even realise the industry is headed off a cliff. It's one thing to buy Nvidia when there is healthy competition, but the old slogan of voting with your wallet seems to be ignored when we're at risk of Nvidia being the only option. Nvidia have already spiked prices as far as they could (hell, the Titan X was an example of people willing to pay over £200 more than what AMD were willing to sell a closely performing card for, and it forced Nvidia to release the 980 Ti just to compete), and people were lapping them up at that price. Fury comes out and we get a bunch of garbage talk from fanboys saying it's not worth the money every chance they get, and saying AMD need to shut up and outperform Nvidia.

To be fair, they nearly did, and would have if Nvidia hadn't been forced to chop down their card and drastically drop the price. If we lose AMD we'll see those prices creep up another £200-£300 each time we want that kind of performance. If people are that blind to the industry, and want to cry every time AMD's cards are 2 or 3% below Nvidia's, then I'm sure they'll enjoy what happens when Nvidia have the monopoly. Tools. I keep hearing the same rot from the same people too: 'AMD just need to shut up and do the same as Nvidia'. Sure, a company with 15% market share can just compete in every single way with a company with 85%. We're not talking rot or divorcing ourselves from reality or anything, derp.
 
While I agree they shouldn't need to release a driver for every game, for some they should, especially for Crossfire profiles, which are sometimes horrendously late, and when they know their cards have issues well before launch day.

As was the case with Project Cars, where they simply stopped responding to the developers and didn't work with them until after the game came out.

Also, reducing tessellation at the driver level is not an optimisation. Otherwise you could call lowering graphical settings in a game "optimising" the game. It's not accurate.

What they should do, and should have done, is actually work with the developers when the devs tried to reach out to them, especially when the developers find they're having issues with the hardware running their game and they've done as much as they can on their own.

I'm sure that if AMD had actually responded to the devs and worked with them beforehand, the game wouldn't have had such issues at launch.

It depends. If it impedes graphical quality whilst boosting performance, then perhaps some would not consider it a good optimisation. But with HairWorks, where there's good evidence that the level of tessellation is too high and does not provide a visible improvement over a lower setting, whilst negatively impacting performance, it would be considered an optimisation. Changing game settings at the driver level is what most driver optimisations are about these days.

This is an article I read a few months ago which provides some decent insight into the work GPU driver devs do in creating driver releases for games: http://www.dsogaming.com/news/ex-nv...-every-triple-a-games-ship-broken-multi-gpus/

I'm sure that AMD would try to work with game developers as much as they could, especially on AAA titles, but there's only so much they can do if they cannot work on the source code itself. What's most likely happening is that developers do not properly use the resources they are given by AMD or Nvidia to aid them in developing games, such as documentation. I bet a lot of developers don't even properly read the developer guides and documentation created by AMD and NV.
 
What I don't understand is how this is all Nvidia's fault in the first place. Surely it is down to the developer and how they use GameWorks.
I mean, I can't imagine that the code for the game goes something like this:

Witcher 3

code
code
code
code
do gameworks (out pops a fully tessellated furry wolf)
code
code
code
do gameworks (out pops Geralt's over-tessellated hair)
code
code

so on and so forth.

Surely the developer has to tell GameWorks what to do and then it does it, rather than it magically knowing what the developer wants. Otherwise, how would it know in the above example to make a wolf the first time around and the guy's hair the second time?
And don't try to tell me that every time a developer uses GameWorks an Nvidia engineer comes round and writes the code for them, as that is ridiculous.

It's up to the developer how the code is implemented and optimised, yes.
 
This is an article I read a few months ago which provides some decent insight into the work GPU driver devs do in creating driver releases for games: http://www.dsogaming.com/news/ex-nv...-every-triple-a-games-ship-broken-multi-gpus/

I'm sure that AMD would try to work with game developers as much as they could, especially on AAA titles, but there's only so much they can do if they cannot work on the source code itself. What's most likely happening is that developers do not properly use the resources they are given by AMD or Nvidia to aid them in developing games, such as documentation. I bet a lot of developers don't even properly read the developer guides and documentation created by AMD and NV.

The main issue with AMD is that the developers have reached out to them, several times, over a period of years according to the Project Cars devs. AMD just stopped responding to them. So when the developer needed help to get the game running well on AMD, it was AMD that refused to help.
When the developer has shown they tried to fix as much as possible and reached out to share the game code, there's not much else they can do.

Most of the time it's all down to the developer, and I will always agree with that. Then there are these rare cases where the dev has been trying to get AMD to help, or at least look at their code to see why they can't get optimal performance, and AMD just ignored them. It's rather nasty behaviour to do that and then go on a rant about how the developer refused to work with them, and then also blame their rival (NVIDIA), when it was their own fault.

Then there are people here actively blaming GameWorks for a game's performance issues when the game is still a year away from release, such as Ark: Survival Evolved. Yet only after the game launched on Early Access did the developers announce they were going to incorporate GameWorks.

The timeline: the game launches on 2 June 2015 in Early Access, in alpha stage. The game runs badly on a new engine (that's also not finished) and is in early development.

3rd June: the developers announce they'll be incorporating GameWorks features.

Cue people blaming bad performance on NVIDIA's features, entirely ignoring that the game isn't even finished and is a year away from release. Each update has brought performance improvements along with general game improvements and further development. They're even actively working to have the entire game run on DX12 for launch.
How can people here honestly blame GameWorks?
 
Apparently Ark: Survival Evolved was developed on a 980, so here is the first video I found of a 980 running at medium, just over a month old:

https://www.youtube.com/watch?v=ddgKFMiMbns

About 30-45 fps, so unless you have a higher-end system you'll be looking at 30 fps or less on low/medium settings. It will no doubt improve, but it sure doesn't look like the trailer just yet unless you have SLI Titan Xs or 980s.


A newer video showing a Titan X running at various detail levels:
https://www.youtube.com/watch?v=RA26IFO5CYs


It only gets 25 fps on Epic and 35 fps on High.
What the hell does this game need to run properly? :eek:



And that has nothing to do with what KillBoY_UK asked for!

Where is the evidence that GameWorks has anything to do with those results? Do you have a comparison that runs identical graphics but uses an alternative middleware to GameWorks?
 
Lowering tessellation levels is an optimisation if they are set higher than required by default, giving no visual quality enhancement.

It is not that AMD's tessellator is sub-par, just that Nvidia's is stronger. But if 64x gives no visual improvement over 16x-32x, then it is a waste of GPU resources, and in some circumstances it is used to the detriment of older GPUs and competitors' GPUs.

Lowering tessellation isn't optimising, in the same way that lowering graphics detail isn't optimising. Optimising is making it run better, not making it do less.
How do we know if AMD's is sub-par or not? Surely it depends on what you consider par. If Nvidia's implementation is considered par, then AMD's is sub-par; if AMD's is considered par, then it's not. If you average them, then it would seem that it is...

As you seem to believe it's your sacred duty to reply to every post in the GPU forum with an all-knowing attitude, you should already know that AMD have been working on improving their architecture's tessellation performance:

[Image: tessellation benchmark chart (IxApVVl.png) comparing Hawaii and Fiji]


Here we can see concrete evidence that AMD made a good improvement from Hawaii to Fiji - that's a solid 25% improvement in one generation.

I expect their next architecture to match NVIDIA's tessellation performance.

The Fury X's tessellation performance is 132% of the 290X's, which is good.
The 980 Ti's is 133% of the 780 Ti's, which is slightly better.

So why do you think that with their next architecture AMD will make a bigger improvement than Nvidia?
Maybe AMD's next architecture will match Nvidia's current one?
 
And that has nothing to do with what KillBoY_UK asked for!

Where is the evidence that GameWorks has anything to do with those results? Do you have a comparison that runs identical graphics but uses an alternative middleware to GameWorks?

Well actually, using D.P's logic, we can see that since they added GameWorks after the original Early Access launch of June 2nd, performance has increased with major updates.

As such, we can see that GameWorks actively increases game performance on all three OSes (Windows, OS X and Linux) and on both AMD and NVIDIA hardware. /s :rolleyes:

It has NOTHING to do with the developers actively developing the game on an incomplete engine (Unreal 4), while also now working on a DirectX 12 version in time for the game's launch in summer 2016.


This is like the recent outrage over the new Hitman game's alpha running poorly on a Titan X, which made the news, even though that's a Square Enix game and so will likely be an AMD Gaming Evolved title. Quick! Everyone grab the torches and blame AMD!
 