
Does Gameworks kill performance? Let's find out... as a community

Soldato | Joined: 25 Jun 2011 | Posts: 16,797 | Location: Aberdeen
There's been lots and lots of talk on the forums lately about Gameworks: does it hammer performance? Is it worth it? Lots of talk, very little substance.

So let's, peacefully, find out. Between the lot of us we should have most games and the majority of hardware to do such a thing.

Here is a rough list of anything and everything featuring Gameworks

So what do we need to do?

Run a benchmark, or a set sequence in the absence of a benchmark, both with and without Gameworks (include older 'PhysX'-branded titles if possible). Calculate the percentage frame drop and post your results. Please state what card and resolution you're using, along with settings.

This isn't a competition, nor does it really require much in the way of screen shots, though it may help in some instances.

Any baiting, bitching, moaning, derailing etc. will be RTM'd; go and do that in the other thousand threads in this forum.

Example of what we're looking for:

Batman Arkham Knight
1440p
980Ti
Highest settings available
Benchmark mode
Gameworks available: NVIDIA Clothing, Destruction, Turbulence, Volumetric Lighting, PhysX (you can find these easily enough in the above link!)

No GW: 99 FPS
GW Enabled: 66 FPS

Cost: 33%
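For anyone unsure how that cost figure is derived, it's just the relative drop in average FPS. A quick sketch (the helper name is mine, not something from the thread):

```python
def gameworks_cost(fps_off: float, fps_on: float) -> float:
    """Percentage frame-rate drop with GameWorks enabled vs disabled."""
    return (fps_off - fps_on) / fps_off * 100

# Arkham Knight example above: 99 FPS off, 66 FPS on
print(f"Cost: {gameworks_cost(99, 66):.0f}%")  # Cost: 33%
```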

I'll then link results per title so we can get an idea of what is/isn't hammering performance.

GO!

Batman Arkham Knight

980Ti -13% Smoke disabled
980ti -29% fully enabled
980Ti -19% Smoke disabled
980Ti -34% Fully enabled
Titan X -33% fully enabled
 
Isn't it obvious that using fancy effects like GW will give you less performance? It's a trade-off: you either use them, use them at lower settings, use some of them, or not at all.

The problem I read about is that GW is supposed to break games, lol.
 
4790K @ 4.80GHZ + 980 Ti @ 1500/2000


Batman Arkham Knight 1440P Max Settings


All Gameworks Settings On

Min: 43
Max: 109
Avg: 83


Gameworks With Smoke Off

Min: 67
Max: 143
Avg: 101


Gameworks Off

Min: 90
Max: 168
Avg: 117
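Putting those averages into the OP's percentage-cost format (plain arithmetic on the numbers above, nothing assumed):

```python
# Averages from the 980 Ti @ 1500/2000 run above
avg_off, avg_smoke_off, avg_on = 117, 101, 83

cost = lambda fps: (avg_off - fps) / avg_off * 100
print(f"Smoke off: -{cost(avg_smoke_off):.1f}%")  # Smoke off: -13.7%
print(f"Fully on:  -{cost(avg_on):.1f}%")         # Fully on:  -29.1%
```

Which lines up with the -13% / -29% 980 Ti figures collated in the OP.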
 
The problem with Gameworks for me is that the effects it enables are massively overdone, to the point of lunacy.

Initially W3 had massively over-tessellated x64 HairWorks that used 8x MSAA. It was not until quite a few patches later that CD Projekt Red enabled the option to reduce the tessellation and MSAA on HairWorks. Nvidia are happy to gimp performance even on their own hardware, as long as it makes AMD's look even worse. I played through W3 using a GTX 980 and then a 980 Ti, and found Gameworks added very little for a massive FPS loss.

https://forums.geforce.com/default/...l-reason-why-witcher-3-hairworks-reduces-fps/
 
Isn't it obvious that using fancy effects like GW will give you less performance? It's a trade-off: you either use them, use them at lower settings, use some of them, or not at all.

The problem I read about is that GW is supposed to break games, lol.

I hear what you're saying; I expect GameWorks effects to lower performance. My issue is that most of the GameWorks effects out there add nothing to what we already have in terms of IQ and performance.

GodRays with tessellation, WTH?? TXAA is just another blur-fest, HBAO+ vs HDAO...

I could really go on and on, but I won't.
What I would like to say is: nice idea for the thread, but would it not be better to compare each feature against an alternative? So, let's say, GodRays vs standard godrays performance, plus a screenshot of the two?

Just an idea
 
5930K @ 4.40GHZ + 980 Ti @ 1367/1774

Batman Arkham Knight 1440P Max Settings

All Gameworks Settings On

Min: 37
Max: 100
Avg: 69

Gameworks With Smoke Off

Min: 46
Max: 124
Avg: 88

Gameworks Off

Min: 56
Max: 144
Avg: 105
 
I could really go on and on, but I won't.
What I would like to say is: nice idea for the thread, but would it not be better to compare each feature against an alternative? So, let's say, GodRays vs standard godrays performance, plus a screenshot of the two?

Just an idea

I know what you're saying, but this is about performance numbers, not image quality. Sure, do some tests and mention whether you think those particular effects are worth the performance hit or not. Plus, many of the effects only show their worth in motion, so screen grabs wouldn't show the whole picture (no pun intended).
 
I hear what you're saying; I expect GameWorks effects to lower performance. My issue is that most of the GameWorks effects out there add nothing to what we already have in terms of IQ and performance.

GodRays with tessellation, WTH?? TXAA is just another blur-fest, HBAO+ vs HDAO...

I could really go on and on, but I won't.
What I would like to say is: nice idea for the thread, but would it not be better to compare each feature against an alternative? So, let's say, GodRays vs standard godrays performance, plus a screenshot of the two?

Just an idea

Why does it not surprise me that Nvidia somehow shoehorned tessellation into godrays? Why in the hell would a light source need to be tessellated anyway? And from the pics it looks no different on or off, so what's the point? Throwing in more geometry just to lower performance?
 
The problem is that for a really meaningful comparison you need to compare a feature implemented via GameWorks with the same feature implemented by someone other than nVidia (on the assumption that the developers are half competent). If an equivalent feature implemented by someone else runs significantly faster, it would raise questions.

Unfortunately this is quite a hard thing to do, short of finding someone with the time to build tech demos using both from the ground up.
 
Gameworks is Nvidia's 'buy another card' SLI marketing strategy. Obviously it will cripple performance: it's designed around SLI as a minimum, even though in some games one high-end GPU is enough.

Since 2011 and Batman: Arkham City (which nowadays runs fully maxed out easily on most single GPUs), this seems to be Nvidia's strategy: add Gameworks and force high-end gamers to either buy another GPU from them or wait a few years until current single GPUs are fast enough.
 
Is it any different to when we had Crysis crippling rigs? Were they also trying to force you to upgrade?

Not a fair comparison, really.

Crysis actually looked perceivably MUCH better than anything else in 2007, though; you could see why you needed the extra power. These effects make little difference for the FPS drop you get.
 
There's been lots and lots of talk on the forums lately about Gameworks: does it hammer performance? Is it worth it? Lots of talk, very little substance.

So let's, peacefully, find out. Between the lot of us we should have most games and the majority of hardware to do such a thing.

Here is a rough list of anything and everything featuring Gameworks

So what do we need to do?

Run a benchmark, or a set sequence in the absence of a benchmark, both with and without Gameworks (include older 'PhysX'-branded titles if possible). Calculate the percentage frame drop and post your results. Please state what card and resolution you're using, along with settings.

This isn't a competition, nor does it really require much in the way of screen shots, though it may help in some instances.

Any baiting, bitching, moaning, derailing etc. will be RTM'd; go and do that in the other thousand threads in this forum.



Your results are completely flawed, because the only way to make a fair comparison is to show the same game running the EXACT same effects, but coded a different way using a different third-party library or a proprietary solution.

Of course Gameworks effects have a performance cost; by definition they are complex effects that developers may not have the ability to create themselves. You can't get those effects for free.

I suggest you close the thread and come back when you can make an actual comparison.
 
The problem with Gameworks for me is that the effects it enables are massively overdone, to the point of lunacy.

Initially W3 had massively over-tessellated x64 HairWorks that used 8x MSAA. It was not until quite a few patches later that CD Projekt Red enabled the option to reduce the tessellation and MSAA on HairWorks. Nvidia are happy to gimp performance even on their own hardware, as long as it makes AMD's look even worse. I played through W3 using a GTX 980 and then a 980 Ti, and found Gameworks added very little for a massive FPS loss.

https://forums.geforce.com/default/...l-reason-why-witcher-3-hairworks-reduces-fps/


Why has W3 HairWorks tessellation got anything to do with Nvidia? It is the developer that failed to set a sensible tessellation factor.
 