How much performance does TressFX cost?

4K

TressFX off
(screenshot: KOGszMZh.jpg)

TressFX on
(screenshot: gDgL9Tah.jpg)

Performance hit: 24.22%
 
GTX 980 Ti @ 1026/1753MHz (boost 1114MHz)
i5 3570K @ 4.1GHz
355.98 driver
1200p

Average fps with TressFX on: 128.2
(screenshot: 33tm728.png)

Average fps with TressFX off: 200.1
(screenshot: 552lwj.png)

Performance hit: 35.93%
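For anyone double-checking the numbers, the percentage hit is just the relative drop between the two averages; a minimal Python sketch (the function name is my own):

```python
def performance_hit(fps_off: float, fps_on: float) -> float:
    """Percentage of frame rate lost by enabling the effect."""
    return (fps_off - fps_on) / fps_off * 100

# Averages from this run: 200.1 fps with TressFX off, 128.2 fps with it on.
print(f"{performance_hit(200.1, 128.2):.2f}%")  # 35.93%
```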
 
Good thread. The conclusion is that TressFX causes a similar performance hit for AMD and Nvidia, whereas GameWorks causes a much more drastic hit for AMD cards versus Nvidia cards.
 
More importantly, it was a sensible addition to a game with the performance to spare. Another thing to note is that the performance hit gets smaller the higher the resolution, and the reason is the time taken to render each frame. AMD suggested, I think, that TressFX costs around 3-4ms. If you're at 120fps at a lower resolution, you're rendering a frame every 8.3ms, so adding 3-4ms is a huge proportion. At 60fps, which is usually what you get at higher resolutions, frames are roughly 16.7ms apart, so the same 3-4ms hit gives a smaller effective performance loss.
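To put rough numbers on that, here's a back-of-the-envelope sketch in Python (the 3-4ms figure is the one quoted above, not something I've measured, and the helper name is mine):

```python
def fps_with_added_cost(base_fps: float, added_ms: float) -> float:
    """Frame rate after adding a fixed per-frame cost in milliseconds."""
    base_frame_ms = 1000.0 / base_fps
    return 1000.0 / (base_frame_ms + added_ms)

# Assume a ~3.5ms TressFX-style cost at two different starting frame rates.
for base_fps in (120.0, 60.0):
    new_fps = fps_with_added_cost(base_fps, 3.5)
    loss_pct = (base_fps - new_fps) / base_fps * 100
    print(f"{base_fps:.0f} fps -> {new_fps:.1f} fps ({loss_pct:.0f}% hit)")
```

The same fixed cost works out to roughly a 30% loss at 120fps but only about 17% at 60fps, which is why the percentage shrinks as the resolution rises.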

This is really the key with most 'add-on' effects: you have X time taken for the main game render, and then, depending on how much time you have spare between X and a sensible frame target like 60/90fps, you can add effects up to that point without appearing to lose performance.

Say you are targeting 60fps (16.67ms frame times) and your main game render completes on average in 10ms; you could then add three effects taking 1ms each and another taking 3ms without breaking that 60fps target.
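As a quick sanity check of that budgeting, a minimal Python sketch using only the numbers from the example above:

```python
FRAME_BUDGET_MS = 1000.0 / 60            # 16.67ms target for 60fps
main_render_ms = 10.0                    # average main render time from the example
effect_costs_ms = [1.0, 1.0, 1.0, 3.0]   # three 1ms effects plus one 3ms effect

total_ms = main_render_ms + sum(effect_costs_ms)
within = total_ms <= FRAME_BUDGET_MS
print(f"Total frame time: {total_ms:.1f}ms "
      f"({'within' if within else 'over'} the 60fps budget)")
```

The total comes to 16ms, which still fits inside the 16.67ms budget.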

One of the biggest issues with DX11 is random slowdowns and the unpredictable nature of when certain things will occur. With DX12, and on consoles, you get a far more predictable time per frame and can add things up to your frame target much, much more accurately.

Also worth noting is that TressFX 2.0 cut the frame-time cost by half or more, IIRC from around 3ms to 1ms on a 280X, and it's TressFX 3.0 being used in Deus Ex.

The highest quality level in Tomb Raider was aimed at high-end cards and still gave really strong performance when used, which is what it managed on the cards of the time, and it didn't penalise Nvidia in any way.

God rays in Fallout are ridiculous in their performance hit, punish AMD more by the looks of it, and provide an awful gameplay experience for the cost. Devs adding performance options that result in an unplayable experience is rather silly; making proper use of the hardware is what's good.

The biggest issue, as I see it, with god rays in Fallout is that Fallout itself looks terrible overall. A lot of the textures are simply embarrassing. Good lighting can improve how textures look, so good-enough textures can work well with great lighting, but terrible textures still look terrible even with great lighting.

TressFX also looked really very good, whereas these god rays don't offer anything above and beyond what god rays in other games have done without using Nvidia's GameWorks.
 
More importantly, it was a sensible addition to a game with the performance to spare. Another thing to note is that the performance hit gets smaller the higher the resolution, and the reason is the time taken to render each frame. AMD suggested, I think, that TressFX costs around 3-4ms. If you're at 120fps at a lower resolution, you're rendering a frame every 8.3ms, so adding 3-4ms is a huge proportion. At 60fps, which is usually what you get at higher resolutions, frames are roughly 16.7ms apart, so the same 3-4ms hit gives a smaller effective performance loss.

You really know how to dig a hole for yourself.

The following is not for the scoreboard, just for DM, as it uses SLI.

4x Titan X SC @ 1316/1752MHz (stock)
5960X @ 4.0GHz
358.87 drivers
2160p

TressFX off
(screenshot: oCvlR6s.jpg)

TressFX on
(screenshot: A7QJ5he.jpg)

Performance hit: 27.23%

@DM, as you can see, not only are the fps higher at 2160p using 4 TXs, but the performance hit is still smaller than the 1080p result.
 
I like that GodRays are evil because they cause a 10% decrease in framerate, yet TressFX is the poster boy for third-party libraries and it causes a 30%+ decrease in framerate. But that's OK, because we're running it on current hardware and the game it's used in isn't demanding...
The fact that the game it's used in isn't demanding doesn't make a 30%+ hit good. Let's pick a more demanding TressFX game and see if it's still 30%+.
Does Tomb Raider have a sliding scale for TressFX effects or is it just on or off?

Definitely interesting results, and not what I'd been led to believe about TressFX having a small performance hit. It's good that it's even across hardware, though. But negatively affecting both vendors' performance was considered bad when HairWorks did it in The Witcher 3...

Out of interest, what was the performance hit of HairWorks in The Witcher 3? Did anyone test that?
 

For all your many words, you miss the fact that Nvidia absolutely dominate in this game. However, if Nvidia had sponsored it, there would have been mass accusations of "GimpWorks", of Nvidia purposefully hurting AMD, and so on, yet we can see that the performance hit is much the same as what GameWorks gives in Batman. I'm sure if we did tests in other GameWorks games it would be a repeating story, but don't let common sense spoil a good conspiracy.
 
You really know how to dig a hole for yourself.

The following is not for the scoreboard, just for DM, as it uses SLI.

Performance hit: 27.23%

@DM, as you can see, not only are the fps higher at 2160p using 4 TXs, but the performance hit is still smaller than the 1080p result.

What precisely is your point? I said the performance hit is smaller at the lower frame rates you get at higher resolutions, and you've posted something at a high resolution showing a lower overall impact on performance. Why would you expect SLI to change that? Throwing more power at it doesn't change the equation, it just changes the performance level. Across all the results, the performance impact of TressFX is lower at higher resolution, precisely for the reasons I stated.
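If it helps, here's a rough sketch (plain Python, the helper name is mine) that converts a pair of on/off averages back into the implied per-frame cost in milliseconds; the same couple of milliseconds is a bigger slice of a short frame than of a long one, which is exactly where the percentage difference comes from:

```python
def added_frame_time_ms(fps_off: float, fps_on: float) -> float:
    """Extra milliseconds per frame implied by the drop from fps_off to fps_on."""
    return 1000.0 / fps_on - 1000.0 / fps_off

# 980 Ti result from earlier in the thread: 200.1 fps off, 128.2 fps on at 1200p.
print(f"Implied TressFX cost: {added_frame_time_ms(200.1, 128.2):.1f}ms per frame")  # ~2.8ms
```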
 
For all your many words, you miss the fact that Nvidia absolutely dominate in this game. However, if Nvidia had sponsored it, there would have been mass accusations of "GimpWorks", of Nvidia purposefully hurting AMD, and so on, yet we can see that the performance hit is much the same as what GameWorks gives in Batman. I'm sure if we did tests in other GameWorks games it would be a repeating story, but don't let common sense spoil a good conspiracy.

I posted some stuff that really wasn't a go at Nvidia, and all the Nvidia boys come in to have a go back at me while completely misunderstanding what I was saying.

First up, Nvidia dominate in this game; is there some relevance to that? Who cares? My point stands: adding something that takes 30fps off a game that could hit 90-120fps pretty easily at launch isn't a huge deal. Adding something that takes 30fps off a game people struggle to hold a solid 60fps in is a different matter. Not sure if you can comprehend something that simple or not. How is that Nvidia bashing? If anything it's dev bashing. The game looks like dirt for the performance it requires without god rays, and the god rays take an absurd amount of power without improving the graphics, making a circa-2009-looking game perform like it's running on circa-2009 hardware as well.

TR was TressFX's first outing, used by the first developer to pick it up. This is not Nvidia's first use of god rays or of the GameWorks package; it's not remotely new.

Comparing the performance of a 'new' effect used appropriately, where a fairly large performance hit buys a noticeable IQ improvement in a game that can afford it, with an old effect used inappropriately, where an absurd amount of performance buys a basically invisible IQ 'increase', isn't a like-for-like comparison.

As usual, the Nvidia guys see a post and jump all over it, entirely misunderstanding it and throwing out a load of horse crap.

If you add the god rays option and there is an actual IQ improvement, that's a different matter. If you add them and they take 30fps off a game that gets 100fps easily, that's a different matter. If you take out the absurd levels of simple tessellation added for a fairly obvious reason, that's a different matter. FO4's implementation isn't well done; TressFX was used pretty well in Tomb Raider.

There is a resounding "god rays aren't remotely worth the performance hit" from pretty much everyone playing Fallout 4, whereas for Tomb Raider everyone with higher-end hardware ran TressFX... says it all really.
 
Updated. Please do, humbug, I'm interested in getting as many results as possible. Someone should set up a proper GameWorks thread too; The Witcher 3 would be a good one to use, as anyone can enable HairWorks in it.

Please also keep the thread on topic. The performance hit so far seems to get lower as the resolution increases; it would be nice to see if that's also the case with HairWorks. HairWorks vs TressFX would be an interesting direct comparison.
 
I posted some stuff that really wasn't a go at Nvidia, and all the Nvidia boys come in to have a go back at me while completely misunderstanding what I was saying.

First up, Nvidia dominate in this game; is there some relevance to that? Who cares? My point stands: adding something that takes 30fps off a game that could hit 90-120fps pretty easily at launch isn't a huge deal. Adding something that takes 30fps off a game people struggle to hold a solid 60fps in is a different matter. Not sure if you can comprehend something that simple or not. How is that Nvidia bashing? If anything it's dev bashing. The game looks like dirt for the performance it requires without god rays, and the god rays take an absurd amount of power without improving the graphics, making a circa-2009-looking game perform like it's running on circa-2009 hardware as well.

TR was TressFX's first outing, used by the first developer to pick it up. This is not Nvidia's first use of god rays or of the GameWorks package; it's not remotely new.

Comparing the performance of a 'new' effect used appropriately, where a fairly large performance hit buys a noticeable IQ improvement in a game that can afford it, with an old effect used inappropriately, where an absurd amount of performance buys a basically invisible IQ 'increase', isn't a like-for-like comparison.

As usual, the Nvidia guys see a post and jump all over it, entirely misunderstanding it and throwing out a load of horse crap.

If you add the god rays option and there is an actual IQ improvement, that's a different matter. If you add them and they take 30fps off a game that gets 100fps easily, that's a different matter. If you take out the absurd levels of simple tessellation added for a fairly obvious reason, that's a different matter. FO4's implementation isn't well done; TressFX was used pretty well in Tomb Raider.

There is a resounding "god rays aren't remotely worth the performance hit" from pretty much everyone playing Fallout 4, whereas for Tomb Raider everyone with higher-end hardware ran TressFX... says it all really.

Are you talking about the difference between GodRays on/off or GodRays Ultra/Low?
Are we talking about TressFX on/off or TressFX Ultra/Low?
I'll admit there's very little noticeable difference between Low GodRays and Ultra GodRays; however, I think GodRays off versus GodRays Low is a noticeable difference. Also, GodRays off to GodRays Low is a 10% performance hit (according to Nvidia's graphs at least). That's a third of what we're seeing from TressFX.

Nice thread thebennyboy, I hadn't realised how unoptimised TressFX was. Maybe it should be called TresCherFX...
 