nvm.
Snip
More importantly, it was a sensible addition to a game with the performance to spare. Another thing to note is that the performance hit is smaller the higher the resolution. The reason is the time the effect takes to render: AMD, I think, suggested it costs around 3-4ms per frame. If you're at 120fps at a lower resolution you're rendering a frame every 8.3ms, so adding 3-4ms is huge. At 60fps, which is usually what you get at higher resolution, frames are about 16.7ms apart, and the same 3-4ms hit gives a smaller effective performance loss.
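To put rough numbers on that, here's a minimal sketch; the flat 3.5ms effect cost is just my assumption (the midpoint of the 3-4ms figure quoted above), not a measured value.

```python
# How a fixed per-frame effect cost translates into very different
# percentage losses depending on the starting framerate.
# The 3.5 ms cost is an assumed figure, not measured data.

def fps_with_effect(base_fps: float, effect_cost_ms: float) -> float:
    """Framerate once a fixed per-frame rendering cost is added."""
    return 1000.0 / (1000.0 / base_fps + effect_cost_ms)

for base_fps in (120.0, 60.0):
    new_fps = fps_with_effect(base_fps, effect_cost_ms=3.5)
    loss = 100.0 * (base_fps - new_fps) / base_fps
    print(f"{base_fps:.0f} fps -> {new_fps:.1f} fps ({loss:.0f}% loss)")

# 120 fps -> 84.5 fps (30% loss)
# 60 fps  -> 49.6 fps (17% loss)
```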
Snip
You really know how to dig a hole for yourself.
The following is not for the scoreboard, just for DM, as it uses SLI.
Performance hit: 27.23%
@DM, as you can see, not only are the fps higher at 2160p using 4 TXs, but the performance hit is still smaller than the 1080p result.
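For anyone wondering, a hit percentage like that is normally worked out from the effect-off and effect-on framerates; a quick sketch below, with made-up placeholder fps values rather than the actual results behind the 27.23% figure.

```python
# Percentage of framerate lost when an effect is enabled.
# The fps values below are placeholders, not the real benchmark numbers.

def performance_hit(fps_effect_off: float, fps_effect_on: float) -> float:
    """Fraction of framerate lost, expressed as a percentage."""
    return 100.0 * (fps_effect_off - fps_effect_on) / fps_effect_off

print(f"{performance_hit(fps_effect_off=100.0, fps_effect_on=75.0):.2f}% hit")
# 25.00% with these placeholder numbers
```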
For all your many words, you miss the fact that Nvidia absolutely dominate in this game. However, if Nvidia had sponsored it, there would have been mass accusations of "gimpworks", Nvidia purposefully hurting AMD, etc., yet we can see the performance hit is the same as what GameWorks costs in Batman. I'm sure if we ran the same tests in other GameWorks games it would be a repeating story, but don't let common sense spoil a good conspiracy.
I posted some stuff that really wasn't a go at Nvidia, and all the Nvidia boys came in to have a go back at me while completely misunderstanding what I was saying.
First up, Nvidia dominate in this game. Is there some relevance to that? Who cares? My point stands: adding something that takes 30fps off a game that at launch can hit 90-120fps pretty easily isn't a huge deal. Adding something that takes 30fps off a game where people struggle to get a solid 60fps is a different matter. Not sure if you can comprehend something that simple or not. How is that Nvidia bashing? If anything it's dev bashing. The game looks like dirt for the performance it requires without god rays, and the god rays take an absurd amount of power and don't improve the graphics, making a circa-2009-looking game perform like it's running on circa-2009 hardware as well.
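To spell that comparison out, here's a quick sketch of what the same 30fps drop means at different baselines; the baselines are just the ballpark figures from this post (90-120fps at launch vs a game struggling for 60), not measured data.

```python
# The same 30 fps drop costs very different amounts of frame time
# (and percentage performance) depending on where you start from.
# Baselines are the post's rough figures, not benchmark results.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (120.0, 90.0, 60.0):
    new_fps = base_fps - 30.0
    loss = 100.0 * 30.0 / base_fps
    added_ms = frame_time_ms(new_fps) - frame_time_ms(base_fps)
    print(f"{base_fps:.0f} -> {new_fps:.0f} fps: {loss:.0f}% loss, +{added_ms:.1f} ms/frame")

# 120 -> 90 fps: 25% loss, +2.8 ms/frame
# 90  -> 60 fps: 33% loss, +5.6 ms/frame
# 60  -> 30 fps: 50% loss, +16.7 ms/frame
```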
Tomb Raider was TressFX's first outing, used by the first developer to adopt it. This is not Nvidia's first use of god rays or the GameWorks package; it's not remotely new.
You're comparing the performance of a 'new' effect that was used appropriately, with a fairly large performance hit in a game that could absorb it for a noticeable IQ improvement, to an old effect used inappropriately, taking an absurd amount of performance for a basically invisible IQ 'increase'.
As usual the Nvidia guys see a post and jump all over it, entirely misunderstanding it and throwing out a load of horse crap.
If you add the god rays option in and there's an actual IQ improvement, that's a different matter; if you add them in taking 30fps off a game that gets 100fps easily, different matter; if you take out the absurd levels of tessellation that were simply added for a fairly obvious reason, different matter. FO4's implementation isn't well done. TressFX was used pretty well in Tomb Raider.
There is a resounding "god rays aren't remotely worth the performance hit" from pretty much everyone playing Fallout 4, while for Tomb Raider everyone with higher-end hardware ran TressFX... says it all really.