Fallout 4 to feature Nvidia Gameworks

And how did Clear Sky implement it?

Remember that Clear Sky had a bigger performance hit than Fallout 4 does when it comes to God Rays though, so maybe tessellation would've helped with God Rays. I'm not sure a 50% performance hit is the implementation I'd choose. Also remember that Clear Sky had it implemented in-engine (I believe?), whereas Nvidia had to do it via a generic library that I'm guessing won't use engine-specific optimisations. That's something Bethesda would've needed to do. And even then it's dependent on the Fallout 4 engine.

I suspect there are some things that probably require tessellation (and some things that probably don't). I believe, based on The Witcher 3 and the slider that was added, that the amount of tessellation used is decided/controlled by the developer using the library.
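(A rough sketch of that division of responsibility, with entirely made-up names rather than the actual GameWorks API: the library exposes a tessellation level, and it's the developer who decides what each in-game quality setting maps to.)

```python
# Hypothetical names only; this is not the GameWorks API. The point is just that
# the developer, not the library, chooses how much tessellation each quality
# setting asks for.
QUALITY_TO_TESSELLATION_FACTOR = {
    "low": 4,
    "medium": 16,
    "high": 32,
    "ultra": 64,
}

def configure_god_rays(effect_library, quality: str) -> None:
    """Pass the developer-chosen tessellation factor to the (hypothetical) library."""
    effect_library.set_tessellation_factor(QUALITY_TO_TESSELLATION_FACTOR[quality])
```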

It's all crap anyway, you'd have to have your tongue rammed up Nvidia's arse to think tessellated God Rays are what we need.
I'd say the same bloody thing if it was AMD too.
 
The only way to measure the performance hit would be to play it on today's hardware and see what the drop is like now. Would it still cost 50%? Apart from the odd new effects that AMD/Nvidia bring out, I think developers should be doing their job of creating the game, and Nvidia/AMD should be using the games for sponsorship and to get their hardware optimised.

Maybe we should wait 7 years and see how big the performance hit of God Rays in Fallout 4 is then?

It's a shame that developers can't all implement every effect imaginable in their engines when they make them, but I don't think it's unusual for developers to use 3rd-party libraries. Why the people that make the hardware shouldn't make these libraries, I don't know; surely they have the best knowledge of how to do them effectively (or should do)? Obviously if they're using hardware features specific to one vendor or the other it's bad (I don't think tessellation is exclusive though).

It'll be interesting to see if, with DX12, the use of async shaders is frowned upon so heavily if one side does it better than the other.
 
Why wait 7 years? We are comparing like-for-like effects. Others have said we had God Rays as good as Fallout 4's many years back. If they now cost little in performance and look just as good, then why do we need tessellated God Rays that eat performance if there is no visual benefit?
 
We are waiting 7 years to judge the impact of God Rays in Clear Sky, so why not do the same for Fallout 4? Maybe in time hardware will become more efficient and the hit will be less.

When it was released, Clear Sky apparently suffered a near 50% performance hit; when it was released, Fallout 4 apparently suffered a near 7% performance hit.

Maybe some of the issue that GameWorks has is that it has to be engine-independent, whereas the Clear Sky effect was specific to the engine. Now you can blame Bethesda for not building it into the engine, but it seems a bit harsh to blame Nvidia for that.
 
You are missing the point, because if it was implemented in FO4 the same way it was in Clear Sky then it would look just as good without the hit. If you are going to implement an effect that has been done before but needs more hardware to do it, then it had better look better than what was done before, not look the same (and arguably worse) for 4 times the hardware power.
 
Can it be implemented that way in the Fallout 4 engine? And why would you expect Nvidia to do that instead of Bethesda?
 
Your point is spot on though. The funny thing is with the new drivers the percentages are now favouring the Fury X by a decent margin :D

Though in all fairness, it's natural the Fury loses a smaller percentage, as it's more CPU-capped in this game than the 980 Ti is. So when you up the resolution a bit, your GPU usage goes up. 1080p really is heavily CPU-bound on AMD in this game (heck, it's even partly CPU-bound on Nvidia).

Here is a typical CPU-bound scenario from FO4.

[email protected] Fury X 1100/550 1080p

[Screenshot: FO4-CPUbound_zpsdbkczygh.png]
41% GPU usage

And here is a comparison shot with a different CPU speed.

[email protected] Fury X 1100/550 1080p

[Screenshot: FO4-CPUbound3800_zps62hiww3t.png]
38% GPU usage
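
(A minimal sketch of the inference being drawn from those screenshots, assuming a simple rule of thumb rather than anything measured: if GPU usage sits well below 100%, the frame rate is being set by the CPU/driver side rather than by the GPU.)

```python
# Rough rule of thumb only: a GPU sitting well below full utilisation suggests the
# CPU/driver side is the limiting factor. The 90% threshold is arbitrary.
def likely_bottleneck(gpu_usage_percent: float, threshold: float = 90.0) -> str:
    return "CPU/driver-bound" if gpu_usage_percent < threshold else "GPU-bound"

print(likely_bottleneck(41))  # first screenshot above -> CPU/driver-bound
print(likely_bottleneck(38))  # lower-CPU-clock screenshot -> CPU/driver-bound
```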
 
Can it be implemented that way in the Fallout 4 engine? And why would you expect Nvidia to do that instead of Bethesda?

I don't care who did it, and seeing as it's an AAA title, neither of them should be trying to push effects that can't be implemented efficiently in the engine, if that was the case. It's not like the effect has never been done before, such that they could be excused.
 
@GoogalyMoogaly
And this '7 years later' baloney.

10 = 10, implemented 7 years ago.
1+1+1+1+1+1+1+1+1+1 = 10, but someone implements it this way today, and this way takes 4 times as much to compute for the same result. An argument cannot be made that 'well, no one has done it this way before, so wait 7 years and hardware will be able to compute it just as fast as the 10 = 10 way did back then'. The fact is the 1+1+1+1+1+1+1+1+1+1 = 10 way is less efficient, and that's not progress, because the 10 = 10 way will still be 4 times faster on comparable hardware.
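
(A toy illustration of that analogy, nothing to do with the actual God Ray code: two ways of arriving at the same value, one doing far more work than the other. Faster hardware shrinks both timings, but the ratio between them stays roughly the same.)

```python
# Toy illustration of the analogy only; unrelated to any real God Ray implementation.
import timeit

def direct() -> int:
    # "10 = 10": produce the result directly.
    return 10

def step_by_step() -> int:
    # "1+1+...+1 = 10": the same result, computed the long way round.
    return sum(1 for _ in range(10))

assert direct() == step_by_step()  # identical output...

# ...but very different cost, and faster hardware doesn't change the ratio.
print(timeit.timeit(direct, number=1_000_000))
print(timeit.timeit(step_by_step, number=1_000_000))
```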
 
So do you know what the performance hit is with Clear Sky or is it an assumption that these days it wouldn't be noticeable?

I wonder if there are any other effects that were implemented in older engines that have less of an impact now than the same effect in current engines?
Will we have a go at these too?
If AA has less of an impact in the Source engine than in the Frostbite 3 engine do we have a go at DICE?

To me this just seems like people want to have a go at Nvidia about something and every time a game with GameWorks is released it gives people an opportunity to.
Or when they release a new GPU...
 
WTF, who said anything about the performance hit not being noticeable? Stop making stuff up and putting words into people's mouths, and we are done here as well.
 
Amazing. An age-old measurement of relative difference is not good enough for this forum user. Percentages are no more. You haven't got a scooby.

By your argument, a card running at 20 FPS and losing 4 FPS at high resolution (a 20% drop) is better at high resolution than a card running at 100 FPS and losing 20 FPS (also a 20% drop). Bonkers.

Just because you don't understand percentages doesn't mean that they're any less useful than normal.

Look, your argument is that the 980 Ti has more fps and thus loses more fps for the same percentage. I get it, but how does your percentage argument work with the Titan X, and now the Fury X with the new drivers? With the old drivers the Fury X lost around 30 fps; with the new drivers it still loses around 30 fps, even though the Fury X now has much more fps to lose.
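
(For what it's worth, here is the relative-drop maths both sides are arguing about, using the hypothetical figures mentioned in these posts; the baselines in the last two lines are made up purely to show what a fixed 30 fps loss means at different starting frame rates.)

```python
# The fps figures here are the hypothetical ones from the posts, not benchmark data.
def percentage_drop(fps_before: float, fps_after: float) -> float:
    """Relative performance loss as a percentage of the starting frame rate."""
    return (fps_before - fps_after) / fps_before * 100

# "20 FPS losing 4" vs "100 FPS losing 20": both are a 20% drop,
# even though the absolute losses (4 fps vs 20 fps) are very different.
print(percentage_drop(20, 16))   # 20.0
print(percentage_drop(100, 80))  # 20.0

# The Fury X point: a fixed ~30 fps loss is a smaller percentage hit
# when the starting frame rate is higher (baselines made up for illustration).
print(percentage_drop(70, 40))   # ~42.9
print(percentage_drop(90, 60))   # ~33.3
```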
 
My remark was exactly that. Just an observation, but hey, someone got butthurt and brought up %s, saying that the 980 Ti and Fury X are equally capable at higher resolutions. While I do understand his % argument, this time it goes out of the window with the new AMD drivers.
Everyone knows that the Fury X is much more potent at 4K than Nvidia cards, but my original comment was just an observation, that's all ;)
 
The 4K results vs the 980 Ti are probably the result of Fiji being the more powerful GPU but strangled at lower res by drivers that make less efficient use of the CPU than Nvidia's.
 
Look, your argument is that the 980 Ti has more fps and thus loses more fps for the same percentage. I get it, but how does your percentage argument work with the Titan X, and now the Fury X with the new drivers? With the old drivers the Fury X lost around 30 fps; with the new drivers it still loses around 30 fps, even though the Fury X now has much more fps to lose.

If there is one thing I know, even though I suck at calculations, it's that you have to compare percentages. It is no good saying that card X loses 30 fps, as that is meaningless on its own, but when you say card X loses 7%, for example, that is a valid means of comparison.
 
We are waiting 7 years to judge the impact of God Rays in Clear Sky, so why not do the same for Fallout 4? Maybe in time hardware will become more efficient and the hit will be less.

When it was released, Clear Sky apparently suffered a near 50% performance hit; when it was released, Fallout 4 apparently suffered a near 7% performance hit.

I don't mean to sound rude, but making up numbers doesn't help make the case for whether God Rays needed to have such a huge impact in Fallout 4. God Rays never cost 50% in Clear Sky. The move from DX9 to DX10 had a big impact, but that was the whole thing, not one specific feature. They did decrease the performance hit, but doesn't everything? For example, the two main extras in DX10 that added an additional hit were the Agroprom underground, where they used a new (at the time) way of doing smoke with DX10, and the God Rays at certain times of the day (morning), but it was DX10 as a whole, not one specific feature. Turning those single features down or off, as shown in the benches I linked earlier, has never had anywhere near that sort of impact. And then there's stating a 7% loss in Fallout 4 when we were shown a 980 Ti losing 30 fps by moving God Rays from low to ultra; that was approximately a 30% performance hit just by changing the settings of one specific feature.
Maybe some of the issue that GameWorks has is that it has to be engine-independent, whereas the Clear Sky effect was specific to the engine. Now you can blame Bethesda for not building it into the engine, but it seems a bit harsh to blame Nvidia for that.
Also, the X-Ray engine was no different to Fallout 4's, in that it was originally a DX9 engine, as used in Stalker SoC, which then had DX10 bolted on for Clear Sky and then DX11 bolted on for CoP. Oversimplified, I know, but I'm sure you get the idea. It wasn't an engine built to handle and run DX10; that was an addition added as visuals evolved, just as they are now doing with Fallout's engine.

I'd blame Bethesda for using Nvidia's over-demanding pre-built options rather than bolting on their own version. For such a big, overused engine there's no excuse.
Penny-pinching, cost-cutting, laziness?
 
My remark was exactly that. Just an observation, but hey, someone got butthurt and brought up %s, saying that the 980 Ti and Fury X are equally capable at higher resolutions. While I do understand his % argument, this time it goes out of the window with the new AMD drivers.
Everyone knows that the Fury X is much more potent at 4K than Nvidia cards, but my original comment was just an observation, that's all ;)

You misunderstood my post. My point was that if that artificial CPU bottleneck on AMD could be eliminated, or brought closer to Nvidia's level, the Fury series would perform better at 1080p than it currently does, making the actual drop when upping the resolution bigger than it currently is.
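
(A small worked example of that point, using a crude model where the frame rate you see is capped by whichever of the CPU and GPU is slower; all the numbers are invented for illustration.)

```python
# Crude model: effective fps is roughly the lower of what the CPU and GPU can each
# deliver. All numbers are invented for illustration.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

def drop_percent(low_res_fps: float, high_res_fps: float) -> float:
    return (low_res_fps - high_res_fps) / low_res_fps * 100

gpu_1080p, gpu_4k = 110.0, 55.0  # what the GPU alone could deliver (made up)

# With a CPU/driver cap of 80 fps at 1080p, the visible drop to 4K looks modest.
capped = drop_percent(effective_fps(80, gpu_1080p), effective_fps(80, gpu_4k))

# Remove the cap and 1080p rises to the GPU's real 110 fps, so the same 4K result
# now represents a much bigger percentage fall.
uncapped = drop_percent(effective_fps(200, gpu_1080p), effective_fps(200, gpu_4k))

print(round(capped, 1), round(uncapped, 1))  # 31.2 50.0
```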
 