Fallout 4 hindered on AMD cards by Gameworks > FIX

I didn't have any problems with my 970 at 1200p.

That can be tweaked in GFE as well.

But on a serious note, Nvidia should be using an open-source library of effects which AMD (and anyone else) can contribute to.

But Nvidia are not to blame for taking the initiative and developing their own library. Game after game, top AAA title after top AAA title, are using it, so they're doing something right.

Well, it depends what FPS people are happy with, but having hairworks turned on for "All", or even just for Geralt, caused a considerable drop in FPS with default tessellation settings. Pretty much everyone noticed this, even nvidia users, especially Kepler owners... Personally I much preferred Lara's tressfx hair in Tomb Raider, and there's even less of an impact on FPS than with Geralt's hairworks (iirc it was initially a performance hog, but that was quickly sorted and, best of all, it didn't kill nvidia users' FPS).

I know they added an in-game tessellation option for hairworks not long after the complaints... so have nvidia now added a tessellation slider to their control panel like AMD's?

I'd rather both GPU companies just concentrated on their GPU hardware and drivers and getting the best from them, rather than adding their own effects, which look no better than any other good engine/game. Apart from hairworks/tressfx, there isn't a single effect from AMD, and more so nvidia, that I have seen that looks any better than any other game/physics engine, e.g. Alien Isolation's particle, smoke and fire effects look far better than all the gameworks smoke, fire and particle effects imo. Heck, even some game engine settings look better than nvidia's gameworks and AMD settings, e.g. in Far Cry 4 SSBC looks better than nvidia's HBAO+, and in GTA 5 the softest shadow setting looks better than nvidia's and AMD's shadow presets....

IMO the only effect of nvidia's that really stands out for now is the volumetric smoke in batman i.e. the way the smoke moves when people walk through it.

And yes, there is a reason... it's called sponsorship :p i.e. nvidia pay them to use their effects and add their TWIMTBP logo, just like AMD sponsor DICE/EA games to show their logo on startup as well as to use the likes of Mantle etc.

Considering every game that nvidia sponsor comes out buggy and with performance issues (Far Cry 4, Watch Dogs, Assassin's Creed, Batman etc.), and even Fallout 4 is being reported as a buggy mess right down to just setting your resolution... I wouldn't really say nvidia being partnered with these games is a good thing. Whilst AMD don't add any effects or anything unique to their sponsored games (Alien Isolation, Bioshock Infinite, Hardline, PVZ, Sniper Elite 3, Thief, Hitman, SOMA, Tomb Raider etc.), at least they all work without any serious bugs and run very well whilst looking very good too.
 
For those asking about the Gameworks Godrays/Volumetric Lighting implementation you may want to check the GDC 2014 .pdf presentation at the bottom of this blog post. It's not a tremendously complex method, and for the most part it doesn't seem too much hassle to implement. Its integration into Far Cry 4 "... took us one-man month to integrate it and one engineer..." according to one of the Ubi devs during a joint talk at GDC 2015 (27m10s). Completely different engines of course, but it does sound like a pretty decoupled system that any experienced team should be able to add to their engine, and without serious side-effects to performance beyond the effect doing what it's supposed to. The few run-time effect settings that should be configurable do seem to be exposed as in-game console commands too.
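To make the "decoupled" point a bit more concrete, here's a rough C++ sketch of how an engine might wrap a third-party effect like this behind a tiny interface and expose its quality level as a console variable, in the spirit of the in-game commands mentioned above. All the names here (Console, GodRaysEffect, gr_quality) are invented for illustration; this is not Nvidia's actual API.

```cpp
// Hypothetical sketch (not Nvidia's actual API): wrapping a third-party
// volumetric lighting module behind a small interface, with its quality
// level exposed as a run-time console variable.
#include <algorithm>
#include <functional>
#include <iostream>
#include <map>
#include <string>

// Minimal console-variable registry: name -> setter callback.
class Console {
public:
    void registerVar(const std::string& name, std::function<void(int)> setter) {
        vars_[name] = std::move(setter);
    }
    void exec(const std::string& name, int value) {
        auto it = vars_.find(name);
        if (it != vars_.end()) it->second(value);
    }
private:
    std::map<std::string, std::function<void(int)>> vars_;
};

// Wrapper around the (imagined) middleware. The engine only sees a render
// hook and a quality knob, which is what makes the system easy to bolt on.
class GodRaysEffect {
public:
    void setQuality(int q) {
        quality_ = std::clamp(q, 0, 3);  // 0 = off .. 3 = ultra
        std::cout << "god rays quality set to " << quality_ << "\n";
    }
    void render() {
        if (quality_ == 0) return;       // fully decoupled: skip when off
        // ... call into the middleware with the current quality preset ...
    }
private:
    int quality_ = 2;
};

int main() {
    Console console;
    GodRaysEffect godRays;
    // Expose the run-time setting, mirroring the "in-game console command" idea.
    console.registerVar("gr_quality", [&](int v) { godRays.setQuality(v); });
    console.exec("gr_quality", 1);  // e.g. drop to low for performance
    godRays.render();
    return 0;
}
```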
 
Therein lies the problem. People think they are entitled to turn everything up to max and have it run perfectly on their old cards.

Erm hello?!

This is an ancient engine with bits tacked on; it's not unreasonable to think that a card from the last few years should max it.

What don't you understand? How is it unreasonable to expect two 7970s to max out an engine from SEVEN+ years ago with a few lighting effects tacked on, some of which have been around since time began?
 
It's what was intimated in your last paragraph regarding Nvidia-sponsored games being a buggy mess and AMD-sponsored games being fine.

I meant that all these buggy games being associated with nvidia/gameworks is not a good thing, and not a reason to be a "proud" nvidia owner.

People keep saying to move to nvidia GPUs because all the latest big titles have nvidia gameworks, but really, what's the point when the games seem to run badly for nvidia users too when they enable the effects, and they experience the same bugs as AMD users??? Supporting these buggy games just because they are nvidia sponsored seems stupid imo.

Although I do find it odd how the majority of games sponsored by nvidia always seem to have issues (at least the big ones that I know of), whereas the majority of AMD's sponsored games run very well (with the exception of BF4). There has got to be more to it than just the developers & publishers...
 
For those asking about the Gameworks Godrays/Volumetric Lighting implementation you may want to check the GDC 2014 presentation at the bottom of this blog post. It's not a tremendously complex method, and for the most part it doesn't seem too much hassle to implement. Its integration into Far Cry 4 "... took us one-man month to integrate it and one engineer..." according to one of the Ubi devs during a joint talk at GDC 2015 (27m10s). Completely different engines of course, but it does sound like a pretty decoupled system that any experienced team should be able to add to their engine, and without serious side-effects to performance beyond the effect doing what it's supposed to.

So you're saying that Nvidia GameWorks Godrays are in Far Cry 4 and while there is a performance hit (which surely you'd expect by adding an effect) it's not a huge hit (less than 10%).
Of course it seems (I don't have the game) that it's on or off in Far Cry, not Low/Medium/High/Ultra, so we don't know what level it's at.

In Fallout 4 on Low, the performance impact seems to be less than 10%. Medium would look to be a little more than 10%.

Of course these are from Nvidia's graphs and we don't know how it affects AMD. Unless it's a tessellation thing again I don't see why it would be much different for AMD. This doesn't seem to be a time when there's a huge impact on everyone just to spite AMD and their lower tessellation performance.
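For anyone wondering what those percentages actually mean, here's a quick bit of arithmetic converting an FPS hit into added frame time. The 60 FPS baseline is just an arbitrary example, not a figure from the graphs.

```cpp
// Rough arithmetic only: what a ~10% FPS hit means in frame-time terms.
// The 60 FPS baseline is an arbitrary example, not a benchmark result.
#include <initializer_list>
#include <iostream>

int main() {
    const double baseFps = 60.0;
    for (double hitPct : {5.0, 10.0, 15.0}) {
        double newFps = baseFps * (1.0 - hitPct / 100.0);
        double addedMs = 1000.0 / newFps - 1000.0 / baseFps;
        std::cout << hitPct << "% FPS hit: " << baseFps << " -> " << newFps
                  << " FPS (+" << addedMs << " ms per frame)\n";
    }
    return 0;
}
```

So a 10% hit at 60 FPS is roughly 1.9 ms of extra frame time, which is in line with "not a huge hit" for a single added effect.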

For reference:
Far Cry Guide

and
Fallout 4 Guide
 
Looking at the 970 against the 780 Ti, it looks like tessellation has been set relatively high by the developer.
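For what it's worth, here's a conceptual sketch of what a driver-side tessellation cap (like the slider in AMD's control panel) effectively does to whatever factor the game requests. The distance-based falloff is invented purely for illustration; it isn't how any particular driver or game actually computes it.

```cpp
// Conceptual sketch only: a user-set tessellation cap clamping whatever
// factor the application asks for. The falloff model is made up.
#include <algorithm>
#include <initializer_list>
#include <iostream>

// Factor the application might request: denser up close, sparser far away.
double requestedTessFactor(double distanceToCamera) {
    return std::clamp(64.0 / std::max(distanceToCamera, 1.0), 1.0, 64.0);
}

// What effectively reaches the hardware when the user caps tessellation.
double cappedTessFactor(double requested, double userCap) {
    return std::min(requested, userCap);
}

int main() {
    const double userCap = 8.0;  // e.g. an "8x" driver override
    for (double dist : {1.0, 4.0, 16.0, 64.0}) {
        double req = requestedTessFactor(dist);
        std::cout << "distance " << dist << ": requested " << req
                  << "x, applied " << cappedTessFactor(req, userCap) << "x\n";
    }
    return 0;
}
```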
 
For those asking about the Gameworks Godrays/Volumetric Lighting implementation you may want to check the GDC 2014 .pdf presentation at the bottom of this blog post. It's not a tremendously complex method, and for the most part it doesn't seem too much hassle to implement. Its integration into Far Cry 4 "... took us one-man month to integrate it and one engineer..." according to one of the Ubi devs during a joint talk at GDC 2015 (27m10s). Completely different engines of course, but it does sound like a pretty decoupled system that any experienced team should be able to add to their engine, and without serious side-effects to performance beyond the effect doing what it's supposed to. The few run-time effect settings that should be configurable do seem to be exposed as in-game console commands too.

Which is largely the idea of GameWorks: a mostly decoupled system that can be added to a game with minimal work by the developer. The original intention was to add effects that would be beyond the resources of small to medium studios to produce for themselves, though that seems to have slipped a bit.

The implementation of god rays in FO4 is relatively comprehensive and "should" visually be significantly better than games using flare-type systems for it, e.g. Far Cry 1, but whether it is actually worth the extra performance hit and/or implemented efficiently is another matter.
 
Why have one engineer spend a month when you can have an engineer spend a few days with a 3rd-party library? No need to re-invent the wheel every time.

Dear lord, really? You are replying to the devs of Far Cry 4, who used Nvidia's god rays and directly said it took them a month to implement. But Mr Nvidia Defender says that if they'd only used Nvidia's 3rd-party version they could have done it in just 3 days.
 
The implementation of god rays in FO4 is relatively comprehensive and "should" visually be significantly better than games using flare-type systems for it, e.g. Far Cry 1, but whether it is actually worth the extra performance hit and/or implemented efficiently is another matter.

I think the issue is it doesn't look much better than in much less "intensive" games. I mean, I'm not being funny, but STALKER Clear Sky still has phenomenal godrays, is open world, came out years ago, and you could play that on a 5850. The effect isn't the issue. It's the pointless tessellation (again) that is.
 
I think the issue is it doesn't look much better than in much less "intensive" games. I mean, I'm not being funny, but STALKER Clear Sky still has phenomenal godrays, is open world, came out years ago, and you could play that on a 5850. The effect isn't the issue. It's the pointless tessellation (again) that is.

Exactly
 
I think the issue is it doesn't look much better than in much less "intensive" games. I mean, I'm not being funny, but STALKER Clear Sky still has phenomenal godrays, is open world, came out years ago, and you could play that on a 5850. The effect isn't the issue. It's the pointless tessellation (again) that is.

Any comparison charts to show the performance difference in STALKER with and without godrays on?
 
I think the issue is it doesn't look much better than in much less "intensive" games. I mean, I'm not being funny, but STALKER Clear Sky still has phenomenal godrays, is open world, came out years ago, and you could play that on a 5850. The effect isn't the issue. It's the pointless tessellation (again) that is.

But but....tessellated godrays!!!! :eek:
 