
Fallout 4 hindered on AMD cards by Gameworks > FIX

According to this, God rays would almost halve your FPS in STALKER Clear Sky.
That doesn't sound like a small performance hit to me.

Clear Sky was a demanding title for its day and one of the early DX10 titles. It used a version of the Shadow of Chernobyl DX9 X-Ray engine that was updated with DX10 and other things bolted on. Call of Pripyat was the next installment, again using a version of the same engine with DX11 bolted on, and its God rays were less demanding. Development is meant to go forward as time progresses; we are talking about a year or so between Clear Sky and COP, and six years since COP.

What it shows is that this sort of effect is hardly new and should have been perfected long ago.
 
Can't even play in 1440p. It'll work if windowed and borderless is ticked, but my fps in game is just locked to 37.5fps.

Playing at 1080p at the moment, would much rather 1440p.


Odd. I have a 4790K, 8GB of 2133 RAM and a single Sapphire Fury Tri-X, and I'm running Fallout 4 at 1440p via VSR on a 1080p monitor with everything on ultra (God rays on high) and motion blur off. My fps sits between 50 and 60 for the most part, usually closer to 60. I've only spent a couple of hours in game so far though, so once things get moving with more than just a few mole rats or bloatflies attacking, things might change a bit.
 
Can't even play in 1440p. It'll work if windowed and borderless is ticked, but my fps in game is just locked to 37.5fps.

Playing at 1080p at the moment, would much rather 1440p.

I had the same problem and this worked for me: go to documents/my games/fallout 4, open fallout.ini and falloutprefs.ini, change iPresentInterval to 0 (iPresentInterval=0) in both files, and make the files "read only".

For some reason the settings revert back if you don't make them read-only.
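If anyone would rather script that tweak than edit the files by hand, here's a minimal sketch. The post above refers to fallout.ini and falloutprefs.ini; on a standard install the files are usually named Fallout4.ini and Fallout4Prefs.ini under Documents\My Games\Fallout4, so treat the directory and file names below as assumptions and adjust them to whatever you actually have.

```python
# Minimal sketch of the tweak above: force iPresentInterval=0 in both ini files,
# then mark them read-only so the game can't revert the setting.
# INI_DIR and INI_FILES are assumptions; adjust to your own install.
import re
import stat
from pathlib import Path

INI_DIR = Path.home() / "Documents" / "My Games" / "Fallout4"
INI_FILES = ["Fallout4.ini", "Fallout4Prefs.ini"]

for name in INI_FILES:
    path = INI_DIR / name
    path.chmod(stat.S_IREAD | stat.S_IWRITE)   # make writable in case it's already read-only
    text = path.read_text()
    # Replace any existing iPresentInterval line with iPresentInterval=0.
    text = re.sub(r"iPresentInterval\s*=\s*\d+", "iPresentInterval=0", text)
    path.write_text(text)
    path.chmod(stat.S_IREAD)                   # read-only so the setting sticks
```

As far as I know iPresentInterval is the engine's V-sync toggle, so setting it to 0 is what removes the frame cap; expect tearing unless you cap frames some other way.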
 
Clear Sky was a demanding title for its day and one of the early DX10 titles. It used a version of the Shadow of Chernobyl DX9 X-Ray engine that was updated with DX10 and other things bolted on. Call of Pripyat was the next installment, again using a version of the same engine with DX11 bolted on, and its God rays were less demanding. Development is meant to go forward as time progresses; we are talking about a year or so between Clear Sky and COP, and six years since COP.

What it shows is that this sort of effect is hardly new and should have been perfected long ago.

Maybe it pretty much is. Do people expect it to cost no performance?
 
Can't even play in 1440p. It'll work if windowed and borderless is ticked, but my fps in game is just locked to 37.5fps.

Playing at 1080p at the moment, would much rather 1440p.

Just run one card, mate, till they fix Crossfire. I'm getting 60fps at 1440p with Crossfire disabled, so you should easily be achieving the same. Why is it locked to 37fps lol... have you not done the .ini tweaks to unlock it?
 
What it shows is that this sort of effect is hardly new and should have been perfected long ago.
Again, not how it works.

Effects are rarely ever 'perfected', especially not so quickly. Pretty much every effect in your settings list (ambient occlusion, depth of field, motion blur, texture quality, shadows, anti-aliasing, etc.) is continually improved as time goes on.
 
Maybe it pretty much is. Do people expect it to cost no performance?

It's got to cost something, but 30fps for what is not much visually is crazy (that's what Nvidia's graph shows it to cost when going from Low to Ultra on a 980 Ti).

Again, not how it works.

Effects are rarely ever 'perfected', especially not so quickly. Pretty much every effect in your settings list (ambient occlusion, depth of field, motion blur, texture quality, shadows, anti-aliasing, etc.) is continually improved as time goes on.

Again? What I'm showing is that six years ago the GSC devs provided the same effect for less than a third of the performance impact, on an engine which, like this one, was a heavily tweaked older engine, not one built fresh from the ground up with these features. They did it to the engine for CS and then made big improvements to it for the next iteration of the engine 18 months later. Of course things get improved and tweaked over time, that's stating the obvious (and it's exactly what I'd just said), but they are meant to advance, not go backwards, and six years later is an eternity in the gaming world.
 
GOTY for me.

I'm really liking it so far. I've just got the armor and fought the deathclaw. I put the God rays to low and fps doesn't budge from 60 now, so it's running great too.

Definitely one of the better titles this year. I suppose most people will pick Witcher 3 over this but this'll get my vote from the look of things.

Just wanted to say how excellent this game is. I know people are still arguing over it and complaining but tbh? I can't even remember the last time five hours disappeared so damn quickly.

Not had one single crash yet either.


I haven't had any problems. It was acting a bit juddery at times in the vault but it's been fine since busting out.
 
It's got to cost something, but 30fps for what is not much visually is crazy (that's what Nvidia's graph shows it to cost when going from Low to Ultra on a 980 Ti).
If you're such an expert, then what should the performance cost be? Have you programmed God rays in a modern game engine and got better performance?

Again? What I'm showing is that six years ago the GSC devs provided the same effect for less than a third of the performance impact, on an engine which, like this one, was a heavily tweaked older engine, not one built fresh from the ground up with these features. They did it to the engine for CS and then made big improvements to it for the next iteration of the engine 18 months later. Of course things get improved and tweaked over time, that's stating the obvious (and it's exactly what I'd just said), but they are meant to advance, not go backwards, and six years later is an eternity in the gaming world.

Where is it 1/3rd of the performance cost?
From Googlay's link it cost 50% of the performance in Clear Sky.

You say it is the same effect but it's not, it is similar. That's like saying these two games both have shadows, therefore they must cost the same. Have you any details to show that the implementation is the same: the accuracy, visual quality, flexibility?

You claim that this effect must be perfected by now, but how come almost no games have it? The reason is that it is a complex effect to achieve and computationally demanding. The only implementations we have of it both show the performance impact to be significant, as expected.


Quake 3 had dynamic shadows back in 1999 but you can't compare them to what we have now.
 
It's got to cost something, but 30fps for what is not much visually is crazy (that's what Nvidia's graph shows it to cost when going from Low to Ultra on a 980 Ti).



Again? What I'm showing is that six years ago the GSC devs provided the same effect for less than a third of the performance impact, on an engine which, like this one, was a heavily tweaked older engine, not one built fresh from the ground up with these features. They did it to the engine for CS and then made big improvements to it for the next iteration of the engine 18 months later. Of course things get improved and tweaked over time, that's stating the obvious (and it's exactly what I'd just said), but they are meant to advance, not go backwards, and six years later is an eternity in the gaming world.

I think it's more like 40fps according to Nvidia's graph.
So play it on low, then it's like a 4fps hit...
Nobody's forced to run it at Ultra.
If you set uGrids to 13 it'll probably tank frames too, and that's nothing to do with GameWorks. Do people feel the need to up that from its default of 5 to 13? Increasing from 5 to 11 will cost about 64fps at 1080p. Nobody's complaining about that though, are they? Why? Because this is about complaining about Nvidia, and uGrids is not an Nvidia thing.
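For anyone curious what they're actually running, the uGridsToLoad value normally sits under [General] in Fallout4.ini. Here's a rough sketch that just reports the current value; the path, section name and the default of 5 are assumptions based on the post above, so adjust as needed.

```python
# Report the current uGridsToLoad value mentioned above. The ini path, the
# [General] section and the default of 5 are assumptions; adjust to your install.
import configparser
from pathlib import Path

INI_PATH = Path.home() / "Documents" / "My Games" / "Fallout4" / "Fallout4.ini"

cfg = configparser.ConfigParser(strict=False, interpolation=None)
cfg.read(INI_PATH)

# If the line isn't present, the engine falls back to its default (5, per the post above).
ugrids = cfg.getint("General", "uGridsToLoad", fallback=5)
print(f"uGridsToLoad is currently {ugrids}")
if ugrids > 5:
    print("Above the default: expect a big fps cost, and back up your saves first.")
```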
 
Just wanted to say how excellent this game is. I know people are still arguing over it and complaining but tbh? I can't even remember the last time five hours disappeared so damn quickly.

Not had one single crash yet either.

Those that are arguing over it don't appear to have it :rolleyes:
 
I'm not having a problem with my performance, it's great with everything on max so far, but I keep getting stuck in the terminal. I see turning on V-sync helps, but I can't find V-sync in the settings?
 
If you sure such an expert then what should the performance cost be? Have you programmed God rays in a modern game engine and got better performance?

The people who go around with an expert complex are easy to recognise. Hello. I don't claim to be or act like an expert, unlike some; I'm going with a common-sense view, which is obviously lacking in those unable to see that.

The visual difference between low and ultra is minimal. In fact it's hard to see a difference, and almost impossible when actually playing the game, yet it costs a huge chunk of performance. What exactly are you struggling with here?
 
Seems like a lot of speculation there. Does running Stalker on new hardware reduce the performance hit to less than 10%?
Do you want Fallout 4 to use the Stalker engine?

I will bet the hit would be less than in FO4, running both games on modern midrange hardware. But of course it's speculation; it was an example of how similar visual effects were achieved on older titles without the gimping that is present in FO4. Two points: 1. GW is not needed for this visual effect. 2. The implementation is damaging and pointless (no visual difference between settings, but a scaling performance hit you can't turn off), and I'm saying that for users of both vendors.

And no, I'm not saying that at all. Why is it that the only way you answer a post is by putting an absurd assumption in my mouth and arguing against that instead?
 