
Fallout 4 hindered on AMD cards by GameWorks > FIX

I think the issue is that it doesn't look much better than much less "intensive" games. I mean, I'm not being funny, but STALKER Clear Sky still has phenomenal god rays, is open world, came out years ago, and you could play it on a 5850. The effect isn't the issue. It's the pointless tessellation (again) that is.

According to this, god rays would almost halve your FPS in STALKER Clear Sky.
That doesn't sound like a small performance hit to me.
 
For all the AMD/Nvidia war boys

https://www.youtube.com/watch?v=Zvd...aYD-FZQ04&annotation_id=annotation_1150636429

8 minutes on

EDIT:

AMD need a kick in the rear to remind them to get drivers out on day 1. They're stupid because that's when everyone does their reports on how a game performs.
They're not helping themselves.


This is the crux of the problem. If the game/engine is coded correctly/to spec, the manufacturers shouldn't need to release a "game ready driver".

Imagine if you had to get a driver for your CPU/sound card/motherboard/network adapter every time a new game came out. People would laugh at the developer.
 
Imagine if you had to get a driver for your CPU/sound card/motherboard/network adapter every time a new game came out. People would laugh at the developer.

Back in the day, until Windows did horrible things to the sound sub-system, it wasn't unheard of to require updates for sound hardware for new game releases. These days very few games push the boundaries of what is possible with audio, while they more frequently push the boundaries with graphics.
 
Back in the day, until Windows did horrible things to the sound sub-system, it wasn't unheard of to require updates for sound hardware for new game releases. These days very few games push the boundaries of what is possible with audio, while they more frequently push the boundaries with graphics.

Still has to be within the API spec, which both GPU manufacturers follow.
 
Still has to be within the API spec, which both GPU manufacturers follow.

The thing is, with graphics especially, there aren't so many limits on what you can do - shader languages aren't rigidly locked down, you can reuse buffers in many different ways that the API and/or GPU manufacturers couldn't possibly foresee, and so on. Through being inventive you can often implement features that later become standard in the next revision of the API. Older APIs, even up to DX9, provided little more than a framework that you'd then build your effects with.
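
To make that concrete, here's a minimal CPU-side sketch of the classic screen-space god-rays gather (radial sampling toward the light with a decay term, in the spirit of the well-known GPU Gems 3 technique) - exactly the kind of effect you could hand-roll on a framework-style API like DX9. Everything here is illustrative; none of these names come from any real engine.

    // Illustrative sketch only: per-pixel radial gather toward the light.
    // occlusion holds 1.0 where the light/sky is visible, 0.0 where blocked.
    #include <algorithm>
    #include <vector>

    struct Vec2 { float x, y; };

    float godRaySample(const std::vector<float>& occlusion, int w, int h,
                       Vec2 pixel, Vec2 lightOnScreen,
                       int samples = 32, float decay = 0.95f) {
        Vec2 step = { (lightOnScreen.x - pixel.x) / samples,
                      (lightOnScreen.y - pixel.y) / samples };
        float light = 0.0f, weight = 1.0f;
        Vec2 p = pixel;
        for (int i = 0; i < samples; ++i) {
            p.x += step.x;
            p.y += step.y;
            int ix = std::clamp(static_cast<int>(p.x), 0, w - 1);
            int iy = std::clamp(static_cast<int>(p.y), 0, h - 1);
            light += occlusion[iy * w + ix] * weight; // accumulate unblocked light
            weight *= decay;                          // fade with distance
        }
        return light / samples;
    }

On the GPU this is just a handful of texture fetches in a pixel shader, and the sample count is the quality/cost dial - which is exactly what a god rays Low/Ultra setting is turning.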
 
Not as bad as I was expecting - pretty much flawless so far. Locked at 60fps inside, though outside sees drops to the mid 50s. This is Ultra everything with TAA, but god rays turned down to Low, at 1440p, CrossFire disabled, just running one 390. Not sure why CPU load is so low though.

 
Not as bad as I was expecting - pretty much flawless so far. Locked at 60fps inside, though outside sees drops to the mid 50s. This is Ultra everything with TAA, but god rays turned down to Low, at 1440p, CrossFire disabled, just running one 390. Not sure why CPU load is so low though.


Good stuff. It will get better with patches/driver optimisations as well :cool:
 
I think the issue is that it doesn't look much better than much less "intensive" games. I mean, I'm not being funny, but STALKER Clear Sky still has phenomenal god rays, is open world, came out years ago, and you could play it on a 5850. The effect isn't the issue. It's the pointless tessellation (again) that is.


http://www.moddb.com/members/davidme/images/call-of-chernobyl#imagebox


https://www.youtube.com/watch?v=1wpuGSgep00

Some shameless plugging for one of the gaming greats :)
 
Erm, hello?!

This is an ancient engine with bits tacked on; it's not unreasonable to think that a card from the last few years should max it.

What don't you understand? How is it unreasonable to expect two 7970s to max out an engine from SEVEN+ years ago with a few lighting effects tacked on, some of which have been around since time began?
Because it's just not how these things work.

A developer could incorporate a resolution % slider that goes up to 1000%. Nobody, not even with 4-way Titan Xs, would be able to get playable framerates with everything maxed out like that. The god ray slider is something akin to that, just a bit less extreme. It cranks up the resolution/tessellation factor by magnitudes more than anybody would really want or need in a practical situation, but it is offered as an option nonetheless. And that's fine. I know that KILLS those people who feel psychological pain in turning a setting down, but it's just how it works, and it used to be a lot more common back in the day.
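
To put rough numbers on that (a back-of-the-envelope sketch; shading cost scales roughly with pixel count, everything else held equal):

    // Back-of-the-envelope: a resolution % slider scales the pixel count
    // with the square of the per-axis factor, so cost explodes quickly.
    #include <cstdio>

    int main() {
        for (double pct : {100.0, 200.0, 500.0, 1000.0}) {
            double s = pct / 100.0; // per-axis scale factor
            std::printf("%5.0f%% slider -> %6.1fx the pixels to shade\n",
                        pct, s * s);
        }
        // 1000% means roughly 100x the shading work of 100%.
        return 0;
    }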

Saying it's an ancient engine with just a few bits tacked on is also a bit of an exaggeration. Not miles from the truth, granted, but there's still a lot more going on than some people give it credit for.
 
This is the crux of the problem. If the game/engine is coded correctly/to spec, the manufacturers shouldn't need to release a "game ready driver".

Imagine if you had to get a driver for your CPU/sound card/motherboard/network adapter every time a new game came out. People would laugh at the developer.
Unfortunately, in the age of DX9/10/11, which we are still in, drivers remain a pivotal and large part of how a game functions and runs, especially with large and complex games. And it's the simple truth that for maximum optimization (stability and performance), work on both the driver side and the game side is necessary.

This might change somewhat when we get to DX12-exclusive titles, where the devs have a lot more control over things and driver overhead matters less (though drivers are still necessary to some degree), but this also comes with the drawback of requiring extra work from the devs to control many of the things the DX11 (or whatever) driver was doing automatically before.
 
Because it's just not how these things work.

A developer could incorporate a resolution % slider that goes up to 1000%. Nobody, not even with 4-way Titan Xs, would be able to get playable framerates with everything maxed out like that. The god ray slider is something akin to that, just a bit less extreme. It cranks up the resolution/tessellation factor by magnitudes more than anybody would really want or need in a practical situation, but it is offered as an option nonetheless. And that's fine. I know that KILLS those people who feel psychological pain in turning a setting down, but it's just how it works, and it used to be a lot more common back in the day.

Saying it's an ancient engine with just a few bits tacked on is also a bit of an exaggeration. Not miles from the truth, granted, but there's still a lot more going on than some people give it credit for.


It's a very good point. Imagine a game that had a resolution slider that went up to 16K - it wouldn't be playable maxed out for the next 10 years. Shove it on 2K (1080p) and suddenly most people find it playable. Putting it on 2K is not putting it on "low" settings; it is putting it on something realistic. When Crysis came out, few people could run even moderate settings at moderate resolution, even with high-end hardware. And in Crysis you could set 4K with 4x AA, which would have been a complete joke at the time, and is still very tough today.



The game engine itself being old doesn't at all dictate the performance expectation. In fact, you can easily argue the opposite - an old game engine is not well optimized for modern GPUs. This is commonly the case; a modern engine can be made much more efficient for modern hardware.
 
According to this, god rays would almost halve your FPS in STALKER Clear Sky.
That doesn't sound like a small performance hit to me.

I didn't say it had a small performance hit when it was released. I'm saying the effect was done, with minimal difference from FO4's version, on much older hardware years ago, in an open-engined game.

The comparison you would have to do is to see the difference from enabling them on a midrange card of today, in Clear Sky compared to FO4. I would wager that the performance hit is nowhere near the same.

The effect doesn't need GW, and it doesn't have to carry the hit on modern cards that FO4's implementation does. That's the issue people are trying to get across.
 
Just wanted to say how excellent this game is. I know people are still arguing over it and complaining but tbh? I can't even remember the last time five hours disappeared so damn quickly.

Not had one single crash yet either.
 
Good stuff. It will get better with patches/driver optimisations as well :cool:

It's already good, Greg. Bethesda games are known for being crud at anything higher than 60fps, so there's not much point in having anything higher than that, to be fair. I'm happy as I'm getting exactly where I wanna be, 60fps, and there's no difference between Ultra god rays and Low god rays anyway. Just typical gimping.
 
Not as bad as I was expecting - pretty much flawless so far. Locked at 60fps inside, though outside sees drops to the mid 50s. This is Ultra everything with TAA, but god rays turned down to Low, at 1440p, CrossFire disabled, just running one 390. Not sure why CPU load is so low though.


What drivers are you using? I have an i7 4790k at 4.4GHz and an R9 390, and I'm getting 30-35fps at 1440p with everything on High except god rays. I can't get anywhere near 60fps at 1440p, and reading your post and the reviews I feel something is wrong with my setup. I also have the same monitor as you, running 144Hz with FreeSync enabled.

I'm forced to play at 1080p currently and I'm getting 80fps+ on Ultra with the frame rate unlocked, but the game looks blurry. I don't understand it :(


edit: I fixed this problem by setting iPresentInterval=0 in the ini files and making them read-only. Runs great now, and 1440p looks so much better - such a difference from 1080p.
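
For anyone else hitting this, the tweak is a one-line ini change (assuming the usual [Display] section of Fallout4Prefs.ini under Documents\My Games\Fallout4 - the exact file can differ per install, so check where your copy keeps the setting):

    [Display]
    iPresentInterval=0

Then set the file read-only so the launcher doesn't revert it. iPresentInterval=1 (the default) keeps vsync on; 0 unlocks the frame rate.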
 
I didn't say it had a small performance hit when it was released. I'm saying the effect was done, with minimal difference from FO4's version, on much older hardware years ago, in an open-engined game.

The comparison you would have to do is to see the difference from enabling them on a midrange card of today, in Clear Sky compared to FO4. I would wager that the performance hit is nowhere near the same.

The effect doesn't need GW, and it doesn't have to carry the hit on modern cards that FO4's implementation does. That's the issue people are trying to get across.

Seems like a lot of speculation there. Does running Stalker on new hardware reduce the performance hit to less than 10%?
Do you want Fallout 4 to use the Stalker engine?
 
What drivers are you using? I have an i7 4790k at 4.4GHz and an R9 390, and I'm getting 30-35fps at 1440p with everything on High except god rays. I can't get anywhere near 60fps at 1440p, and reading your post and the reviews I feel something is wrong with my setup. I also have the same monitor as you, running 144Hz with FreeSync enabled.

I'm forced to play at 1080p currently and I'm getting 80fps+ on Ultra with the frame rate unlocked, but the game looks blurry. I don't understand it :(

Using the 15.11 drivers, my friend.
 