Out of interest what is that?
It sounds like something you need to mod or do via a console command, not a menu setting.
I think it's more that people expect a £400 or £500 graphics card to be able to run something like that on its highest settings a lot better than it does.
How much better does Fallout 4 really look on PC over console?
We are told that the PCs we spend three or four times more on are that many times more powerful than consoles, yet we get a negligible visual improvement that puts performance in the gutter.
I look at games like Alien Isolation and see these effects running while my fps is sitting close to triple figures.
http://i1-news.softpedia-static.com...quel-Ideas-Won-t-Focus-on-Action-467332-4.jpg
http://www.psmania.net/wp-content/uploads/2015/02/15103768993_64e1e8dae7_k.jpg
Even modded to max it out, the hit is smaller than what AAA devs have been giving us lately.
My understanding is that uGrids is a config setting you need to edit an ini for, or possibly change via the console. It basically decides how far around you the world gets rendered in full detail. At 5 you get things rendered fully if they're close, but anything on the other side of the water would be at reduced detail. Turn it up to 7 and it will fully render things in the water, 11 would render buildings sat just on the other side of the water, and 13 would do things a fair distance beyond that. There was actually a good example of this in Nvidia's Tweak Guide.
In fairness, Nvidia's Tweak Guide made quite clear the performance hit you'd get from these effects. Not sure why they took it down.
I agree that it's silly including effects that, even on Nvidia's own graphs, drop fps to an almost unplayable level (depending on the person, I guess). Maybe if the game supported multi-GPU setups there'd be a reason to have them, to give those people a little something to do with all the extra power. Still, I think the lack of multi-GPU support is a Bethesda issue.
I remember the first Rome: Total War had a unit size setting (I think) that determined how many soldiers were rendered on screen in battle mode, and it let you go higher than the GPUs of the time could manage; they said it was to future-proof for new hardware (I think it was in a tool-tip for the setting). I thought at the time it was a bit silly and ambitious, planning that far ahead. Of course, seeing how Creative Assembly's Total War games have gone on since, it now seems like that could've been an excuse!
Remember that the console version runs at 1080p/30fps with common dips into the 20s, and on the Xbox there are occasional stutters where the fps hits 0.