https://www.youtube.com/watch?v=sxExRN1TcY8
timagamer
3 weeks ago (edited)
I've been testing your tip about changing the texture setting from High to Ultra on my GTX 780 3GB with different settings, and it seems like 3072 MB is not enough for both RDR2 with Ultra textures AND Win10 plus some definitely required apps like MSI Afterburner and Razer Booster, as those consume 500+ MB of VRAM with no game launched at all. With the High texture setting (quite poor and ugly compared to Ultra, as I see it) I got about 33-52 fps at Medium-High shadow/light/LOD settings, compared to 22-37 fps with just the Ultra texture setting changed for comparison; no sudden stutters, but an overall decrease in framerate, to my own surprise. The game runs from an SSD, CPU is a 4670K at 4.4 GHz, RAM is 2x4=8GB Kingston HyperX at 2133 MHz. This 3072 MB VRAM insufficiency is well seen in the video about RDR2 running on a 1060 6GB, where 3200+ MB is used at Medium graphics. I would like to investigate this a bit.
UPDATE: Tried decreasing overall VRAM load by manually shutting down desktop apps and it worked: I got the same 34-50 fps while using the Ultra texture setting at 1680x1050 resolution. It seems like this game requires at least 2.8 GB of VRAM exclusively for itself, more likely 3 GB for resolutions like 1920x1080.
UPDATE #2: The game is unstable while using Ultra textures on the GTX 780 3GB. My VRAM optimisation has little to no effect on this issue, and the game crashes after 5-15 mins of continuous gameplay.
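The budget arithmetic behind the comment can be sketched with the figures reported above; note the 2.8 GB exclusive requirement is the commenter's own estimate at 1680x1050, not an official number:

```python
# Rough VRAM budget check using the approximate figures from the comment (all in MB).
TOTAL_VRAM = 3072    # GTX 780 3GB
BASELINE_USE = 500   # Win10 + background apps (Afterburner, Razer Booster), no game running
ULTRA_NEED = 2800    # commenter's estimate of RDR2's exclusive need at 1680x1050

headroom = TOTAL_VRAM - BASELINE_USE
print(f"free VRAM with overlays running: {headroom} MB")  # 2572 MB
print("Ultra textures fit:", headroom >= ULTRA_NEED)      # False: 2572 < 2800
```

Shutting down the desktop apps reclaims most of that ~500 MB baseline, which is why the manual cleanup in the UPDATE made Ultra textures fit at 1680x1050.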
---
just 5 years after the mighty gtx 780's release, it is unable to compete with a ps4 in terms of texture fidelity. how surprising
and the brutal truth:
"quite poor and ugly compared to Ultra, as I see it"
this is not just rdr 2, of course. gtx 770 and 780 have played with inferior, ugly textures since 2016. since 2016, ps4 and xbox one still boast rock-solid ultra textures and provide superior texture fidelity.
the same will happen to the 3070 and 3080 in relation to the ps5/xbox sx.
ac origins, nier automata, wolfenstein, rise of the tomb raider, shadow of the tomb raider: countless AAA games run on low/medium textures with a 770 and 780, while ps4/xbox one provided ultra textures for all of them with their 8 gb total ram.
imagine buying an overly expensive 770 or 780 and 3 years later ps4/xbox one runs games with higher textures. yeah, the same will happen to the 3070 / 3080, whether you like it or not.
on ps4/xbox one, devs manage to fit ultra textures into their memory budget somehow. i don't care how they do it. do it with a 770 or 780 and come back to me, then we can discuss. you can't; tweak and optimize all you want, you can't. devs and nvidia will want you to move on from them, and you will. this is the actual discussion: if nvidia and devs want you to move on from 8 gb / 10 gb vram gpus, you will have to, eventually.
medium and high textures look UGLY and WAY worse than ultra. it's a huge compromise that no other graphical setting can offset.
you can run everything else at ultra with the 3080's extra "power". it is meaningless if it has to push medium/high textures along with it.
see, the gtx 780 is 4-5 times faster than the ps4. it may even manage 1080p medium at 35-45 fps (barely). but it will do so with low/medium textures lmao. and the game looks horrible lmao.
what you do is the same thing 770 and 780 users did in 2013-2015. they boasted about their higher settings and higher fps in games. they boasted about how they experienced games better than on ps4, and how devs pushed low settings to accommodate games running on ps4.
3 years later? the 770 and 780 became OBSOLETE, while the ps4 kept rocking, with HIGHER fidelity settings thanks to its higher memory budget.
once 2016 and nextgen games started to hit the deck, all of them scurried to newer gen cards because their puny 770s and 780s were unable to cope with the nextgen high-quality textures and high memory budget that the ps4 had no problem dealing with.