So the current situation with 4K is:
Increasing AA at high resolutions is pretty much an exercise in massively increasing required VRAM and reducing performance just to get fewer "jaggies" at 4K. I don't notice them, but OK.
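To make that VRAM cost concrete, here's a rough back-of-the-envelope sketch. The byte counts are my own simplified assumptions (a 4-byte RGBA8 color sample plus a 4-byte depth/stencil sample per MSAA sample), not any particular engine's numbers, and real drivers add overhead and compression on top:

```python
# Rough estimate of MSAA framebuffer VRAM at 4K.
# Assumptions (simplified, hypothetical): 4-byte color sample + 4-byte
# depth/stencil sample per MSAA sample; ignores driver overhead,
# texture memory, and framebuffer compression.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_SAMPLE = 4 + 4  # RGBA8 color + D24S8 depth/stencil

def framebuffer_mib(msaa_samples: int) -> float:
    """Approximate framebuffer size in MiB for a given MSAA level."""
    total_bytes = WIDTH * HEIGHT * msaa_samples * BYTES_PER_SAMPLE
    return total_bytes / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA: ~{framebuffer_mib(samples):.0f} MiB")
```

Even under these simplified assumptions, going from no MSAA to 8x multiplies the framebuffer footprint eightfold, which is a big slice of a 4GB card before textures even enter the picture.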
As far as what gives the best experience goes, the order is pretty much:
Running native resolution
Reasonable framerate/input lag
High-ish graphics setting apart from AA
High framerate monitor/framerate/low input lag/gsync/freesync
Some AA
Maxing most settings
Maxing every setting apart from AA
Maxing AA.
It's not until you get to the bottom couple of rungs that 4GB of VRAM starts to get hit at 4K... and by the time it is getting hit, every more important rung above it is already being compromised massively.
So, in answer to the question "is 4GB of VRAM enough?": yes, it is, unless you want a really horrible gaming experience caused not by the lack of VRAM but by your own stubbornness in maxing out every setting.
But since that minimap does not get any quality enhancements either way, I still stand by my point.
