In a year's time, would you like to buy a 4GB card? I may even give you a special discount off one of mine, but I suspect in a year's time you will have an 8GB card, lol.
As to games that use more than 4GB, I have posted the GPU-Z screenshots on a number of occasions for various games; perhaps if you called your Watch Dogs off you might spot them. People could be committing Grand Theft Auto right in front of you and you would not know what is going on.
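For anyone who wants the numbers without screenshots, here is a rough sketch that polls VRAM usage from the command line instead of GPU-Z. It assumes an NVIDIA card with nvidia-smi on the PATH (AMD users would need a different tool), and the one-second interval is arbitrary:

```python
import subprocess
import time

# Poll dedicated VRAM usage once per second while a game is running.
# Assumes an NVIDIA GPU with nvidia-smi available on the PATH.
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line = first GPU; values are plain integers in MiB.
    used, total = (int(v) for v in out.splitlines()[0].split(","))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(1)
```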
In a year's time, would you like to buy a 4GB card? I may even give you a special discount off one of mine, but I suspect in a year's time you will have an 8GB card, lol.
That's a substantial shift away from arguing that it matters today, however.
I've seen this from you on previous occasions, and the response is the same: just because memory is filled doesn't mean it is needed. Caches (whether disk, GPU memory, or browser) aren't normally cleared proactively; doing so would harm performance. Parallel example: there are things in my browser cache from websites I haven't visited in months, yet they still count towards its size.
Only measurable differences in output between cards with different memory sizes (beyond the margin of error) would show that 8GB or 12GB provides a benefit. A screenshot showing filled VRAM does not.
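To make the cache point concrete, here is a toy sketch (plain Python, nothing GPU-specific; the texture names are made up) of the behaviour described above: entries stay resident until capacity pressure forces them out, so the cache reads as "full" even when almost none of it is still needed:

```python
from collections import OrderedDict

class LazyCache:
    """Toy LRU cache: evicts only when full, never proactively."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        elif len(self.entries) >= self.capacity:
            self.entries.popitem(last=False)  # evict least recently used
        self.entries[key] = value

    def get(self, key):
        self.entries.move_to_end(key)
        return self.entries[key]

cache = LazyCache(capacity=4)
for texture in ["rock", "grass", "sky", "road"]:
    cache.put(texture, f"<{texture} data>")

# Only "road" is still being used, yet the cache stays 100% full:
cache.get("road")
print(len(cache.entries), "of", cache.capacity, "slots occupied")  # 4 of 4
```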
Important news flash shocker: in future, more VRAM will be required. You keep moving the goalposts here, as nobody is disputing that more VRAM will be required in the future. The fact that quad-SLI or CrossFire has the grunt to push modern games beyond 4GB with unrealistic settings at 4K does not prove your point. There have always been VRAM problems as you push graphical settings to extremes.
Here is a quote from you showing the exact opposite stance on this argument, from a few years back, when another poster argued that 2GB is not enough for 1440p.
https://forums.overclockers.co.uk/showpost.php?p=24554987&postcount=21
Here is another where you claim the developers of Watch Dogs are having a laugh because they state that 3GB of VRAM is not enough.
https://forums.overclockers.co.uk/showpost.php?p=26358738&postcount=48
Yet you now claim that some devs are bad, so we need as much VRAM as possible to counter this, and that more VRAM is better.
My stance has always been the more VRAM the better, especially if you plan on multi-GPU; but the fact remains that "currently" 4GB at 4K is fine, and for 1080p it is plenty with a single GPU.
My own stance on 2GB VRAM limits is linked below. Ironically, I have also contradicted myself somewhat here, and it seems we have swapped roles slightly since 2013.
https://forums.overclockers.co.uk/showpost.php?p=26364789&postcount=78
Try running XCOM 2 on a single GTX 980 Ti or Fury X maxed out at 2160p; it is not a nice experience, as you run out of memory. Using a Titan X, on the other hand, is quite playable.
Doesn't the fact that 4GB is not enough to run ROTTR (Rise of the Tomb Raider) with Very High textures at 1080p, let alone 4K, highlight the trend?
I myself am so looking forward to some people's opinions and how they might change depending on the amount of memory on the new AMD cards.
Personally, I don't understand how people can argue about max settings. If a dial goes up to 11, then 10 isn't the max. If a car will do 200 mph max, then 195 isn't flat out.
The point being: if a game has a setting above the previous setting, then the previous one isn't maxed. It doesn't matter if the difference is visible to one person or even 100 people; if a single individual can tell the difference, then there is a difference, simple as that.
As for 4GB at 4K: just about enough for 'most' things, but it won't be going forward.
Max settings != max IQ
I may have missed it, but I don't think anyone is arguing that max isn't max. It's that turning on an option which degrades IQ, yet still costs performance, is pointless.
Chromatic aberration (the clue is in the name)
Depth of Field (deliberately rendering out of focus)
Lens Flare
Vignetting (tunnel vision)
None of these settings reflects how the human eye works in real life; they are image-degradation effects only ever seen through photo/camera lenses. You are in effect deliberately making your game look worse, and in every case at a massive performance hit. IMHO, of course.
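As a quick illustration of just how literal the degradation is, here is a sketch that fakes chromatic aberration offline by misaligning the colour channels of a screenshot (Python with numpy/Pillow; frame.png is a placeholder for any screenshot, and the 4-pixel shift is deliberately exaggerated; real games do this per frame in a shader):

```python
import numpy as np
from PIL import Image

# Load any screenshot; "frame.png" is just a placeholder name.
img = np.asarray(Image.open("frame.png").convert("RGB")).copy()

shift = 4  # pixels; exaggerated so the fringing is obvious
out = img.copy()
out[:, shift:, 0] = img[:, :-shift, 0]   # push the red channel right
out[:, :-shift, 2] = img[:, shift:, 2]   # push the blue channel left

# The result has the classic colour fringing at every edge.
Image.fromarray(out).save("frame_ca.png")
```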
I myself am so looking forward to some people's opinions and how they might change depending on the amount of memory on the new AMD cards.
Personally, I don't understand how people can argue about max settings. If a dial goes up to 11, then 10 isn't the max. If a car will do 200 mph max, then 195 isn't flat out.
The point being: if a game has a setting above the previous setting, then the previous one isn't maxed. It doesn't matter if the difference is visible to one person or even 100 people; if a single individual can tell the difference, then there is a difference, simple as that.
As for 4GB at 4K: just about enough for 'most' things, but it won't be going forward.
These days, rendering static and rigid geometry in enough detail that its form is indistinguishable from real objects is becoming fairly standard.
Lighting and shading are also getting close to photoreal.
The next step is getting the look and texture of surfaces to appear real. Aside from lighting and shading, there is the use of multiple texture layers for bump mapping, specular reflection, occlusion, dirt mapping, etc., and, more importantly for the GPU buffer, the resolution of those textures: over the years we have gone from 0.5K to 1K, 2K, 4K and 8K textures that get ever sharper, more detailed and bigger.
They all need to be stored in memory; the bigger they are, the more memory you need.
Game devs are pushing 4K textures as a matter of course these days, so not only are textures demanding more buffer, they also come in multiples for layering on the same surface.
A 4GB buffer is pretty minimal for 1440p or even 1080p in the very near future, never mind 4K.
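Some back-of-the-envelope numbers to illustrate (uncompressed RGBA8; real engines use block compression, which shrinks this several-fold, but the scaling with resolution is the same):

```python
def texture_mib(size, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM for one square RGBA8 texture, in MiB.
    The mip chain adds roughly one third on top of the base level."""
    base = size * size * bytes_per_texel
    if mipmaps:
        base = base * 4 // 3
    return base / 2**20

for size in (512, 1024, 2048, 4096, 8192):
    print(f"{size:>5} px: {texture_mib(size):8.1f} MiB")

# An 8K texture comes to roughly 341 MiB uncompressed, so a handful
# of layers on one surface (albedo + normal + specular + occlusion)
# eats into a 4GB buffer very quickly.
```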
I'm confused: does this mean neither Polaris 10 nor Polaris 11 will replace Fiji in terms of performance?