It really isn't. There are basically no games whose ultra setting uses more memory and actually offers higher IQ. XCOM 2 and Shadow of Mordor both have identical image quality whether you use ultra textures or the next setting down. Ultra is just uncompressed textures, nothing more or less, for the express purpose of using more memory, NOT improving IQ.
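To put rough numbers on why an uncompressed texture pack balloons memory use, here's a back-of-envelope sketch. The texture size and compression rates are my own illustrative assumptions (standard RGBA8 vs BC1/BC7 block compression), not figures from any specific game's assets:

```python
# Rough VRAM math for a single texture (illustrative numbers only).
# Uncompressed RGBA8 stores 4 bytes per texel; BC7 block compression
# stores 1 byte per texel and BC1 stores 0.5 bytes per texel.

def texture_mb(width, height, bytes_per_texel, mipmaps=True):
    texels = width * height
    if mipmaps:
        texels = texels * 4 / 3  # a full mip chain adds roughly one third
    return texels * bytes_per_texel / (1024 ** 2)

size = 4096  # hypothetical 4K texture
print(f"Uncompressed RGBA8: {texture_mb(size, size, 4.0):.0f} MB")  # ~85 MB
print(f"BC7 compressed:     {texture_mb(size, size, 1.0):.0f} MB")  # ~21 MB
print(f"BC1 compressed:     {texture_mb(size, size, 0.5):.0f} MB")  # ~11 MB
```

So skipping compression inflates each texture 4-8x, which fills a 4GB card fast without the GPU sampling any extra detail.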
For the most part the reason to use two cards is to increase framerate, not IQ. A single 290X can get 40-50fps in The Division at 1440p at what I consider the highest IQ (so DoF and other crap disabled, because they make the image look worse, not better). If and when my second card starts working properly I will use the same settings (what I consider to be max) and aim for 100fps; I won't randomly enable 8xSSAA, because it would make little difference.
The same goes for Shadow of Mordor: I go for a higher framerate at maximum IQ settings, which means not enabling ultra textures. Even with 8GB cards I wouldn't enable them. Using more memory for no reason is pointless, and it also means transferring more data, which likely reduces performance slightly for no benefit.
The vast majority of users want increased performance at a higher resolution rather than increased IQ, because for most people the choice is between 40-60fps at max IQ on one card, or 80-120fps at max IQ (or a higher resolution) with extra cards.
There are no games I've seen in which 2x 8GB 390s beat 4GB 290Xs or Fury Xs unless you enable settings that don't increase IQ. There will be in the future, but I haven't seen one yet.
Hell, XCOM 2 looks like crap for the performance you get regardless of the settings you choose, so the fact that such a basic-looking game can use that much memory... well, let's just say two Nvidia GameWorks games use more than 4GB of memory for no IQ benefit, and one of them looks terrible for that extra memory usage. That should give you an idea of why the uncompressed textures were added... at a time when AMD's top-tier cards had 4GB of memory and Nvidia's had 6 or 12GB.

Do you understand that you can save screenshots and compare them without even using the graphics card, or owning the game at all?
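For anyone who wants to try it, here's a minimal sketch using Pillow that diffs two saved screenshots. The filenames are placeholders; any two lossless captures of the same frame at different texture settings will do:

```python
from PIL import Image, ImageChops

# Hypothetical filenames -- substitute your own lossless (PNG) screenshots,
# captured from the same spot at the two texture settings.
a = Image.open("high_textures.png").convert("RGB")
b = Image.open("ultra_textures.png").convert("RGB")

diff = ImageChops.difference(a, b)
bbox = diff.getbbox()  # None if the images are pixel-identical
if bbox is None:
    print("Identical images: the ultra setting changed nothing visible.")
else:
    extrema = diff.getextrema()  # per-channel (min, max) differences
    print(f"Differences in region {bbox}, channel extrema {extrema}")
    diff.save("difference_map.png")  # brighter pixels = bigger differences
```

If the diff comes back empty, or the extrema are a handful of values out of 255, the "ultra" textures changed nothing you could ever see in motion.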