I might be able to get hold of one of the following two configurations for a comparison based on frametime analysis, hopefully showing that a VRAM shortage can increase the probability of lag spikes / stuttering. In many cases, the traditional benchmark metrics of "average fps" and an ad hoc "min fps" fail to reveal such behaviour. The Tech Report used this kind of methodology to investigate micro-stuttering. (Don't confuse the old-school stuttering I'm talking about here with the micro-stuttering caused by AFR; those are two different things.)
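For anyone curious what I mean by frametime analysis, here's a minimal sketch. It assumes a FRAPS-style "frametimes" CSV (cumulative timestamps in milliseconds, one row per frame with a header line); the file path and the 50 ms spike cutoff are just placeholder assumptions, not part of any particular tool.

```python
import csv
import statistics
import sys

def load_timestamps(path):
    """Read cumulative frame timestamps (ms) from a FRAPS-style CSV.

    Assumed format: a header row, then rows like "Frame, Time (ms)".
    """
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            stamps.append(float(row[1]))
    return stamps

def analyze(stamps, spike_ms=50.0):
    # Per-frame render times are the deltas between consecutive timestamps.
    frametimes = [b - a for a, b in zip(stamps, stamps[1:])]
    avg_fps = 1000.0 / statistics.mean(frametimes)
    p99 = sorted(frametimes)[int(0.99 * (len(frametimes) - 1))]
    # 33.3 ms per frame is a momentary 30 fps; 16.7 ms is 60 fps.
    spikes = [t for t in frametimes if t > spike_ms]
    print(f"frames:          {len(frametimes)}")
    print(f"average fps:     {avg_fps:.1f}")
    print(f"99th pct time:   {p99:.1f} ms")
    print(f"frames > {spike_ms:.0f} ms:  {len(spikes)} "
          f"({100.0 * len(spikes) / len(frametimes):.2f}%)")

if __name__ == "__main__":
    analyze(load_timestamps(sys.argv[1]))
```

The point is that two cards can post the same average fps while one spends far more frames past the spike cutoff; the 99th-percentile frametime and the spike count catch exactly the stutter that "min fps" misses.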
1) GTX 560 Ti 1GB x 2 SLI vs GTX 560 Ti 2GB x 2 SLI
2) HD 6950 1GB x 2 CF vs HD 6950 2GB x 2 CF
Which comparison would you guys prefer, if I can manage to get one?
Also, which games would you guys prefer? Would BF3 (final release, not beta) be a good, persuasive option?
The tests would be done at 1920 x 1200 (or 1920 x 1080) resolution. Would the defenders of 1GB want me to test with 4xAA instead of 8xAA at native resolution? (i.e. does 8xAA actually improve IQ over 4xAA?)
What framerate would you guys regard as playable? 30 fps or 60 fps?