
Image Quality, Nvidia 5xx vs AMD 6xxx

Associate · Joined 11 Sep 2010 · Posts 436 · Location Brighton
There has been controversy over the image quality of AMD's default driver settings. Are reviews comparing these cards against each other on an equal footing?

AnandTech's review says "Finally, all tests were done with the default driver settings unless otherwise noted." Does that mean they're using a lower quality AMD setting even when they set each game in the test to maximum quality?

Here's a link to AnandTech's quality comparison image. Highlight the 3 graphics cards under the black and white image to see the picture change. Do you notice a difference in quality between the 480 image and the 6800 image?

Please don't turn this into a fanboy slagging match. If you have any good evidence of where IQ differences do or don't exist, I'd be interested to see it.

Thanks
 
So that image is supposed to tell us what exactly?

If you're trying to compare IQ settings using that, it's almost laughable.
 
So that image is supposed to tell us what exactly?

If you're trying to compare IQ settings using that, it's almost laughable.
So why do Anand use it? I posted it because someone referred to it in a thread elsewhere, as an example of image quality. I don't have any good examples of image quality between the latest cards at equal settings.
 
On the Crysis Warhead benchmarks here, why do they use only Gamer quality with Enthusiast shaders, and not Enthusiast quality with Enthusiast shaders?

Are they trying to say that there isn't any visible difference between Enthusiast and Gamer?
 
Thanks for the link, I'll check it out. I searched this forum for any thread with 'quality' in the title, but couldn't find what I was after.
 
So why do Anand use it? I posted it because someone referred to it in a thread elsewhere, as an example of image quality. I don't have any good examples of image quality between the latest cards at equal settings.

They aren't comparing image quality, they are comparing the anti-aliasing and AF?

There are no textures or anything that could be affected by a crappy LOD bias at driver level, no mipmaps?
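For anyone wondering what a driver-level LOD bias actually changes: the hardware picks a mipmap level from roughly log2 of the texel-to-pixel footprint, and the bias is added on top, so a negative bias forces a sharper (but shimmer-prone) mip level. A minimal sketch of that selection rule in Python, with made-up footprint numbers:

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
    # Base level: log2 of the texture footprint covered per screen pixel.
    level = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    # Clamp to the available mip chain [0, max_level].
    return min(max(level, 0.0), max_level)

# With no bias, a 4:1 texel footprint selects mip level 2.
print(mip_level(4.0))                  # 2.0
# A driver bias of -0.5 selects a sharper (lower) level, which can
# look crisper in screenshots but shimmer in motion.
print(mip_level(4.0, lod_bias=-0.5))   # 1.5
```

This is only the rough selection rule; real hardware blends between the two nearest levels when trilinear filtering is on.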
 
I have just got my 6970, coming from 2 x 4870s, and wow, there is a big difference in image quality :eek:
I have never had Nvidia, so sorry, I can't compare, but from AMD's older cards to the newer ones you can tell.

My Cat settings are all on the default "Balance", but I think it looks amazing. I only play first-person shooters, and when I crank the Cat settings up I find it very hard to tell the difference, so why bother :cool:
 
I have just got my 6970, coming from 2 x 4870s, and wow, there is a big difference in image quality :eek:
I have never had Nvidia, so sorry, I can't compare, but from AMD's older cards to the newer ones you can tell.
This is the point really: there can be a difference in image quality between cards, and a lot of us would say that's important. But reviews concentrate on FPS, which I assume is because the IQ between the cards they're comparing is similar. I'd just like to check.
 
The trouble is when opinion overrides facts, and I'm starting to question sources and benchmark sites, because the more I read the less comparable the results seem.

I don't think those differences can be noticed while gaming until you take a screenshot or freeze a frame to compare.
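Screenshot comparisons like that can at least be made objective: capture the same frame on both cards and compute a per-pixel difference instead of eyeballing it. A minimal sketch, assuming both captures have already been decoded into equal-sized lists of (R, G, B) tuples (the tiny 2x2 "captures" below are hypothetical data, not real benchmark output):

```python
def mean_abs_diff(pixels_a, pixels_b):
    """Average per-channel absolute difference between two images."""
    assert len(pixels_a) == len(pixels_b), "images must match in size"
    total = 0
    for (r1, g1, b1), (r2, g2, b2) in zip(pixels_a, pixels_b):
        total += abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
    return total / (3 * len(pixels_a))

# Hypothetical 2x2 captures from two cards; identical frames score 0.0.
card_a = [(10, 20, 30), (40, 50, 60), (70, 80, 90), (100, 110, 120)]
card_b = [(10, 20, 30), (40, 50, 63), (70, 80, 90), (100, 110, 120)]
print(mean_abs_diff(card_a, card_b))  # 0.25
```

A single number like this won't tell you which image looks better, only how much they differ; a difference image (per-pixel subtraction, brightened) is what review sites usually show.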
 