
How to convince myself to go red

What? The 770 competes with and mostly beats the 7970 GHz Edition. The 7950 IS the lesser of the two cards. The 7950 competes with the 670 and 760?

Sure is. But if you care about price for performance then the 7950 is where it's at. At least it was when I sold the games I got with mine, lol. The 770 costs a lot more; one would think it had better be, well, better... :D

ashmanuk67 - I was not trying to be rude; apologies if it came across like that. It's just that for years now it has been known that IQ is the same on both. When I say confused, I am assuming you enabled some form of AA or a certain setting on the nVidia card which you never did on the AMD card, hence the perceived difference. I understand about smoothness though; that is why I never go CrossFire or even SLI, even though SLI is better.
 
What? The 770 competes with and mostly beats the 7970 GHz Edition. The 7950 IS the lesser of the two cards. The 7950 competes with the 670 and 760?

He is actually correct, but with one caveat: I would add the GTX670, GTX680 and HD 7970 into that "very little difference" bracket.

I have owned or tested 4x GTX680s, 6x HD 7970s, 4x HD 7950s and 2x GTX670s. Once you overclock to a decent level it becomes practically impossible to tell the difference in performance when there is only a ~10% delta between the cards. Considering OC potential, a GTX770 is no faster than a GTX680 or HD 7970. So at worst a similarly max overclocked HD 7950 or GTX670 will be ~10% slower.
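That ~10% delta is easy to quantify. A minimal sketch, using hypothetical normalised benchmark scores rather than real measurements:

```python
def pct_slower(faster: float, slower: float) -> float:
    """How much slower `slower` is than `faster`, as a percentage."""
    return (faster - slower) / faster * 100.0

# Hypothetical scores for max-overclocked cards (illustrative only)
scores = {"GTX770": 100.0, "GTX680": 100.0, "HD7970": 99.0, "HD7950": 91.0}
for card, score in scores.items():
    gap = pct_slower(scores["GTX770"], score)
    print(f"{card}: {gap:.1f}% behind the GTX770")
```

With numbers in that range, the "worst case" card sits around 9-10% behind, which is the sort of gap the post argues you cannot feel in play.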

I cut my losses and sold a GTX680 Lightning that could do 1350 core clock and 8GHz VRAM because a much cheaper HD 7950 at 1230/1750 was less than 10% slower, or in some cases trading blows with it.

Turn off the FPS counter and it becomes almost impossible to tell the difference between 50 and 54 FPS.
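The 50 vs 54 FPS point can be made concrete in frame times: the gap works out to under 1.5 ms per frame. A quick sketch of the arithmetic (illustrative numbers, nothing measured):

```python
def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

low, high = 50, 54
print(f"{low} FPS -> {frame_time_ms(low):.2f} ms/frame")    # 20.00 ms
print(f"{high} FPS -> {frame_time_ms(high):.2f} ms/frame")  # 18.52 ms
print(f"difference: {frame_time_ms(low) - frame_time_ms(high):.2f} ms per frame")
```

A difference of about 1.5 ms per frame is well below what most people can perceive without a counter on screen.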
 
The whole Nvidia/AMD IQ thing appears to be down to colour saturation (or so I recall reading). Having used both sides' cards, that seems to resonate.
I also remember reading (separately) that there is a difference (acclimatisation/cultural, however you wish to describe it, as opposed to anything intrinsic) between Asian and Western preferences towards colour saturation, with us preferring, or being used to, more 'vibrant' colour. How true or even relevant that is I couldn't say, but it is interesting to note.
 
Turn off the FPS counter and it becomes almost impossible to tell the difference between 50 and 54 FPS.

Indeed. But people become obsessed. I never, ever run FPS counters unless I notice a problem, and even then I am loath to because I know it will put me off.

I mean, that's the sole reason to buy a high-end GPU or GPUs: so that you don't have to count frames.
 
Trying to stay neutral here, but it is rather annoying when people pretend not to be a fanboy of one brand but then come out with some nonsense.

Anyway, to put my slant on it: when I'm not gaming I crunch. So whether it be mining coins or folding proteins, AMD wins here, and this is currently where my money would swing.

Interested to see what AMD release to counter the Titan/770 cards. Me - I am a value user (I don't need top resolution and silly FPS), so I always buy based on bang for buck.
 
Going from a 5870 to a 580, I noticed no difference in colours, image crispness, etc.

I think people need to read up on the placebo effect.

It's not actually better colour; at the absolute most it's slightly different colour settings.

Think of the colour saturation slider sitting a bit further to the right or left on one or the other.
 
It's not actually better colour; at the absolute most it's slightly different colour settings.

Think of the colour saturation slider sitting a bit further to the right or left on one or the other.

:confused::confused: How do you know this is not due to your monitor... or the current drivers... or maybe you got too much sun today? In any case, colour differences can be adjusted.
 
If you sit down an AMD card and an nVidia card side by side on the same monitor and swap between sources - which I've done more than once - the AMD card does look more vivid for whatever reason.

In terms of image quality and detail you'd probably see "new" but different stuff whichever way you swapped - it's fairly subtle. To use a scene from Dishonored as an example: a selection of barrels against a wall on nVidia all look very similar, if not the same colour, at a quick glance, but small details like the seams and patterns in the wood stand out a little more. On an AMD card you can see that individual barrels actually have a slightly more distinct colour - one slightly yellower, another slightly redder - but the smaller details like the seams are less distinct at a casual glance, as the edges are a little less crisp than on nVidia. TBH I'm not really sure of the reasons why, and it is very subtle; you'd only really notice comparing side by side, or if you've just come directly from one card to the other.
 
:confused::confused: How do you know this is not due to your monitor... or the current drivers... or maybe you got too much sun today? In any case, colour differences can be adjusted.

This is what I was saying... :confused:

If you sit down an AMD card and an nVidia card side by side on the same monitor and swap between sources - which I've done more than once - the AMD card does look more vivid for whatever reason.

In terms of image quality and detail you'd probably see "new" but different stuff whichever way you swapped - it's fairly subtle. To use a scene from Dishonored as an example: a selection of barrels against a wall on nVidia all look very similar, if not the same colour, at a quick glance, but small details like the seams and patterns in the wood stand out a little more. On an AMD card you can see that individual barrels actually have a slightly more distinct colour - one slightly yellower, another slightly redder - but the smaller details like the seams are less distinct at a casual glance, as the edges are a little less crisp than on nVidia. TBH I'm not really sure of the reasons why, and it is very subtle; you'd only really notice comparing side by side, or if you've just come directly from one card to the other.

Yeah, no. This is some serious pseudo-analysis.
 
OK, yes, the guy may be mistaking the new card for having better image quality, but how anyone can argue "there's absolutely no difference in image quality, apart from colours possibly looking slightly different" is beyond me.
The sentence itself is a contradiction: the image is either the same or it's different; it cannot be both at the same time.:rolleyes:

It's simple: there's a difference between image quality and colour saturation...
 
I think some of it might be in the nature of the filtering, i.e. AF/generated mipmaps - although AMD always seems to look a little more vivid overall, even on the desktop. But I wonder if the in-game difference is due to the texture filtering on AMD biasing towards preserving colour detail, whereas nVidia biases towards sharpness, or something like that (i.e. something along the lines of chroma vs. luma).
 
If you sit down an AMD card and an nVidia card side by side on the same monitor and swap between sources - which I've done more than once - the AMD card does look more vivid for whatever reason.

In terms of image quality and detail you'd probably see "new" but different stuff whichever way you swapped - it's fairly subtle. To use a scene from Dishonored as an example: a selection of barrels against a wall on nVidia all look very similar, if not the same colour, at a quick glance, but small details like the seams and patterns in the wood stand out a little more. On an AMD card you can see that individual barrels actually have a slightly more distinct colour - one slightly yellower, another slightly redder - but the smaller details like the seams are less distinct at a casual glance, as the edges are a little less crisp than on nVidia. TBH I'm not really sure of the reasons why, and it is very subtle; you'd only really notice comparing side by side, or if you've just come directly from one card to the other.

Not often I agree with the mighty Rroff, but +1. That is basically what I was trying to say earlier, as I've done something similar before.
 