
AMD Vs. Nvidia Image Quality - Old man yells at cloud

Permabanned
Joined
2 Sep 2017
Posts
10,490
Going from Nvidia to AMD I noticed it straight away. Colours seem stronger on AMD and textures a bit sharper. I've switched between Nvidia and AMD cards a number of times and the difference is there. You can tweak GeForce settings, but you can't quite get it to match; if you try to add vibrance for stronger colours you get banding, which doesn't appear on AMD.

Why hasn't Nvidia fixed these 2D/3D image quality defects after so many years? Is it simply a corporate strategy to offer worse image quality, or just a lack of know-how/patents?
 
Soldato
Joined
22 Nov 2006
Posts
23,375
Why hasn't Nvidia fixed these 2D/3D image quality defects after so many years? Is it simply a corporate strategy to offer worse image quality, or just a lack of know-how/patents?

Probably isn't a defect and is intentional. A lower IQ means better performance.

I'd like to see if it's the same on Quadro cards, I suspect it isn't.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Probably isn't a defect and is intentional. A lower IQ means better performance.

I'd like to see if it's the same on Quadro cards, I suspect it isn't.

But it's quite strange if they're chasing performance even in desktop 2D mode?
They cheap out on some components on the circuit boards, hence the lower image quality in 2D... That's been a problem since the year 2000, and perhaps before.
 
Associate
Joined
22 Jun 2018
Posts
1,582
Location
Doon the watah ... Scotland
I suspect that parts of the Nvidia pipeline compress data to improve performance, and that in these sorts of cases the compression is, however slightly, affecting the edge sharpness of text. The compression is good enough unless you look hard.

Edit:

This has piqued my curiosity. Reading up on how Windows renders its desktop is quite interesting. It appears that each program writes its display data to a buffer in GPU memory, and then the Windows compositor uses the GPU to take all those buffers and render them into a final image (i.e. deciding what's on top, transparencies, etc.).

If I understand it correctly, the desktop is considered a Direct3D surface, onto which the final image is placed like a texture. It all ends up as a D3D surface, irrespective of whether it's 2D content like text or not.

All the calls to create the image buffers go through APIs... so whilst a program like MS Word will consistently request a piece of text to be displayed via a set API each time, it's down to the GPU and its driver to interpret that, come up with the final image and render it out to a D3D surface. So it's entirely plausible that nVidia and AMD will produce slightly different results.

It's well documented that you can compress textures to improve performance on GPUs. If the 2D desktop runs through the 3D pipeline regardless, that adds to the notion that, if nVidia use compression in their pipeline to improve performance, it may well be occurring for 2D as well, leading to the softness people comment on.

And AMD may be using a different method (or less compression), leading to a crisper image.
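If anyone wants to sanity-check the composition part themselves, here's a minimal sketch of one way to do it (Python, Windows only, calling the standard dwmapi function DwmIsCompositionEnabled via ctypes; nothing nVidia- or AMD-specific). It just confirms the desktop is going through the DWM compositor, i.e. the 3D pipeline, rather than being drawn as plain 2D. On Windows 8 and later it should always report True:

```python
# Minimal sketch (Windows only): ask the DWM whether desktop composition
# is enabled, i.e. whether the desktop is being composited on the GPU.
import ctypes

def dwm_composition_enabled() -> bool:
    enabled = ctypes.c_int(0)
    # HRESULT DwmIsCompositionEnabled(BOOL *pfEnabled)
    hr = ctypes.windll.dwmapi.DwmIsCompositionEnabled(ctypes.byref(enabled))
    if hr != 0:  # a non-zero HRESULT means the call failed
        raise OSError(f"DwmIsCompositionEnabled failed, HRESULT=0x{hr & 0xFFFFFFFF:08X}")
    return bool(enabled.value)

if __name__ == "__main__":
    print("DWM composition enabled:", dwm_composition_enabled())
```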
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
I suspect that parts of the Nvidia pipeline compress data to improve performance, and that in these sorts of cases the compression is, however slightly, affecting the edge sharpness of text. The compression is good enough unless you look hard.

I don't think it is the driver. It is the hardware on the PCB.

The poor quality is due to the RFI filters installed on Geforce video cards. Some cards are worse than others.

There is a modification that can be done to improve the quality. It involves cutting out 3, 6, or 9 capacitors (depends on your video card) and bypassing (or cutting and retracing) 3 or 6 inductors.

You can also bypass the whole filter circuit by soldering 3 wires. There is one filter for each of the RGB signals.

The first method is easier because it can be done without soldering. You pop off the capacitors and use conductive paint to bypass the inductors (actually you just carefully paint the top of the inductors). If you want, you can also cut out the inductors, but this leaves the circuit open and you have to close the connection (you can still use conductive paint). You can even skip the painting step, since cutting the capacitors provides about 80% of the improvement and doing just this much makes for a zero-cost modification.

The second method involves soldering, but it is reversible and probably less risky depending on your soldering skills. My skills aren't very good, so I just clipped off the capacitors and painted the inductors on a Geforce256, a Geforce2 GTS-V, and also an old ATI All-in-Wonder.

Be warned. There is a risk. Removing the capacitors could damage the underlying traces. If this were to happen, the only fix would be moving on to the second method and soldering in a little bypass.

Hmmm... This seems an interesting thread. And actually, this is something that has been 'bothering' me for a while, too. At home we use (okay, I know, it is outdated, but still...) a Diamond Viper V770 (TNT2 Ultra based). It gives quite sharp images on our Iiyama Vision Master Pro 450 (19") at all resolutions up to 1280x1024. Only at the resolution we use it at, 1600x1200, does it get a little blurry. Before the Diamond we had a no-brand TNT2 Ultra card, and there the quality was even worse. Although it wasn't really bothering me, I wondered...
Nowadays everybody is talking about the tremendous speeds of the most recent video cards, and sometimes even results of 'image quality' benchmarks are shown. And that, I think, is weird. First of all, your monitor is a very crucial thing, and secondly, there is the video card's RAMDAC and accompanying circuitry (what you are talking about), which determine a lot.
https://forums.tomshardware.com/threads/geforce-image-quality.876820/

RAMDAC, RFI filters, capacitors, inductors.......
 
Associate
Joined
26 Jun 2015
Posts
669
Every time I've gone between the two vendors, AMD has always given me better image quality.

I know some people want to play at very high frame rates, and that's fine, but I prefer being able to tweak in-game settings to achieve this, as opposed to the hardware churning out IQ that I have no control over.

As long as the GPU maintains my monitor's refresh rate (75 Hz right now), IQ takes priority, which is why I won't touch Nvidia.

People need to stop treating GPUs like CPUs: an image gets processed through a pipeline, so the hardware does in fact influence the quality of the end result (for gaming, that is).

I've seen some reviewers who benched side by side comment on this before, but no one has ever delved into a full review on the matter; it would be good if they did.
 
Associate
Joined
22 Jun 2018
Posts
1,582
Location
Doon the watah ... Scotland
4k...

That makes sense for analogue, but HDMI or DisplayPort is digital, so what the monitor displays is what the GPU has sent (assuming no data loss).

So if all hardware is the same other than GPU, if 2 GPUs display the same information differently, that means they have rendered it differently before sending down the display cable.

So it does seem to me that Nvidia and AMD do things differently when rendering the various API calls.
 
Soldato
Joined
8 Dec 2005
Posts
10,542
AMD still gives a much better IQ, as Nvidia choose to fudge certain compression values to give the impression of more FPS at the same price point... but Nvidia still have more stable drivers with older releases on earlier DirectX versions, so it's always a compromise between the two.
 
Associate
Joined
22 Jun 2018
Posts
1,582
Location
Doon the watah ... Scotland
Yup.... It should be something that's discussed more. FPS is not the only metric.

Then again, how would you quantify the clarity such that you could compare it measurably, and not just go on the personal opinion that it 'feels' like it looks better?

I do remember years ago that game reviews would pixel-peep different cards from ATI and Nvidia for the likes of Quake etc., as there were observable differences in the final image.
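On the quantification point: one rough sketch (assuming you can grab a lossless screenshot of the exact same scene on both cards, with identical monitor, resolution and driver settings; the file names below are just placeholders) is a straight per-pixel comparison such as PSNR, which at least turns "feels sharper" into a number:

```python
# Rough per-pixel comparison of two lossless captures of the same frame.
# Assumes numpy and Pillow are installed; the PNG file names are placeholders.
import numpy as np
from PIL import Image

def load_rgb(path: str) -> np.ndarray:
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # bit-identical captures
    return 20 * np.log10(255.0 / np.sqrt(mse))

amd = load_rgb("amd_capture.png")     # hypothetical capture from the AMD card
nv = load_rgb("nvidia_capture.png")   # hypothetical capture from the Nvidia card

print(f"PSNR between captures: {psnr(amd, nv):.2f} dB")
print("Largest per-channel difference:", int(np.abs(amd - nv).max()))
```

PSNR is crude (it only tells you how different the two captures are, not which one is "right"), so you'd still want a known reference image, but it gets past pure opinion.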
 

V F
Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
YCbCr 444 is supposedly better for movies since they use that format afaik, but games should look best with full RGB.

No they don't. Movies use, or rather your players use, 4:2:2. It's really mastered in 4:2:0, from what I've read.

The PC space uses 4:4:4.
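To show why that matters for PC content, here's a small sketch (plain numpy, BT.601 full-range maths, and a deliberately nasty hypothetical test pattern): it simulates 4:2:0 by averaging each 2x2 block of the two chroma planes and expanding them back, which is roughly what subsampling does to single-pixel coloured detail such as red text on black:

```python
# Toy 4:2:0 simulation: keep luma at full resolution, average chroma 2x2.
# BT.601 full-range coefficients; assumes even image dimensions.
import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycbcr):
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1] - 128, ycbcr[..., 2] - 128
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255)

def simulate_420(rgb):
    ycc = rgb_to_ycbcr(rgb.astype(np.float64))
    h, w = ycc.shape[:2]
    for c in (1, 2):  # subsample only the two chroma planes
        plane = ycc[:, :, c]
        small = plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        ycc[:, :, c] = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)
    return ycbcr_to_rgb(ycc)

# Hypothetical worst case: one-pixel red details on black, where the
# detail lives almost entirely in the chroma channels.
img = np.zeros((8, 8, 3))
img[::2, ::2] = [255, 0, 0]

out = simulate_420(img)
print("Max error after 4:2:0 round trip:", int(np.abs(out - img).max()))
```

That lost detail is exactly the fringing you see on fine coloured text when a display link drops to 4:2:2 or 4:2:0, which is why 4:4:4 / full RGB is the sensible choice for desktop use.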
 
Man of Honour
Joined
13 Oct 2006
Posts
91,141
AMD still gives a much better IQ, as Nvidia choose to fudge certain compression values to give the impression of more FPS at the same price point... but Nvidia still have more stable drivers with older releases on earlier DirectX versions, so it's always a compromise between the two.

If it is a "much" different I pretty much guarantee it is end user configuration error or misunderstanding and not comparing like for like settings. I've compared AMD and nVidia systems in the past side by side on the same model of monitor and any differences are close to imperceptible - usually just nudge digital vibrance on nVidia by like 2-3% and you can't tell which is which.

There has been the odd driver bug and IIRC a couple of browser issues recently where fonts weren't being rendered properly on nVidia but AFAIK those are currently fixed in the latest versions.
 
Soldato
Joined
6 Feb 2019
Posts
17,588
I feel like if this was an issue or as big of a difference as some here suggest, the internet would have been all over it. Gamers Nexus would have been the first to **** on Nvidia if it were true.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,141
I feel like if this was an issue or as big of a difference as some here suggest, the internet would have been all over it. Gamers Nexus would have been the first to **** on Nvidia if it were true.

People like Hardware Unboxed have done tests on it in the past and found largely no difference and some minimal edge cases - usually in AMD's favour but only slightly.

Nine times out of ten, the stuff posted on Reddit etc. isn't comparing like for like, sometimes due to misunderstanding what the settings actually do.
 
Soldato
Joined
7 Feb 2015
Posts
2,864
Location
South West
I have a friend with multiple generations of AMD and Nvidia cards; even doing a double-blind test on the same screen you can tell the difference. And this is over DisplayPort, so it's nothing to do with the default colour range on the HDMI output.
 
Permabanned
Joined
28 Nov 2006
Posts
5,750
Location
N Ireland
No offence to anyone, but I don't buy it and have never seen proof either. A lot of people who swap, as Rroff said, do not understand Nvidia settings. I use HQ mode clamped with DSR and it looks stunning at 4K.

First of all, the two sample optimizations can only be disabled in High Quality mode. And to do this you need to use specific profiles; I don't think a global profile cuts it.

Anyway, in HQ mode (up from Quality, which is the highest Nvidia actually let the slider sit at on driver install), you need to use clamp mode with 16x AF, I'm afraid. And there are two settings for the sample/mipmap optimizations which, if you read them, state OFF for best image quality. I did a post on this; I know all the settings, and that is the highest IQ mode Nvidia have. Sometimes it even reverts the optimization setting, so I found it best, once the settings are done, to save, reboot and double-check they are still there; after that it sticks when you load the game. It could be a bug.
 
Soldato
Joined
22 Nov 2006
Posts
23,375
If it is a "much" different I pretty much guarantee it is end user configuration error or misunderstanding and not comparing like for like settings. I've compared AMD and nVidia systems in the past side by side on the same model of monitor and any differences are close to imperceptible - usually just nudge digital vibrance on nVidia by like 2-3% and you can't tell which is which.

There has been the odd driver bug and IIRC a couple of browser issues recently where fonts weren't being rendered properly on nVidia but AFAIK those are currently fixed in the latest versions.

But as I said earlier, if you increase vibrance you get colour banding, which doesn't appear on AMD cards. So it's more than just that.
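A toy illustration of that banding point (assuming vibrance behaves roughly like a gain applied to 8-bit values, which is my assumption rather than anything Nvidia document): stretching an 8-bit ramp means the output no longer lands on every level, so a smooth gradient picks up visible steps unless the maths is done at higher precision or dithered.

```python
# Toy example: apply a hypothetical "vibrance"-style gain to an 8-bit ramp
# and count how many output levels survive. Missing levels = visible bands.
import numpy as np

gradient = np.arange(256, dtype=np.float64)            # a smooth 8-bit ramp
boost = 1.25                                           # hypothetical gain
stretched = np.clip((gradient - 128) * boost + 128, 0, 255).round()

print("input levels used: ", len(np.unique(gradient)))    # 256
print("output levels used:", len(np.unique(stretched)))   # fewer, plus clipping
# The skipped output levels show up on screen as discrete bands instead of a
# smooth ramp; doing the boost at higher precision (or with dithering) avoids
# this, which may be why one driver shows banding and another doesn't.
```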
 

R3X
Soldato
Joined
9 Aug 2013
Posts
3,553
No they don't. Movies use, or rather your players use, 4:2:2. It's really mastered in 4:2:0, from what I've read.

The PC space uses 4:4:4.


So Full RGB 4:4:4 for a PC display is still the best?

I have always used that setting on my 4K TV panel and felt it shows the widest and most accurate colours, with blacks being deep black, purely for film/TV shows, but I could be wrong.
 