Going from Nvidia to AMD I noticed it straight away. Colours seem stronger on AMD and textures a bit sharper. I've switched between Nvidia and AMD cards a number of times and the difference is there. You can tweak GeForce settings but you can't quite get it to match; if you try to add vibrance for stronger colours you get banding, which doesn't appear on AMD.
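To see why boosting vibrance can introduce banding, here's a minimal toy sketch (my own illustration, not how the driver actually implements digital vibrance): stretching 8-bit values away from grey makes adjacent gradient steps land two codes apart and clips the extremes, which reads as contouring.

```python
# Toy illustration of vibrance-induced banding in an 8-bit pipeline.
def boost_channel(value, gain=1.3, pivot=128):
    """Push an 8-bit channel value away from grey by 'gain', then re-quantise."""
    boosted = pivot + (value - pivot) * gain
    return max(0, min(255, round(boosted)))

gradient = list(range(256))                     # a smooth 8-bit ramp
boosted = [boost_channel(v) for v in gradient]
steps = [boosted[i + 1] - boosted[i] for i in range(255)]
print("distinct output levels:", len(set(boosted)))    # fewer than 256 (the ends clip)
print("largest step between neighbours:", max(steps))  # 2 instead of 1 -> visible bands
```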
Why hasn't Nvidia fixed these defects in its 2D/3D image quality after so many years? Is it simply a corporate strategy to offer bad image quality, or just a lack of know-how/patents?
It probably isn't a defect; it's intentional. Lower IQ means better performance.
I'd like to see if it's the same on Quadro cards, I suspect it isn't.
I suspect that parts of the Nvidia pipeline compress data to improve performance, and that in these sorts of cases the compression is, however slightly, affecting the edge sharpness of text. The compression is good enough unless you look hard.
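Pure speculation on my part, so just as a toy illustration (not a claim about what any GPU actually does): a two-level, block-truncation-style codec is "good enough" for hard edges but quietly flattens the intermediate greys of an anti-aliased text edge.

```python
def btc_block(block):
    """Encode/decode one block using only two representative levels (lossy)."""
    mean = sum(block) / len(block)
    lo = [p for p in block if p <= mean]
    hi = [p for p in block if p > mean]
    lo_level = round(sum(lo) / len(lo)) if lo else 0
    hi_level = round(sum(hi) / len(hi)) if hi else 0
    return [lo_level if p <= mean else hi_level for p in block]

# One row across an anti-aliased glyph edge: white -> greys -> black.
row = [255, 224, 160, 96, 32, 0, 0, 0]
decoded = []
for i in range(0, len(row), 4):
    decoded.extend(btc_block(row[i:i + 4]))
print("original:", row)      # [255, 224, 160, 96, 32, 0, 0, 0]
print("decoded: ", decoded)  # [240, 240, 128, 128, 32, 0, 0, 0] -- the smooth edge gets coarser
```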
The poor quality is due to the RFI filters installed on Geforce video cards. Some cards are worse than others.
There is a modification that can be done to improve the quality. It involves cutting out 3, 6, or 9 capacitors (depends on your video card) and bypassing (or cutting and retracing) 3 or 6 inductors.
You can also bypass the whole filter circuit by soldering in 3 wires. There is one filter for each of the RGB signals.
The first method is easier because it can be done without soldering. You pop off the capacitors and use conductive paint to bypass the inductors (actually you just carefully paint the top of the inductors). If you want, you can also cut out the inductors, but this leaves the circuit open and you have to close the connection (you can still use conductive paint). You can even skip the painting step, since cutting the capacitors provides about 80% of the improvement, and doing just this much makes for a zero-cost modification.
The second method involves soldering, but it is reversible and probably less risky, depending on your soldering skills. My skills aren't very good, so I just clipped off the capacitors and painted the inductors on a GeForce256, a GeForce2 GTS-V, and also an old ATI All-in-Wonder.
Be warned: there is a risk. Removing the capacitors could damage the underlying traces. If that were to happen, the only fix would be moving on to the second method and soldering in a little bypass.
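For a rough sense of why these filters blur things, here is a back-of-the-envelope sketch. The component values and the 1.35 blanking overhead are hypothetical placeholders I've picked for illustration (real boards differ); the point is that a low-pass RFI filter's -3 dB cutoff can sit right where the finest detail of a high-resolution mode lives, which is why removing or bypassing it sharpens the picture.

```python
import math

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.35):
    """Rough pixel clock: active pixels * refresh rate * blanking overhead (assumed)."""
    return width * height * refresh_hz * blanking_overhead / 1e6

def lc_cutoff_mhz(inductance_h, capacitance_f):
    """-3 dB point of a simple LC low-pass stage: f_c = 1 / (2*pi*sqrt(L*C))."""
    return 1 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f)) / 1e6

cutoff = lc_cutoff_mhz(100e-9, 22e-12)   # hypothetical 100 nH / 22 pF -> roughly 107 MHz
for mode in [(1280, 1024, 85), (1600, 1200, 85)]:
    clk = pixel_clock_mhz(*mode)
    # The finest detail (alternating single pixels, e.g. text) sits near clk / 2.
    print("%dx%d@%dHz: pixel clock ~%.0f MHz, finest detail ~%.0f MHz, filter -3dB ~%.0f MHz"
          % (mode[0], mode[1], mode[2], clk, clk / 2, cutoff))
```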
https://forums.tomshardware.com/threads/geforce-image-quality.876820/
Hmmm... This seems an interesting thread. And actually, this is something that has been 'bothering' me for a while, too. At home we use (okay, I know, it is outdated, but still ...) a Diamond Viper V770 (TNT2 Ultra based). It gives quite sharp images on our Iiyama Vision Master Pro 450 (19") at all resolutions up to 1280x1024. Only at the resolution we use it at, 1600x1200, does it get a little blurry. Before the Diamond we had a no-brand TNT2 Ultra card, and there the quality was even worse. Although it wasn't really bothering me, I wondered ...
Nowadays everybody is talking about the tremendous speeds of the most recent video cards, and sometimes even benchmark results for 'image quality' are shown. And that, I think, is weird. First of all, your monitor is a very crucial factor, and secondly, there is the video card's RAMDAC and accompanying circuitry (what you are talking about), which determine a lot.
YCbCr 4:4:4 is supposedly better for movies since they use that format AFAIK, but games should look best with full RGB.
AMD still gives a much better IQ, as Nvidia chooses to fudge certain compression values to give the impression of more FPS at the same price point... but Nvidia still has more stable drivers with older releases on earlier DirectX versions, so it's always a compromise between the two.
I feel like if this were an issue, or as big a difference as some here suggest, the internet would have been all over it. Gamers Nexus would have been the first to **** on Nvidia if it were true.
If there is "much" of a difference, I can pretty much guarantee it is end-user configuration error or a misunderstanding, not a like-for-like comparison of settings. I've compared AMD and nVidia systems side by side in the past on the same model of monitor, and any differences are close to imperceptible - usually just nudge digital vibrance on nVidia by 2-3% and you can't tell which is which.
There has been the odd driver bug and, IIRC, a couple of browser issues recently where fonts weren't being rendered properly on nVidia, but AFAIK those are fixed in the latest versions.
No, they don't. Movies, or rather your players, use 4:2:2. From what I've read, they're really mastered in 4:2:0.
The PC space uses 4:4:4.
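For anyone wondering what those ratios actually mean in storage terms, here's a minimal sketch using the standard definitions: Y (luma) is kept at full resolution in every scheme, 4:2:2 halves Cb/Cr horizontally, and 4:2:0 halves them both horizontally and vertically. The numbers are just per-frame sample counts, nothing vendor-specific.

```python
def bytes_per_frame(width, height, scheme, bytes_per_sample=1):
    """Approximate frame size for a given chroma subsampling scheme (8-bit)."""
    chroma_factor = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    luma = width * height                          # Y plane, always full resolution
    chroma = 2 * width * height * chroma_factor    # Cb plane + Cr plane
    return int((luma + chroma) * bytes_per_sample)

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    mb = bytes_per_frame(1920, 1080, scheme) / 1e6
    print("%s: %.1f MB per 1080p frame" % (scheme, mb))   # 6.2 / 4.1 / 3.1 MB
```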