
AMD Vs. Nvidia Image Quality - Old man yells at cloud

[GIF: nbzEvgw.gif]

Apt GIF for this thread.

I suppose this is the Graphics Card section after all.
 
Looking at the frame rates, I would say the bottom images are from some sort of prehistoric GPU.
Looking at the games, I would say so. He was using an 8500GT for the NVidia card, which was released back in April 2007, and god knows when support for that card was dropped. The test also shows no settings, no driver info, no date and no GPU. It is a terrible comparison to post. At least use something more recent to make such claims.
 
CS Source, maximum settings; the screenshot's last-modified date is 2 December 2008, so the driver must be the latest GeForce release at that time; the graphics card is an 8500GT 512MB DDR2.
CS Source, maximum settings; the screenshot's last-modified date is 2 December 2008, so the driver must be the latest Catalyst release at that time; the graphics card is an HD 4670 512MB GDDR3.

Look, you can take an RX Vega 64 and an RTX 2080 Ti and test CS Source today. Post the images in this thread.
 
So 10 years old then. Talk about "during the war" :D

Honestly, I run AMD and NVidia and can't notice a difference. On both I set Vibrance to +63 on my monitor, as that makes colours pop, but I am sure some will see a placebo effect if they try hard enough.
 

And the tested resolution is 1024 x 768.

Speaking of old things: change your monitor to a more recent one. I changed to an LG 24UD58 and the colours are like on an OLED screen.
 
I have an Asus PG348Q and love it. I also have a 55" LG OLED TV and love that. Not sure if there is something newer that would benefit me?
 
At default settings I prefer NVidia, as to me the colours are more realistic for gaming.

If you are playing one of the Battlefield games, for example, you don't want the bright, punchy colours you get with AMD cards, as they look unrealistic for that kind of game.

Having said that, as other people have pointed out, you can adjust the colour output of both vendors to look the same.
 
I thought that AMD had better colours out of the box, though obviously you can tweak it. The real difference is over HDMI: Nvidia defaults to a deliberately limited colour range, which you have to turn off/unlock to get the full range.
 
For the record, this is what the AMD colour settings look like by default: everything balanced, and the colour temperature meets the 6500K standard that monitors should be calibrated to.

[screenshot: default AMD colour settings]
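As a side note on the 6500K target: that's the D65 white point, which is also the white that sRGB assumes, so a monitor calibrated to it shows neutral greys with no colour cast. A minimal sketch of the maths (Python, using the published D65 chromaticity and the standard XYZ-to-sRGB matrix; nothing here is taken from this thread):

```python
# D65 white point (the "6500K" calibration standard): published CIE
# chromaticity coordinates for illuminant D65.
x, y = 0.3127, 0.3290

# Convert chromaticity (x, y) at luminance Y = 1 to CIE XYZ.
X = x / y                # ~0.9505
Y = 1.0
Z = (1.0 - x - y) / y    # ~1.0889

# Standard XYZ -> linear sRGB matrix (IEC 61966-2-1).
r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z

# All three channels come out at ~1.0: D65 is exactly sRGB's white,
# which is why 6500K is the calibration standard for monitors.
print(round(r, 3), round(g, 3), round(b, 3))  # 1.0 1.0 1.0
```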
 
I don't think AMD or NV produce a definitively better IQ in all cases, and I don't think you could ever prove it either. It's in the eye of the beholder as much as anything: if you sat 50 people in front of identical TVs and asked them to produce the best picture, you'd get 50 different set-ups.

Look at Skyrim ENBs: some are garish, some are washed out and grim; some love them, some hate them; they appeal and repel in equal measure.
 
CS Source: Dust 2 map
Video settings: all maxed out, anisotropic filtering 16x, AA 8x.
Ryzen 5 2500U with Vega 8 Graphics plus RX 560X 4GB GDDR5 switchable graphics.
Windows 10 Pro 1803
AMD driver 23.20.826.3072, dated 16.06.2018, the latest available on Acer's webpage.
1920x1080:
[two screenshots]
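For what it's worth, if anyone wants to compare two captures objectively instead of by eye, here is a minimal sketch (Python with Pillow and NumPy; the filenames are placeholders, not files from this thread) that reports how much two same-resolution screenshots actually differ:

```python
# Per-pixel comparison of two same-resolution screenshots.
# Requires: pip install pillow numpy. Filenames are placeholders.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("amd.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("nvidia.png").convert("RGB"), dtype=np.int16)
assert a.shape == b.shape, "screenshots must be the same resolution"

diff = np.abs(a - b)
print("mean abs difference per channel (R, G, B):", diff.mean(axis=(0, 1)))
print("pixels that differ at all: {:.2%}".format((diff.max(axis=2) > 0).mean()))

# Save an amplified difference map so small changes become visible.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff.png")
```

Of course this only means anything if both shots are taken from exactly the same spot with the same settings.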
 
I thought that AMD had better colours out of the box, though obviously you can tweak it. The real difference is over HDMI: Nvidia defaults to a deliberately limited colour range, which you have to turn off/unlock to get the full range.

But you can't tweak it without side-effects: it will cause colour banding.

AMD and Intel have the proper calibration; it's Nvidia who are off. It's like they are adding extra white to the picture.
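To put a number on the banding point: limited-range HDMI output uses levels 16-235, so when something later stretches it back to full range there are only 220 distinct levels trying to cover 256 output codes. A minimal sketch of the arithmetic (Python; pure maths, no driver involved):

```python
# Why expanding HDMI "limited" range (16-235) back to full (0-255)
# causes banding: 220 input levels cannot cover 256 output codes.
import numpy as np

limited = np.arange(16, 236)                         # the 220 legal video levels
expanded = np.round((limited - 16) * 255.0 / 219.0)  # stretch to 0-255

print("distinct output levels:", np.unique(expanded).size)  # 220, not 256
missing = np.setdiff1d(np.arange(256), expanded)
print("codes that can never appear:", missing.size)         # 36 gaps = visible bands
```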
 
You need to be standing in the same spot, as the lighting is totally different in both pics, which doesn't help the comparison. Also, I'm pretty sure a difference wouldn't show in a screenshot, as surely it would be controlled by the card in situ on your machine. Apologies if I missed something previously mentioned?
 
Different areas, and such an old system in both; a poor thing to use as a comparison.
 
The limited range Nvidia uses over HDMI is intended for video content only.
 
Remember what you claimed?



How do those images demonstrate that?

Also, how about something a bit more modern than CS Source, which is 14 years old?

Some articles from reputable tech sites would help to prove your point, preferably relatively recent ones.

Otherwise it is just like the claim that AMD have dodgy drivers: they may have done once, and the stigma has stuck.

I stand by what I have said, and people also seem to think the same:

I didn't read through much of the thread, but I'll say this. My GPU history:

1. GT 8800
2. GTX 580
3. 7970 (mining)
4. 7990 (more mining)
5. 290X (more mining)
6. 980 Ti
7. Vega 56

I have tested back and forth many times throughout these periods, because I take ages to sell my old hardware and often run 2-3 rigs side by side. I can say for sure that AMD has the following:

1. Better colour, at least by default.
2. Better detail accuracy (models, shadows) in many games.
3. Better frametimes/minimums, generally making games (especially higher-framerate titles) feel much smoother.

An easy example: in CS:GO at 4K (one of the only games where you can hit 300+ FPS at 4K), the 980 Ti is noticeably stuttery and the colours are washed out. I switch to the Vega 56 and it is almost unbelievable: it feels smoother at 4K than the 980 Ti does at 540p@480Hz (I have an X28 that can pull this off), while the darker areas have more "pop", letting models stand out where I would otherwise miss them on the 980 Ti. The same colour/"popping" difference exists across many titles.

I am in IT by trade and a programmer by hobby, so I can take guesses at where and why these things happen. My guess is that Nvidia puts more effort into per-title optimization, which ultimately leads to detail quality/fidelity decreasing, because they are trying to strike a balance. You might set Ultra shadows in title XYZ, but if Nvidia has reviewed the differences internally and decided that Ultra shadows provide very little benefit over High while having substantial performance costs, then they will force High shadows and "lie" to you. It's understandable; to be clear, I'm not trying to make it sound like a negative.

Then, for colours, I believe that Nvidia has issues with cards defaulting to 6-bit or 8-bit output when 8-bit or 10-bit is available on the monitor side. I also believe that they do some heavy colour compression to save on performance/memory utilization. These two things, combined with a more washed-out default colour profile, would explain the problem.

Finally, for the "smoothness" issue, I honestly believe that AMD has better drivers. Maybe it's their hardware pipeline, maybe it's some Windows issue, but I am absolutely sure it is there, and the simplest explanation to me is that the driver is more responsive or consistent, so applications are prone to less clock-to-clock hitching/instability running on AMD.

I have run pure Intel on the CPU side all these years, despite supporting AMD, and I have a predominantly Nvidia GPU history, especially if you exclude the time I spent mining on the 5000 and 7000 series. I was an investor in AMD too, from 2016 until a few months ago. I'm not biased, though; it really is as simple as noticing these differences when swapping hardware. Some people will notice, others won't, and maybe some truly don't see these benefits.
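If anyone wants to check the frametimes/minimums point with data rather than feel, here is a rough sketch (Python; it assumes a frame-time capture exported from a tool like PresentMon or CapFrameX, and the numbers below are made-up placeholders):

```python
# Compare average FPS with the "1% low" from a list of frame times (ms).
import numpy as np

# Placeholder data: mostly ~7 ms frames with one 25 ms hitch.
frame_times_ms = np.array([6.9, 7.1, 7.0, 25.0, 7.2, 7.0, 6.8, 7.3, 7.1, 7.0])

avg_fps = 1000.0 / frame_times_ms.mean()
# "1% low" FPS: take the 99th percentile of frame time (slowest 1% of frames).
low_1pct_fps = 1000.0 / np.percentile(frame_times_ms, 99)

print(f"average FPS: {avg_fps:.0f}")       # ~113
print(f"1% low FPS:  {low_1pct_fps:.0f}")  # ~43
```

A big gap between the average and the 1% low is exactly what reads as stutter, even when the average FPS looks fine.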





Different areas, and such an old system in both; a poor thing to use as a comparison.

What?
 
You need to be standing in the same spot, as the lighting is totally different in both pics, which doesn't help the comparison.

The purpose of the exercise is for you to stand in the same position with your Nvidia graphics card and post here what you see. The two screenshots are from the same machine, just different places in the tunnels of Dust 2.
 
It's very easy to see when you have the machines side by side. My mate has a 290X on a 4K iiyama, sitting next to his main rig, which has a 1080 Ti hooked up to a 4K TN panel that was double the price of the iiyama. The comparison was in Rocket League, which is a very colourful game, so it's easy to see. His kids, who don't know much about hardware, thought his old machine was better because of the visuals. He himself wasn't impressed either. You would have to be blind not to see it when they are side by side, and no amount of tweaking has made a big difference. I noticed it back when I had a 3870 and swapped to an 8800 GTX: I was happy with the extra FPS, but the image looked way off to me. Those that mainly use Nvidia probably don't care, but those of us who are long-time ATI/AMD users can usually notice it straight away.
 