
AMD Vs. Nvidia Image Quality - Old man yells at cloud

Permabanned · Joined 2 Sep 2017 · Posts 10,490
You can't explain anything, because you don't understand anything. You have shown time after time that you lack even basic knowledge of GPUs, CPUs, TVs and Monitors. When someone shows you that you are wrong you change the topic or bring up some article that's years out of date.

Your definition of proof is quoting random posts taken from internet forums. And even at that you get it wrong. For example, just look at the information you posted in post no. 793: you quote that post to show that NVidia's image quality is worse than AMD's but, and this shows how desperate you are, the post you took the information from actually proves that there is no difference in image quality.

So, I am asking you again, prove what you are saying. Show this massive difference in image quality across a range of games and applications.

You don't have a clue and I wish you would stop pretending that you do.

The conclusion there is wrong. I see a clear difference in the intensity of the black colour on the letters of the font, and the background is whiter in the Radeon image, while it is a warmer, darker white in the GeForce image.

And this is why I brought only the images, so everyone can compare with their own tools, rather than relying on an expected and predictable conclusion.
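For anyone who does want to check those screenshots with their own tools, here is a minimal sketch of that kind of comparison. It assumes the two images are saved under the hypothetical names radeon.png and geforce.png, that Pillow and NumPy are installed, and that the crop box is adjusted to cover the text region being discussed:

```python
# Minimal sketch: compare average brightness of the same crop in two screenshots.
# radeon.png / geforce.png are hypothetical filenames; TEXT_BOX is a placeholder
# region and should be adjusted to cover the lettering being compared.
from PIL import Image
import numpy as np

TEXT_BOX = (100, 100, 300, 140)  # (left, upper, right, lower)

def mean_luma(path, box):
    img = Image.open(path).convert("L")          # greyscale: 0 = black, 255 = white
    crop = np.asarray(img.crop(box), dtype=np.float64)
    return crop.mean()

for name in ("radeon.png", "geforce.png"):
    print(name, "mean luminance in text region:", round(mean_luma(name, TEXT_BOX), 2))
```

A genuinely darker font or whiter background would show up as a measurable gap in those numbers rather than an impression from eyeballing two JPEGs.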
 

TNA · Caporegime · Joined 13 Mar 2008 · Posts 27,587 · Location Greater London
That is from your link, so clearly you are stating that there is no difference in IQ between AMD and NVidia. Thanks for clearing that up.

You can't explain anything, because you don't understand anything. You have shown time after time that you lack even basic knowledge of GPUs, CPUs, TVs and Monitors. When someone shows you that you are wrong you change the topic or bring up some article that's years out of date.

Your definition of proof is quoting random posts taken from internet forums. And even at that you get it wrong. For example, just look at the information you posted in post no. 793: you quote that post to show that NVidia's image quality is worse than AMD's but, and this shows how desperate you are, the post you took the information from actually proves that there is no difference in image quality.

So, I am asking you again, prove what you are saying. Show this massive difference in image quality across a range of games and applications.

You don't have a clue and I wish you would stop pretending that you do.
+1
 
Soldato · Joined 19 Dec 2010 · Posts 12,031
As a follow-up to my last post, I will show you what I mean when I say you don't understand anything. Here is a quote from your post:

nvidia uses very bad texture compression which results in poor image quality. The texture compression is called ASTC and you can see what it does here:

Using ASTC Texture Compression for Game Assets
https://developer.nvidia.com/astc-texture-compression-for-game-assets

"Since the dawn of the GPU, developers have been trying to cram bigger and better textures into memory. Sometimes that is accomplished with more RAM but more often it is achieved with native support for compressed texture formats. The objective of texture compression is to reduce data size, while minimizing impact on visual quality."

This highlights how little you know. ASTC compression is not used on desktop GPUs. Support was added to OpenGL, but no desktop GPU hardware supports it. It's mainly supported by mobile processors and some CPUs; an example would be the Tegra processor. Why did I specifically mention the Tegra processor as an example? Well, did you happen to look at which section of Nvidia's website you were on? You were in the Nvidia Shield section of the developer website. The Nvidia Shield uses a Tegra processor, which is why they were giving information about ASTC.

Just in case you don't understand what I just wrote, here is a one-line summary.

Nvidia's Desktop GPUs don't use ASTC Texture compression.

In other words, you are talking nonsense again. My advice to you would be to stop using Google search.
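Rather than take either of us at our word, anyone can ask their own driver which compressed-texture formats it actually exposes. A rough sketch, assuming the third-party moderngl package (my choice for brevity, not anything Nvidia ships) and a desktop OpenGL driver:

```python
# Rough sketch: list whether the OpenGL driver reports ASTC or the desktop BC/S3TC
# compressed-texture extensions. Assumes `pip install moderngl` and a desktop GPU driver.
import moderngl

ctx = moderngl.create_standalone_context()
exts = ctx.extensions  # set of extension name strings reported by the driver

print("ASTC    (GL_KHR_texture_compression_astc_ldr):",
      "GL_KHR_texture_compression_astc_ldr" in exts)
print("S3TC/BC (GL_EXT_texture_compression_s3tc):    ",
      "GL_EXT_texture_compression_s3tc" in exts)
ctx.release()
```

Whatever the driver prints is the format set games on that card can actually use, which settles the ASTC question for that machine at least.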
 
Permabanned · Joined 2 Sep 2017 · Posts 10,490
Your definition of proof is quoting random posts taken from internet forums.

Random posts which represent customers' feedback. If you're going to ignore them, you will lose profits because people will stop buying.

ASTC compression is not used on desktop GPUs.

Nvidia's Desktop GPUs don't use ASTC Texture compression.

What texture and 2D image compression is used by nvidia on desktop GPUs?
 
Soldato · Joined 22 Nov 2006 · Posts 23,400
The difference is small enough that most people don't notice until they switch from one to the other. But Nvidia has an army of fanboys who won't, so they never do notice. Clearly the trade-off for more FPS is reduced image quality.

I have a geforce and an AMD card. The fact is there IS a small difference in the image quality whether you accept it or not.
 
Soldato · Joined 4 Jan 2009 · Posts 2,682 · Location Derby
I do think that when you take a screenshot on nvidia and paste it into Paint, the image looks more red/brown than what I see on my screen. That's about the only difference I can tell between AMD and nvidia.
 

V F · Soldato · Joined 13 Aug 2003 · Posts 21,184 · Location UK
I do think that when you take a screenshot on nvidia and paste it into Paint, the image looks more red/brown than what I see on my screen. That's about the only difference I can tell between AMD and nvidia.

That's because Paint doesn't manage ICC profiles.
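That mismatch can be sidestepped by converting the screenshot to sRGB before pasting it anywhere. A minimal sketch, assuming Pillow is installed and a hypothetical screenshot.png that carries an embedded ICC profile:

```python
# Minimal sketch: convert a screenshot to sRGB so a non-colour-managed viewer
# (like Paint) shows roughly what a managed one would. screenshot.png is a
# hypothetical filename; if no profile is embedded the image is saved unchanged.
import io
from PIL import Image, ImageCms

img = Image.open("screenshot.png")
icc_bytes = img.info.get("icc_profile")
img = img.convert("RGB")

if icc_bytes:
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
    srgb_profile = ImageCms.createProfile("sRGB")
    img = ImageCms.profileToProfile(img, src_profile, srgb_profile, outputMode="RGB")

img.save("screenshot_srgb.png")
```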
 

TNA · Caporegime · Joined 13 Mar 2008 · Posts 27,587 · Location Greater London
The difference is small enough that most people don't notice until they switch from one to the other. But Nvidia has an army of fanboys who won't, so they never do notice. Clearly the trade-off for more FPS is reduced image quality.

I have a geforce and an AMD card. The fact is there IS a small difference in the image quality whether you accept it or not.
I see a difference, but my feeling is that it is more a colour vibrance thing, not so much an image quality thing.

Again, if you are certain, why not prove it by doing comparisons? Neither you nor anyone else can. If it were possible, websites would be all over it to get the clicks; if not the big ones, then one of the smaller ones with no contact with nvidia would do it to give their site a huge boost in clicks and recognition.
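For anyone who does want to attempt that comparison, here is a minimal sketch. It assumes two screenshots captured at the same resolution and in-game settings, saved under the hypothetical names amd.png and nvidia.png, with Pillow and NumPy installed:

```python
# Minimal sketch: per-pixel comparison of two same-size screenshots.
# amd.png / nvidia.png are hypothetical filenames; capture both at identical
# resolution and settings or the numbers are meaningless.
from PIL import Image
import numpy as np

a = np.asarray(Image.open("amd.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("nvidia.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("pixels that differ at all:", np.count_nonzero(diff.any(axis=-1)))
print("mean absolute difference per channel (R, G, B):", diff.mean(axis=(0, 1)))
print("largest single-channel difference:", diff.max())
```

If the "massive" difference were real, it would show up in those three numbers, and any review site could publish them.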

Tests have already been done by many, with the results not showing anything major. What people like 4K8K cannot comprehend is that there is no such thing as cheating in this case. They simply have very different ways of arriving at the final image. Simple as that. I gave an example before: if another company came in and managed to offer 99% of the image quality of existing cards with 2080 Ti performance for £100, would anyone give a flying **** about the 1% difference in image quality anyway?

All I see is brand loyalty or fanboyism in these posts. I could not give a **** about either company personally. The only thing I will say is that I would like to see a healthier AMD, but that is only because it would benefit me, not because of some kind of loyalty. I just say it as I see it.
 
Soldato · Joined 6 Feb 2019 · Posts 17,600
The difference is small enough that most people don't notice until they switch from one to the other. But Nvidia has an army of fanboys who won't, so they never do notice. Clearly the trade-off for more FPS is reduced image quality.

I have a geforce and an AMD card. The fact is there IS a small difference in the image quality whether you accept it or not.

You mean the difference is small enough that it doesn't matter? You should send your findings to Hardware Unboxed - they love ******** on Nvidia, so surely if your claims have any validity they will jump right on this hot take of a story.

I personally did not notice any difference moving from an R9 290X to a 1080 Ti, but hey, w/e.

And any difference you think you see is so minor in the scheme of things. There are many other variables that deserve your attention and affect quality more, like making sure you have the best panel technology, making sure you can do HDR, and making sure you have the best GPU so every game runs on Ultra. These things all have a big impact on image quality, and that's not disputed; it's a known fact.
 
Soldato · Joined 22 Nov 2006 · Posts 23,400
You mean the difference is small enough that it doesn't matter? You should send your findings to Hardware Unboxed - they love ******** on Nvidia, so surely if your claims have any validity they will jump right on this hot take of a story.

I personally did not notice any difference moving from an R9 290X to a 1080 Ti, but hey, w/e.

And any difference you think you see is so minor in the scheme of things. There are many other variables that deserve your attention and affect quality more, like making sure you have the best panel technology, making sure you can do HDR, and making sure you have the best GPU so every game runs on Ultra. These things all have a big impact on image quality, and that's not disputed; it's a known fact.

But it's basically like reducing graphics detail slightly for more FPS. I guess you won't mind that, right?
 
Caporegime · Joined 24 Sep 2008 · Posts 38,322 · Location Essex innit!
Well, I'd try to explain it in another way. The RTX 2080 Ti has as many as 272 texture mapping units, a whopping 420 gigatexels per second of texture processing capability, and 11 GB of super-fast GDDR6 frame buffer.

Despite all this, they are still using filters and compression algorithms which leave the images nowhere near their raw, original quality.

Do you agree with me that nvidia should work either on compression-free rendering or on better compression algorithms?
Yet again, you are showing a lack of knowledge, as well as not answering any question that is asked of you. Both NVidia and AMD use colour compression, and that has been the case for as long as I know. You are still not reading anything and it is becoming quite pointless trying to teach you anything, as you will ignore it in favour of Joe Bloggs on 'any forum' who states something you like.
 
Caporegime · Joined 24 Sep 2008 · Posts 38,322 · Location Essex innit!
The difference is small enough that most people don't notice until they switch from one to the other. But Nvidia has an army of fanboys who won't, so they never do notice. Clearly the trade-off for more FPS is reduced image quality.

I have a geforce and an AMD card. The fact is there IS a small difference in the image quality whether you accept it or not.
The trouble is, you are not calibrating NVidia properly. When I run both and calibrate the colours to my liking, I wouldn't have a clue what was in my machine other than by the frame rates. AMD for me has a better "out of the box" image and needs very little change, so I am not entirely disagreeing with you, but again, when both are properly calibrated, they look the same.
 
Soldato · Joined 19 Dec 2010 · Posts 12,031
The difference is small enough that most people don't notice until they switch from one to the other. But Nvidia has an army of fanboys who won't, so they never do notice. Clearly the trade-off for more FPS is reduced image quality.

I have a geforce and an AMD card. The fact is there IS a small difference in the image quality whether you accept it or not.

No. You are flat out wrong.

What you are seeing is a slight difference in the default colour setups between AMD and Nvidia. That's it, nothing more, nothing less, you can calibrate both to look exactly the same.

That IS an actual fact, not your makey uppy one based on anecdotal evidence.

But it's basically like reducing graphics detail slightly for more FPS. I guess you won't mind that, right?

If you are going to talk rubbish, then I can easily say it's mostly the AMD fanboys who are keeping this nonsense alive, as it's only AMD fanboys who seem to perceive this image quality difference.
 
Permabanned · Joined 2 Sep 2017 · Posts 10,490
Yet again, you are showing a lack of knowledge, as well as not answering any question that is asked of you. Both NVidia and AMD use colour compression, and that has been the case for as long as I know. You are still not reading anything and it is becoming quite pointless trying to teach you anything, as you will ignore it in favour of Joe Bloggs on 'any forum' who states something you like.

You are showing a lack of knowledge and honesty. If you were honest, you would say that texture compression has different levels and quality. AMD uses compression techniques which give higher overall image quality. It's a fact and it has been proven dozens of times.
Also, you can't calibrate colours on parts of objects which don't exist in the nvidia image, because they are cut off, like tree branches, bushes, foliage, etc.



 

TNA · Caporegime · Joined 13 Mar 2008 · Posts 27,587 · Location Greater London
Oh Christ, the Watch Dogs screenshot again. It's a different texture entirely; the rest looks the same. 4k8kw10 was told that last time he brought it up.
He does not care what he is told. All he does is scour the web looking for people that agree with his viewpoint, then adds them to his favourites and pulls them out as evidence. He does not even understand half of what he is reading by the looks of it, as he uses links as examples that prove the opposite of what he is trying to say :p

I can easily find hundreds of links that prove the world is flat; does that mean it is true, or that we should take these people seriously? Bet he can't answer that. He will ignore it and carry on banging on with his fake news.
 
Soldato · Joined 22 Nov 2006 · Posts 23,400
No. You are flat out wrong.

What you are seeing is a slight difference in the default colour setups between AMD and Nvidia. That's it, nothing more, nothing less, you can calibrate both to look exactly the same.

That IS an actual fact, not your makey uppy one based on anecdotal evidence.



If you are going to talk rubbish, then I can easily say it's mostly the AMD fanboys who are keeping this nonsense alive, as it's only AMD fanboys who seem to perceive this image quality difference.

But no it isn't. If you just adjust the vibrance in the Nvidia drivers you get artifacts, and you can never quite get it to match AMD's (or Intel's) image. The more you add, the worse the colour banding gets. I have tried it...
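Banding of that kind is easy to put a number on: count how many distinct levels survive per channel in a smooth-gradient area before and after cranking the vibrance. A rough sketch, assuming hypothetical before.png and after.png crops of the same gradient region, with Pillow and NumPy installed:

```python
# Rough sketch: quantify banding by counting distinct values per channel in a
# smooth-gradient crop. before.png / after.png are hypothetical crops of the same
# region captured before and after raising the vibrance slider.
from PIL import Image
import numpy as np

def levels_per_channel(path):
    rgb = np.asarray(Image.open(path).convert("RGB"))
    return [int(len(np.unique(rgb[..., c]))) for c in range(3)]

for name in ("before.png", "after.png"):
    print(name, "distinct levels (R, G, B):", levels_per_channel(name))
```

Noticeably fewer distinct levels in the "after" crop is posterisation, which is exactly what visible banding is.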
 