
AMD Vs. Nvidia Image Quality - Old man yells at cloud

But no it isn't. If you just adjust the vibrance in the Nvidia drivers you get artifacts. The more you add the worse the colour banding gets. I have tried it...
I have digital vibrance set to 63 and hue set to +2, and that is the sweet spot for me on my monitor. Other monitors will require different settings, and pretty much every user has a different preference.
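For anyone wondering what that slider actually does: to a first approximation, a vibrance control is a saturation multiplier. Below is a minimal Python sketch of that approximation using Pillow (the driver's real algorithm isn't public, and the filenames here are made up):

    # Rough sketch: treat "digital vibrance" as a plain saturation
    # scale. Nvidia's slider defaults to 50%, so a setting of 63 is
    # modelled here as a 63/50 = 1.26x saturation boost.
    from PIL import Image, ImageEnhance

    img = Image.open("screenshot.png")  # hypothetical input file

    boosted = ImageEnhance.Color(img).enhance(63 / 50)
    boosted.save("screenshot_vibrance.png")

Real vibrance implementations are usually smarter than a flat multiplier (boosting muted colours more than already-saturated ones), which is one reason cranking the slider hard tends to cause the banding and artifacts mentioned above.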
 
But no it isn't. If you just adjust the vibrance in the Nvidia drivers you get artifacts and you can never quite get it to match AMD's image. The more you add the worse the colour banding gets. I have tried it...
Are you saying there is no chance you were getting something wrong?
 
Are you saying there is no chance you were getting something wrong?

But I'm not. I have both brands of card to test. Do you?

AMD gives a slightly better quality image (as does Intel; it's Nvidia who are the odd ones) and has done for a long time now. It's pretty simple.
 
But I'm not. I have both brands of card to test. Do you?

AMD gives a slightly better quality image (as does Intel) and has done for a long time now. It's pretty simple.
Probably better off selling your NVidia card, by the sounds of it. Clearly not working for you!
 
Probably better off selling your NVidia card, by the sounds of it. Clearly not working for you!

Well, I had 3 in a row fail in under 2 years, which is why I switched. So there is clearly something else quality-related going on. I hear 20 series owners didn't have much luck either ;)
 
Well, I had 3 in a row fail in under 2 years, which is why I switched. So there is clearly something else quality-related going on. I hear 20 series owners didn't have much luck either ;)
I remember you saying that now. Surprised you bought another NVidia GPU, in truth. You should stick with AMD, as that works well for you. I go with what works best for me personally and have had failures from both over the years (Titan/560Ti/HD480/HD6950). By the sounds of it, failings and lack of IQ make NVidia a poor choice for you!

What NVidia GPU do you own?
 
But I'm not. I have both brands of card to test. Do you?
Yes. I have had both many, many times, as I always switch between them.

The fact that you are not allowing for the possibility that there is even a small chance you could be wrong actually suggests bias to me, and that you just want to see what you want to see. I won't go into any more detail, as you, like 4K8K, will not believe a word I say unless it fits with your current beliefs. So I will leave it at that.

Well, I had 3 in a row fail in under 2 years, which is why I switched. So there is clearly something else quality-related going on. I hear 20 series owners didn't have much luck either ;)
But I have had one from each camp fail on me in the past 2 decades, so what does that mean?

Too much bias man.

Gregster is known for having a bias towards Nvidia, but even he does not go this far, and I have seen him hold his hands up on more than one occasion when he gets it wrong rather than doing a 4K8K, carrying on and embarrassing himself. We all get **** wrong, but I feel the better man holds his hands up and learns from it rather than burying his head in the sand and continuing.
 
I give up. What a waste of typing. I always enjoy a good discussion, but this is pointless.

Why? I can guarantee you that this is exactly the same texture difference between a GeForce and a Radeon that I have historically observed in any game.
All settings are equal - texture quality is set to very high.
 
Why? I can guarantee you that this is exactly the same texture difference between a GeForce and a Radeon that I have historically observed in any game.
All settings are equal - texture quality is set to very high.
Prove it. Where is your evidence?? Want us to take your word for it? Your word is worth nothing due to all your fake news over the past few months, man!
 
So if Nvidia need a little more software fiddling to get their image quality to match AMD's out of the box, then surely that's not a stock-for-stock comparison, and I'm pretty sure when people do benchmarks they only fiddle with overclocks and power settings, not image quality/colour settings. Do these settings impact performance at all? If so, it's sort of a cheat on Nvidia's side, is it not? Yes, we all know they are so far ahead they don't need to cheat, but...
 
Depends. If they are compressing things more, then yeah, you'll get more FPS. But the more you compress, the more quality you lose, which is why JPG files don't look as good as BMPs.
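To put a number on that trade-off, here's a small Python sketch that saves the same image at decreasing JPEG quality levels and prints the shrinking file sizes (the input filename is a placeholder):

    # Save one image at several JPEG quality settings and compare
    # sizes: smaller files mean heavier compression and more
    # visible artifacts.
    import os
    from PIL import Image

    img = Image.open("frame.png").convert("RGB")  # hypothetical input

    for quality in (95, 75, 50, 25):
        out = f"frame_q{quality}.jpg"
        img.save(out, "JPEG", quality=quality)
        print(f"quality={quality}: {os.path.getsize(out)} bytes")

(To be clear, GPU texture compression like BC7 works rather differently from JPEG, but the size-versus-fidelity trade-off is the same idea.)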
 
I still haven't seen evidence of a single AMD image quality expert here going to one of the tech sites or YouTubers to ask them to do a video on this supposed Nvidia cheating.

What are you trolls afraid of?
 

Oh really? Did you actually read that? Did you notice that the conclusion from that initial post is ....


What that all means is the conclusion is pretty clear: There are many good reasons to buy team green or team red, but 2D image quality is not one of them. They are precisely the same.

Not exactly making your point, is it.... ;)
 
That is from your link, so clearly you are stating that there is no difference in IQ between AMD and NVidia. Thanks for clearing that up.


Oh blast, that will teach me for reading and posting before reading further......lol
 
But no it isn't. If you just adjust the vibrance in the Nvidia drivers you get artifacts and you can never quite get it to match AMD's (or Intel's) image. The more you add the worse the colour banding gets. I have tried it...

No, you are flat out wrong again.

You can easily match Nvidia's colours to AMD's or the other way around with a little calibration. And I suggest that if you have tried it and couldn't get it looking the same then you are doing something wrong.
 
But I'm not. I have both brands of card to test. Do you?

AMD gives a slightly better quality image (as does Intel; it's Nvidia who are the odd ones) and has done for a long time now. It's pretty simple.

If it's so simple why do you continue to get it wrong? There is no difference in image quality. The only differences are the default colour schemes.
 
Depends. If they are compressing things more, then yeah, you'll get more FPS. But the more you compress, the more quality you lose, which is why JPG files don't look as good as BMPs.

What are you talking about? You sound like 4K8K now.

The reason this hasn't been reported in the tech news is because it's not an issue. The last time Nvidia and AMD were caught cheating with image quality to boost framerates, it was big news. Now you are trying to say that there is still an image quality difference that Nvidia is using to increase framerates, but nobody from any of the tech sites has noticed it. Rubbish.
 
So if Nvidia need a little more software fiddling to get their image quality to match AMD's out of the box, then surely that's not a stock-for-stock comparison, and I'm pretty sure when people do benchmarks they only fiddle with overclocks and power settings, not image quality/colour settings. Do these settings impact performance at all? If so, it's sort of a cheat on Nvidia's side, is it not? Yes, we all know they are so far ahead they don't need to cheat, but...

Changing colour settings has no impact on performance. A lot of people use ICC colour profiles and calibrate their monitors to get the best picture quality for whatever they are doing. Changing the colours is a personal preference. Nvidia picked a certain colour scheme, whatever they think looks best; AMD picked their own default. They don't look the same. AMD's default colour scheme is slightly darker, which makes the colours pop a bit more, and some people prefer that.

Benchmarking at stock clocks is a different matter entirely.
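As an aside, the sort of correction a calibrator bakes into an ICC profile can be applied in software too. Here's a sketch using Pillow's ImageCms module (the profile filename is hypothetical, and in practice the OS or GPU driver applies the profile for the whole display rather than per image):

    # Convert an image from sRGB into a monitor's calibrated
    # colour space using an ICC profile.
    from PIL import Image, ImageCms

    img = Image.open("screenshot.png").convert("RGB")  # hypothetical input

    srgb = ImageCms.createProfile("sRGB")
    monitor_profile = "my_monitor.icc"  # hypothetical calibration profile

    corrected = ImageCms.profileToProfile(img, srgb, monitor_profile)
    corrected.save("screenshot_calibrated.png")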
 
So if Nvidia need a little more software fiddling to get their image quality to match AMD's out of the box, then surely that's not a stock-for-stock comparison, and I'm pretty sure when people do benchmarks they only fiddle with overclocks and power settings, not image quality/colour settings. Do these settings impact performance at all? If so, it's sort of a cheat on Nvidia's side, is it not? Yes, we all know they are so far ahead they don't need to cheat, but...

Going from an Nvidia card to a Vega 64, I noticed the image quality difference straight away, especially in F1 2018. The image just looks sharper and crisper.

It's possible that Nvidia makes this difference intentionally, as some sort of weird product differentiation strategy. As reported, with a GeForce you are very likely to observe the following effects:

1. Textures look lower resolution because of applied texture compression;
2. Some parts of trees and bushes, such as branches and foliage, are cut out of the scene;
3. Distant objects look washed out, with a loss of detail in the distance;
4. Colours look washed out, as if a very thin white filter were applied to the image;
5. Fonts look irregular in shape, with sharp curves instead of smooth ones, thicker letters, and sometimes lighter blacks;
6. Contrast balance is broken.
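If anyone actually wants to test claims like these, the honest way is to capture the same frame on both cards at identical settings and compare the pixels, rather than eyeballing it. A minimal sketch (the screenshot filenames are placeholders):

    # Compare two same-resolution screenshots pixel by pixel and
    # report MSE and PSNR; identical images give MSE 0 / infinite PSNR.
    import numpy as np
    from PIL import Image

    a = np.asarray(Image.open("geforce_frame.png").convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open("radeon_frame.png").convert("RGB"), dtype=np.float64)

    mse = float(np.mean((a - b) ** 2))
    psnr = float("inf") if mse == 0 else 10 * np.log10(255 ** 2 / mse)
    print(f"MSE: {mse:.2f}, PSNR: {psnr:.2f} dB")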
 