AMD Vs. Nvidia Image Quality - Old man yells at cloud

Image quality depends on the desired framerate in games. E.g. Vega 64 vs 1080 Ti: the Vega 64 will have worse image quality, since it will have to run at medium settings to achieve a similar framerate to the 1080 Ti running ultra settings.

Not true at all. We aren't talking about a moving image where frame rate matters. Stand still and look at a single frame: it doesn't matter if you are hitting 60fps or 144fps, we are talking about the image quality/detail of said scene.

And you must be kidding if you think the Vega 64 is only capable of medium details to match the frame rate of a 1080 Ti at ultra. I run ultra on a 1440p panel with my Vega 64 and hit well over 100fps in probably 90% of the titles I own. My card annihilates any massively overclocked 1080, and is well within 20 frames of my 1080 Ti when running ultra on both cards in the same game, but seeing as we are talking over 100fps, you would never know unless you get hung up on an fps overlay.

People are massively ill-informed about the Vega 64's capabilities in games.

And I'll add another: the frame times on the Vega 64 are much more consistent and smooth compared to my 1080 Ti, although that could be down to the 5960X powering the 1080 Ti compared to the 8700K in the Vega 64 rig.
 
But subjective views are often wrong, hence cheap TVs are sold with jacked-up saturation. They have terrible image quality, but those who don't know better seem to lap it up.

It is why most people freak out when they see a calibrated picture on both a TV and a monitor. To them it looks dull and yellow. Most people love a vibrant, saturated, blue-tinted image.
 

Everyone is entitled to their opinion, but the LG B6 has great reviews among TV critics for visual quality, and my PC, 4K Blu-ray and PS4 all look amazing on it.

I know that when people say Nvidia cards look washed out it's an issue on their end, because the issue is related to an incorrect black level (RGB output range) setting. Whatever, I'm done with this ******* contest forum lol
 
I know that when people say Nvidia cards look washed out it's an issue on their end, because the issue is related to an incorrect black level (RGB output range) setting. Whatever, I'm done with this ******* contest forum lol

I thought it was a discussion, nothing more?... :eek:
 
4 months have passed and I haven't posted the images :(

However, I can only speak for WOT, as the other games all look the same. WOT's new game client was designed with Nvidia "optimizations", as per Wargaming's statement.

With the GTX 1080 Ti the FPS was 150-160ish; with the Vega 64 it is around 110-120ish. However, with the 1080 Ti (and the GTX 1060 6GB on the laptop) at maxed-out settings there is less foliage on the trees and bushes, while smoke and fire (like those found on the Overlord map) and the water and mirror reflections (Paris & Lakeville maps) look dull and uninteresting. The same applies to grass.
With the Vega 64 the trees and bushes have more foliage, and with the latter (bushes) especially, I cannot see and aim through a single bush like I used to, let alone when two or three bushes are lined up.
On Paris you can also see the reflections of the Eiffel Tower in the windows, while on winter maps there are more snow particles kicked up by the tank tracks.

In addition, without an FPS limiter in the Nvidia drivers the game is horrible to play on a 120Hz 2560x1440 monitor (since Sept 2016 the XL2730Z has been one of those FreeSync monitors on which Nvidia broke the 144Hz refresh rate and sound), as it feels like a very fast slideshow at 160fps (especially obvious on city maps). With the Vega having FreeSync (144Hz), the game looks smooth even at 95fps (power save mode with a 176W cap).
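
For anyone wondering what an FPS limiter actually does here, below is a minimal sketch of the idea in Python (purely illustrative, nothing like how the drivers actually implement it; the 110fps target is just an example matching the numbers above): each frame gets padded out to a fixed time budget, so frames arrive at an even cadence instead of spiking to 160fps on light scenes.

```python
import time

TARGET_FPS = 110                 # example cap, roughly the Vega 64 figure quoted above
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def capped_loop(render_frame, num_frames=1000):
    """Toy render loop: pad each frame out to a fixed budget so delivery stays even."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                             # stand-in for the game's actual frame work
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)                      # crude pacing; real limiters busy-wait for precision
```

With a variable-refresh display like FreeSync the monitor follows the GPU's frame times instead, which is presumably why the Vega setup feels smooth even at ~95fps.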

This certainly begs the question - if AMD showed as little detail in all these areas would the performance charts for these cards be flipped around?
 
This certainly begs the question - if AMD showed as little detail in all these areas would the performance charts for these cards be flipped around?

For this specific game, yes. However, I prefer better visuals over more performance.
110ish FPS is perfect with FreeSync.
 
I’ve just changed from Nvidia to AMD and haven’t noticed a thing!
It's probably more of a case of "you don't know what you are missing" until you actually lose it.

Going from Nvidia to AMD you probably won't notice the difference, but going back to Nvidia after using AMD you would probably get the subtle (apparent to some) feeling that "something's lacking" that you cannot quite put your finger on (like there's a thin fog filter making the image look duller/flatter, or something off in the presentation of the image).

Jumping between different games, most people would probably just take what they see as what they get; but in games that people spend hundreds of hours on (such as MMOs), the image quality/colour vibrancy differences can be quite noticeable when using GPUs from the two vendors - Guild Wars 2 was such a case for me.
 
Who cares? I don't. Just enjoy what you've got. There is always something better than what you have. It's the same as saying the 2080 Ti is faster than my 1080 Ti. So? Am I bothered?

 
Having run both AMD and Nvidia quite a lot, I genuinely can't see a difference. Maybe AMD has better colour settings from the off that suit some people? No idea personally, but they both look good to me.
 
It's probably more of a case of "you don't know what you are missing" until you actually lose it.

Going from Nvidia to AMD you probably won't notice the difference, but going back to Nvidia after using AMD you would probably get the subtle (apparent to some) feeling that "something's lacking" that you cannot quite put your finger on (like there's a thin fog filter making the image look duller/flatter, or something off in the presentation of the image).

Jumping between different games, most people would probably just take what they see as what they get; but in games that people spend hundreds of hours on (such as MMOs), the image quality/colour vibrancy differences can be quite noticeable when using GPUs from the two vendors - Guild Wars 2 was such a case for me.
What the dickens? LMAO You have run AMD for as long as I have been on this forum, but suddenly there is a "subtle but apparent something lacking that you can't quite put your finger on". Are you writing a book? lol Honestly, it is quite sad when placebo rules your life :D
 
Who cares? I don't. Just enjoy what you've got. There is always something better than what you have. It's the same as saying the 2080 Ti is faster than my 1080 Ti. So? Am I bothered?

...this is why we don't really bring this up, as it could potentially open a can of worms and possibly be taken the wrong way...

shankly should have known better than to post this :eek:
 
Ahh, I remember this from the time I had an ATi 9800 Pro after switching from an Nvidia card.
That was something different.

Back in those days the ATi/Nvidia GPU drivers used to have texture quality settings; IIRC they were "High Performance, Performance, Quality and High Quality". In order to try and gain an advantage over ATi, Nvidia decided to stealthily alter their slider: they basically added a new extra-low option below High Performance, then removed High Quality and reordered the names. So effectively a High Quality texture setting on Nvidia's panel was the same as the Quality setting on ATi's, which gave the illusion of Nvidia having more speed/performance, but in reality they were sacrificing quality for it.

It all got found out, and that's why the Omegadrivers (and others) hit the scene: custom third-party drivers offering to fix Nvidia's shadiness and restore full image quality for GeForce/TNT/etc. users. Eventually Nvidia stopped doing it and reverted to the proper settings.
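
If the relabelling is hard to picture, here's a toy sketch (the label names and level numbers are reconstructed from the description above, not from any official documentation): the same label ends up pointing one notch lower on the underlying quality scale.

```python
# Underlying texture quality levels: 0 = lowest ... 4 = highest (illustrative only).
ati_labels    = {"High Performance": 1, "Performance": 2, "Quality": 3, "High Quality": 4}

# Alleged Nvidia relabel: an extra-low tier slots in at the bottom and every label
# shifts down one level, so Nvidia's "High Quality" sits on ATi's "Quality" level.
nvidia_labels = {"High Performance": 0, "Performance": 1, "Quality": 2, "High Quality": 3}

for label in ati_labels:
    print(f"'{label}': ATi level {ati_labels[label]} vs Nvidia level {nvidia_labels[label]}")
```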

*EDIT*

LMAO, just checked and the option is still in the Nvidia panel, and with the latest drivers it defaults to "Quality" on a GTX 1060 and an RTX 2080 instead of "High Quality". Oh Nvidia, you never change xD
 
What the dickens? LMAO You have run AMD for as long as I have been on this forum, but suddenly there is a "subtle but apparent something lacking that you can't quite put your finger on". Are you writing a book? lol Honestly, it is quite sad when placebo rules your life :D
I own an 8800 GTS and a 9800 GTX+, and used both a 560 Ti and an HD 5850 back when I used to spend many hours playing Guild Wars 2.

I was merely stating what I noticed and experienced, but I won't call it the universal truth for everyone (just as some people are more sensitive to frame rate fluctuation than others). At the end of the day, if people are happy with what they've got, that's all that matters.

But hey ho, whatever I say won't matter to you, as your mind is already made up and you think I am BSing. Let's face it: I know you think I am just some blind, raging AMD fanboi who's anti-Nvidia, and I see you the exact same way with Nvidia, so we are probably never going to see eye to eye and will always think there's an agenda behind each other's posts.
 
I own an 8800 GTS and a 9800 GTX+, and used both a 560 Ti and an HD 5850 back when I used to spend many hours playing Guild Wars 2.

I was merely stating what I noticed and experienced, but I won't call it the universal truth for everyone (just as some people are more sensitive to frame rate fluctuation than others). At the end of the day, if people are happy with what they've got, that's all that matters.

But hey ho, whatever I say won't matter to you, as your mind is already made up and you think I am BSing. Let's face it: I know you think I am just some blind, raging AMD fanboi who's anti-Nvidia, and I see you the exact same way with Nvidia, so we are probably never going to see eye to eye and will always think there's an agenda behind each other's posts.
Don't be like that; your post did make me chuckle and my response wasn't personal (it does look like that, in fairness), but I honestly don't see a difference personally, and my eyes are pretty good for an old git :D Maybe there is a difference and I just don't see it? Maybe colours are better out of the box from AMD? I'll have to pass on that, but for me, I wouldn't have a clue what was in a machine just from looking at an image.
 