AMD IQ Vs Nvidia IQ - shenanigans or something else?

Was just having a quick look at that "adjust image with preview" control panel page, which I normally don't touch. One thing to keep in mind: on the advanced page, I had nothing set to forced other than what was there by default.

It was on "let the application decide", so I selected "Use the advanced 3D image settings" option and hit apply.

When I went to the advanced settings page after doing this, AF was now set to 8X and AA was set to 4X. These were absolutely not forced before. So I am wondering if, by default, that "let the application decide" setting is forcing 8X AF and 4X AA?

I set them to application-controlled and applied.

Back on the "image setting with preview" page I set it back to "let the application decide", looked at the advanced settings, and it had not changed back to 8X and 4X.

Perhaps it's some first-time, "until you have actively done xyz" bug? I'm not spending my day uninstalling and reinstalling drivers to test, so maybe someone else wants to have a look.
 
Spent plenty of time on both vendors.

Only times I noticed nvidia being worse for colours was when they were not outputting full RGB over HDMI and DisplayPort. There was a hack around for a long time to fix that, but nvidia finally pulled their finger out not too long ago, and we no longer need that hack.
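
For anyone unfamiliar with the full-RGB thing: "limited" range maps black and white to 16 and 235 instead of 0 and 255, which is why the picture looked washed out. A minimal Python sketch (my own illustration, not driver code) of the expansion the old hack effectively restored:

```python
def limited_to_full(y):
    """Expand a limited-range (16-235) 8-bit value to full range (0-255).

    This is the standard video-levels to PC-levels mapping, shown here
    purely for illustration. Values outside 16-235 are clamped first.
    """
    y = min(max(y, 16), 235)
    return round((y - 16) * 255 / (235 - 16))

# Limited-range "black" (16) and "white" (235) map back to 0 and 255:
print(limited_to_full(16), limited_to_full(235))  # -> 0 255
```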

But I agree with those above that just because an image has more 'pop' doesn't make it better, just like those settings on TVs that throw in more saturation and contrast.
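
To illustrate the TV comparison (my own Pillow sketch, nothing the drivers actually do): you can make any frame 'pop' after the fact without adding a single pixel of real detail. The file names are placeholders.

```python
from PIL import Image, ImageEnhance  # pip install Pillow

img = Image.open("screenshot.png")

# Boost saturation and contrast the way a TV "vivid" preset does.
# The result looks punchier, but no detail is gained, and saturated
# colours and highlights can clip, which actually destroys detail.
vivid = ImageEnhance.Color(img).enhance(1.4)       # +40% saturation
vivid = ImageEnhance.Contrast(vivid).enhance(1.2)  # +20% contrast
vivid.save("screenshot_vivid.png")
```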

As for the issue being discussed in here, I don't have any AMD cards lying around any more to check against, but nothing has jumped out at me. But then, I'm not all that observant.
I used to play Guild Wars 2 on PC with an AMD card and an Nvidia card, and to me it always felt like it looked better on the AMD. But after getting a bit more into the audio side of things these days, I've come to the understanding that for headphones, speakers and amps there's no simple "which is better", as one can sound "warm" and another "bright", and everyone will probably have a different preference.

So perhaps to put it into the context of graphics: I prefer the image of AMD over Nvidia, but it could well be just like some people preferring their music to sound "brighter" rather than "warmer". Not sure if this makes any sense.

Vibrancy of colour is one thing, though; texture detail is another, and that shouldn't be too far off regardless of vendor.
 
After ~20 months of owning AMD I'm still not that clued up on CCC, but I think...
"standard" texture filtering quality
and
"on" surface format optimisation

applies AMD's game optimisations.

Now what I still don't know is whether "system settings" will apply those settings globally, or whether CCC looks for a custom profile (having located the game exe) and skips its optimisations there.

I'm still clueless about the tessellation options (mainly as I have no idea if pCARS uses it), what the info means when you hover the mouse over "morphological filtering", the EQAA modes, etc.

Surface format optimization allows the driver to swap in smaller surface formats for performance. Nvidia does the same thing. The Xbox does the same thing (HDR format).

On GCN cards the performance difference is almost exactly 0%. Just tested it out benching Hitman: Absolution. With it on I got 44/52/103; with it off I got 44/52/96 (min/avg/max). You can pretty much ignore the max: I could run it 1,000 times and it would always come up with something different, since it counts the period during the fadeout where frame rates sometimes skyrocket.
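
For what it's worth, here's a toy Python sketch (the FPS samples are made up) showing why the max is meaningless when the fadeout is counted, while min/avg barely move:

```python
# Hypothetical per-second FPS samples from a benchmark run; the last two
# entries mimic the fadeout, where rates briefly skyrocket.
samples = [52, 50, 48, 44, 51, 53, 49, 50, 96, 103]

steady = samples[:-2]  # drop the fadeout tail

for name, data in (("raw", samples), ("fadeout trimmed", steady)):
    print(f"{name}: min={min(data)} avg={sum(data)/len(data):.0f} max={max(data)}")

# raw: min=44 avg=60 max=103
# fadeout trimmed: min=44 avg=50 max=53
```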

There should be little to no visible image quality difference, as our monitors can usually only show 6 bits per color channel and AMD's optimization still works at 11 bits per channel (except blue, which uses 10).
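
To put numbers on that, here's a small Python sketch (my own, not AMD's code) of what truncating a float32 channel to the 6 explicit mantissa bits of an 11-bit float does; that's the format behind the 11/11/10 layout mentioned above. It ignores exponent-range and rounding details and only shows the precision loss.

```python
import struct

def truncate_mantissa(x, keep_bits):
    """Round a float32 down to `keep_bits` explicit mantissa bits.

    float32 has 23 mantissa bits; an 11-bit float keeps 6, a 10-bit
    float keeps 5. Shown for illustration of precision loss only.
    """
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    bits &= ~((1 << (23 - keep_bits)) - 1)  # zero the discarded low bits
    return struct.unpack("<f", struct.pack("<I", bits))[0]

c = 0.7123456
print(truncate_mantissa(c, 6))  # red/green precision -> 0.7109375
print(truncate_mantissa(c, 5))  # blue precision      -> 0.703125
```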
 

Despite saying I wouldn't be reinstalling... Anyway, looks like it may have been a one-off? Did a clean reinstall and went through the same steps above, but AF was not set to 8X and AA was not set to 4X in advanced.

Oh well.
 
Gregster, did you restart the game after applying the settings? I went to do the same thing...

Before restarting the game:

[screenshot: kBoZz2s.png]

After restarting the game:

[screenshot: f4fsUTy.png]

EDIT: This is with default settings in AMD CCC.

Anyway, the reason I took these screenshots was to compare contrast etc., and neither Gregster's nor mine look any different to my eyes on my 29UM65.
 

I don't think I did in the first video, but I can't honestly remember. I deffo did for the NVCP comparison vid, as I can remember recording and stopping the game to go and make a cuppa (and I switched the NVCP to Prefer Max Performance).
 
Wow, my bad, I was looking at the shadows haha.
Yeah, the top one has a lot less detail on the paint and/or blurs out a lot sooner,
right?
*shrug*
 
These are the exact same areas where Greg's screenshot looked like it was lacking detail and sharpness; I have highlighted the main differences:

[screenshot: R8stZgo.png]

Looks like a draw distance thing to me.
That's this noob's opinion lol.

Not sure what it's trying to prove tho.
 
Yes, the top one has considerably less detail and sharpness/clarity.

It isn't just draw distance, as even the black tiles on the right aren't anywhere near as detailed as in the bottom screenshot.

Well, the point is to show that it must just be down to the game, and Greg perhaps not restarting it after applying the settings on the TX :p However, he says he did restart the game for the NVCP settings comparison, so who knows...

The only way to know for sure is for Greg (or anyone else with an Nvidia GPU) to take screenshots in a variety of games comparing the NVCP settings. Like I said earlier on, I think it is just down to user error (not intentional) and/or just a BF4 issue.
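
If anyone does do that comparison, eyeballing screenshots is unreliable; a quick Pillow sketch like this (file names are placeholders) that diffs two shots of the same frame pixel by pixel would settle it:

```python
from PIL import Image, ImageChops  # pip install Pillow

# Placeholder file names: two screenshots of the same frame taken with
# different NVCP settings.
a = Image.open("quality.png").convert("RGB")
b = Image.open("high_performance.png").convert("RGB")

diff = ImageChops.difference(a, b)

# getbbox() returns None when the images are pixel-identical; otherwise
# it gives the bounding box of everything that changed.
box = diff.getbbox()
if box is None:
    print("Identical - the setting made no visible difference.")
else:
    print(f"Images differ inside {box}")
    diff.save("difference.png")  # brighter pixels = bigger differences
```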
 
Like it has been said, if you're going to do a lot of swapping between AMD and Nvidia cards, wouldn't it be worth buying two 120/250GB SSDs, putting the same OS on both, and just changing which graphics drivers you install: one with AMD drivers and one with Nvidia drivers? You could swap SSDs whenever you change cards.
 

Feel free to buy me those :D
 