At least Greg has bothered to try to debunk it rather than aimlessly going in circles speculating about the probable cause. Personally I don't think there's much worth looking into here, but there aren't many end users with both GPUs. On Sunday I'll attempt to capture screenshots at 4K in various games and scenarios and see if there's anything of note with fidelity.
Uncompressed screen grabs should show any differences, if there are any. Image fidelity can be directly manipulated through driver presets. For example, texture filtering quality left at its default may be very marginally dissimilar to the other vendor's. LOD parameters may also be swayed in either direction. I think both parties have learnt from the age-old mistake of compromising IQ for performance, and it's certainly a lot easier to pick these things up now than it was 5 or 10 years ago. If it weren't, then things like overriding tessellation factors and geometry, which is a direct result of manipulating source code, would be done far more often without user intervention. This is applicable to all aspects of the PC industry; people are not as easily led.
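If anyone wants to sanity-check the captures themselves, here's a minimal sketch of a pixel-level diff between two lossless grabs of the same scene, one per card. The file names, the threshold, and the 8x amplification of the difference map are my own assumptions, not output from any vendor tool.

```python
# Minimal sketch: pixel-by-pixel comparison of two uncompressed (lossless PNG)
# screenshots of the same frame. File names and threshold are hypothetical.
from PIL import Image
import numpy as np

def compare_grabs(path_a: str, path_b: str, threshold: int = 0) -> None:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Captures must be the same resolution")

    diff = np.abs(a - b)                      # per-channel absolute difference
    changed = diff.max(axis=2) > threshold    # pixels exceeding the threshold
    pct = 100.0 * changed.mean()
    print(f"Pixels differing: {pct:.3f}%  (max channel delta: {diff.max()})")

    # Amplify the difference map so subtle filtering/LOD shifts become visible.
    Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_map.png")

compare_grabs("vendor_a_4k.png", "vendor_b_4k.png")
```

Obviously the captures need to be of an identical frame (a fixed benchmark or a paused scene) for the diff to mean anything; any difference in camera position or frame timing will swamp whatever the drivers are doing.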