
I didn't know anything about the myth TBH. It's my own personal experience using different GPUs on the same monitor to play the same game. I can improve the Nvidia image by using the sharpen filter, but that comes with a performance penalty. I'm not saying I wouldn't want an Nvidia card (I still have a 3090 for VR), I'm just saying I prefer the AMD image.
Exactly, I work with a few people who have come to the same conclusion having used both manufacturers' cards: they prefer the way AMD displays the image. In theory there should be no difference, but there is. As you say, maybe it's the Nvidia image not being as sharp. Do they use more compression techniques?
 
No - that's not how it works. We have display specifications (sRGB is the most common) - GPUs *have* to adhere to these standards otherwise it would be impossible to set up the colour-calibrated workflows necessary for professional work - with the right *monitor* calibration, sRGB will look *exactly* the same on Apple, Windows, Linux, Android etc. It has to, otherwise the whole house of cards falls apart.

Users are free to make whatever changes they desire in either display or monitor settings - but out-of-the-box, sRGB on an AMD card looks identical to sRGB on an nvidia one.
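The "it has to look the same" point can be made concrete: the sRGB transfer curve is fixed by the spec (IEC 61966-2-1), so any compliant device decodes the same 8-bit code value to the same amount of light, regardless of GPU vendor. A minimal Python sketch of the standard decode curve (illustrative only):

```python
def srgb_to_linear(v):
    """Decode an sRGB-encoded value in [0, 1] to linear light,
    using the piecewise curve defined in IEC 61966-2-1."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# The curve is fixed by the standard, so mid-grey (code 128 of 255)
# decodes to the same linear value (~0.216) on any compliant device.
print(srgb_to_linear(128 / 255))
```

Because every compliant GPU applies this same fixed curve, identical pixel values produce identical light output; any visible difference has to come from settings, not the silicon.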
 
Wasn't that a thing at least 12 years ago, where ATI was reported to have better image quality than Nvidia? Perhaps it's a holdover from those days, when in reality they are both the same these days due to standards such as the sRGB already mentioned.
 
Having owned both, my experience was that AMD used slightly more vivid colours. So out of the box the AMD image looks nicer to some people. Sharpness etc. was not an issue.
 
Well, as I said earlier, AMD and nvidia have/had different ways of scaling the boot screens - nvidia uses bilinear filtering, which gives a soft look, whereas AMD uses nearest neighbour, which gives a super-sharp pixelated effect. That's just at boot though - once they're in Windows they look the same (incidentally, sRGB's been around since 1996). In my opinion (and I'm an editor/colourist who's used both nvidia and AMD for decades) it's a placebo effect, much like the 'perceived' differences between 44,100, 48,000 and 96,000 Hz audio (that's another can of worms opened - here come the pitchforks :D )

*Edit* good thread on Reddit here: https://old.reddit.com/r/Amd/comments/n8uwqk/amd_vs_nvidia_colors_concrete_proof_of_differences/
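To illustrate the boot-screen point above: the two scaling methods behave very differently at a hard edge. A quick Python sketch (illustrative only, nothing like the actual driver code) upscaling a one-dimensional row of pixels shows why nearest neighbour looks pixelated and bilinear looks soft:

```python
def upscale_nearest(row, factor):
    """Nearest-neighbour: each source pixel is simply repeated,
    preserving hard edges (the 'super-sharp pixelated' look)."""
    return [row[int(i / factor)] for i in range(len(row) * factor)]

def upscale_bilinear(row, factor):
    """Linear interpolation between neighbouring source pixels,
    which blends across edges (the 'soft' look)."""
    out = []
    for i in range(len(row) * factor):
        pos = i / factor
        left = int(pos)
        right = min(left + 1, len(row) - 1)
        t = pos - left
        out.append(round(row[left] * (1 - t) + row[right] * t))
    return out

edge = [0, 255]  # a hard black-to-white edge, one pixel of each
print(upscale_nearest(edge, 4))   # → [0, 0, 0, 0, 255, 255, 255, 255]
print(upscale_bilinear(edge, 4))  # → [0, 64, 128, 191, 255, 255, 255, 255]
```

Same source pixels, very different results: nearest neighbour keeps the edge razor-sharp, bilinear smears it across four pixels.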

Tim (the monitors guy) from Hardware Unboxed chimed in with this:

Part of my monitor review workflow involves testing monitors on both Nvidia and AMD GPUs. Two separate test systems, both running default settings in their respective control panels.

Currently the Nvidia system has an RTX 3090 and the AMD system an RX 5700 XT

I've never spotted a difference between them in color reproduction. I've measured it using my tools in the desktop, web browsers, games. Taken side by side photos and captures. Never spotted any differences. They produce identical images.

Because this comes up every so often I did look into it to see if it was worth making a video on but the conclusion was there was no difference so it wasn't worth making a video. Since I can't reproduce it I have to assume it's some sort of configuration issue.

EDIT: Back in the day I used to see this occasionally when Nvidia would accidentally default to the wrong RGB range (limited instead of full), but in this particular case apparently that is not the problem, so I don't really know how the difference is happening here. And those limited/full range issues were a while ago - it would have to be several years now.

Tim raises a point I'd forgotten - sometimes (especially when using an HDMI cable to a 1080p monitor) the wrong output range can be set in the nvidia control panel. If you think your colours look dull and muted (it's really obvious, not a subtle thing), it's worth checking that the Output Dynamic Range is set to Full:

[Screenshot: nvidia control panel showing the Output Dynamic Range setting]
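The limited-vs-full mismatch is easy to see with numbers: limited ('TV') range squeezes 0-255 into 16-235, so if the driver sends limited but the monitor still expects full range, blacks and whites never reach the ends of the scale. A rough Python sketch of the mapping:

```python
def full_to_limited(v):
    """Map a full-range (0-255) value into limited/'TV' range (16-235),
    as a driver does when outputting limited-range RGB."""
    return round(16 + v * (235 - 16) / 255)

# If the monitor interprets these values as full range, black (0)
# arrives as 16 and white (255) as 235 - the washed-out, 'dull and
# muted' look described above.
print(full_to_limited(0), full_to_limited(255))  # → 16 235
```

That lifted black level and clipped-short white are exactly why the fault is "really obvious, not a subtle thing".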
 
Finally had time to check it in a few games. Great piece of kit - 1440p running at 280W peak at 62°C :p

The only thing I don't like is that I had to add custom resolutions to play CS:GO in 4:3
 
I think (it's been a while since I last did it) you go to AMD Rewards, set up an account and install the little verification app. It runs on your PC and verifies the card, and then you can get the games. So I think you have to do it once you're set up and running.
You should get an email with your code in it. This is what the email says:

This promotion requires you to have purchased an eligible AMD Radeon GPU installed before you attempt to validate your coupon code.

To Redeem your Game Key from AMDREWARDS.COM please follow the steps below:
Step 1: Go to www.amdrewards.com
Step 2: Create an Account and Sign-in
Step 3: Enter your coupon code, then follow the step-by-step process to redeem your reward(s).
Step 4: Check the MY REWARDS page to view or activate your Game Keys.


**** VERY IMPORTANT *****

The AMD Rewards Coupon Code must be redeemed by 2023-03-04 11:59:59 PM (EST)

Appreciate the time. Will keep you posted with when it's built. Hoping for the end of next week.
 
Went to redeem my coupon code but it states my code has reached its limit??
I put in a support ticket to AMD.
 
I think you guys need to clarify what you mean by image quality. If talking about a 2D Windows environment, I would agree there should be no difference between vendors, unlike the old days with analogue VGA cables, where the RAMDAC on the video card could make a difference. However, in 3D games there may well be differences due to the default choices the drivers make for things like sharpening, filtering and LOD. That said, I see these as minor issues which have been greatly reduced compared to 10-20 years ago, when there were a lot more "hacks" in 3D graphics, such as how vendors implemented anisotropic filtering.
 
Also, I am unsure how true it is, but apparently the latest WHQL driver is the one to use for stability - it seems folks who go for the hotfixes get instability.
I may be wrong, but I have a gut feeling I am right too: if you experience instability, there could also be issues elsewhere with your hardware.
 
On the topic of the 6900 XT having poop RT performance, I ran Forza 5 at 1440p native extreme with extreme RT and was always above 100 FPS.
With FSR 2.2 I was in the 130+ FPS range.

I know it is just 1 game but still...
 

The difference is that AMD defaults to a more saturated image. If you compare the default AMD image with the default Nvidia image side by side, the Nvidia image will appear "flat" in comparison, which is why people say that AMD is better. It's not better - you can get the same effect on Nvidia cards by increasing the digital vibrance.
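A "more saturated default" is essentially a saturation multiplier. A simplified Python sketch (an illustration of the idea, not the actual driver behaviour of either vendor's vibrance control) using the standard-library colorsys module:

```python
import colorsys

def boost_saturation(rgb, factor):
    """Scale the HSV saturation of an 8-bit RGB triple - similar in
    spirit to a 'vibrance' slider (hypothetical simplified version)."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(s * factor, 1.0)  # push saturation up, clamped to the max
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))

# A muted red becomes noticeably more vivid at the same hue/brightness:
print(boost_saturation((180, 120, 120), 1.5))  # → (180, 90, 90)
```

Same pixel data, different default saturation: that alone is enough to make one image look "punchier" side by side, which fits the placebo/settings explanation rather than any hardware difference.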
 
My driver defaulted to 4:4:4 RGB and 8BPC, I don't see a single difference in image quality between the 3070 and the 6900 XT, just more oooomph.
 
I think in games which are developed with the console in mind (consoles have similar ray tracing abilities), these cards will be fine. For example Far Cry 6, where I guess the limiting factor is the rasterisation performance, the RX 6900 XT outperforms the comparable RTX cards with ray tracing enabled. Even games which are heavily laden with RT effects should be able to be scaled back to run at an acceptable level, especially with FSR and/or running at lower resolutions than 4K. That was the deciding factor that made me order one of these. :cool:
 

That particular game uses next to no RT. I think it only uses RT in the Garage and vista modes, not in actual racing. There is a mod to enable it while racing.
 