
Image Quality AMD 6800XT vs nVidia 3080? Apples to Apples? Errrrrr

Just been watching this video and to me there seems to be a clear difference in all of the videos Bryan compares... and yes, I do realise there are slight differences in position which could account for some of this... all I'm doing here is highlighting what I can see from his video... and it made me think: I wish reviews gave us a direct picture-to-picture comparison, not just FPS... we need video quality reviewed as well!!! No point one game getting 200fps if the picture isn't as detailed as the one getting 150fps?

Or do you guys just think that, due to the slight differences in where the images are etc., there is NO DIFFERENCE in PQ between any of these cards? Could it be YouTube... could it be a combination of all of these? Just thought it might be an interesting discussion...

THE QUESTION IS: DO WE THINK WE'RE LOOKING AT EXACTLY THE SAME IMAGE WHEN WE'RE BENCHMARKING, WHETHER OR NOT YOU THINK THE SCREENSHOTS BELOW ARE VALID?

That's what the discussion here really is. Do AMD and nVidia really have an apples-to-apples render and output? If not, then benchmarks are an indicator only, and we also need analysis of the picture quality as well. Or do people simply just want fps nowadays? That does seem to be the measure of a card, and quality seems to come second now.
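Just to show what "analysis of the picture quality" could actually look like in a review (this is my own rough sketch, nothing from Bryan's video; it assumes numpy and two same-resolution captures loaded as RGB arrays, with synthetic frames standing in for real screenshots):

```python
import numpy as np

def frame_difference(a: np.ndarray, b: np.ndarray, tol: int = 8):
    """Return the mean absolute pixel difference and the fraction of
    pixels that differ by more than `tol` (0-255 scale) between two frames."""
    if a.shape != b.shape:
        raise ValueError("frames must be the same resolution")
    diff = np.abs(a.astype(np.int16) - b.astype(np.int16))
    mad = float(diff.mean())
    changed = float((diff > tol).mean())
    return mad, changed

# Synthetic stand-ins for two captures of the same scene:
rng = np.random.default_rng(0)
card_a = rng.integers(0, 256, size=(180, 320, 3), dtype=np.uint8)
card_b = card_a.copy()
# Make the top half of the second "capture" brighter, like a card
# that renders the scene punchier:
card_b[:90] = np.clip(card_b[:90].astype(np.int16) + 10, 0, 255).astype(np.uint8)

mad, changed = frame_difference(card_a, card_b)
print(f"mean abs diff: {mad:.2f}, pixels changed by >8: {changed:.1%}")
```

A reviewer could run something like this on captures from each card alongside the fps numbers; a near-zero result would settle the "is it the same image?" question for that scene.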


So, for example, I'd say that Shadow of the Tomb Raider looks brighter on the 3080 than on the 6800XT... more punchy somehow... the whites are brighter... and I'd say the two areas I've pointed at look slightly more defined on the 3080, which could come down to encoders... who knows. This raises the question: assuming the encoders Bryan is using here are exactly the same, does this mean we're suddenly not comparing apples to apples between the two cards? In Shadow I think the 3080 has the better picture: it's slightly brighter, and I think there's more detail in it. Watch the video in 4K and see what you think.
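The "brighter, whites are brighter" impression is something you could measure rather than judge by eye. A hypothetical sketch (my own, not anything Bryan does in the video), using the standard Rec. 709 luma weights on two synthetic frames where one is deliberately brighter:

```python
import numpy as np

def mean_luma(frame: np.ndarray) -> float:
    """Average Rec. 709 luma of an RGB frame (values 0-255)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return float((0.2126 * r + 0.7152 * g + 0.0722 * b).mean())

# Two synthetic frames, the second uniformly brighter by 12 levels:
base = np.full((72, 128, 3), 120, dtype=np.uint8)
brighter = np.clip(base.astype(np.int16) + 12, 0, 255).astype(np.uint8)

print(mean_luma(base), mean_luma(brighter))  # the second value is higher
```

If two cards really do output different average luma for the same scene and settings, a check like this would show it with a number instead of "somehow more punchy".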

Dirt 5 looks like a different game between the two... at some points it's like it's not even the same game... all the shadows are different, the definition on the cars is different... I suppose there's RT at play in this now... but look at this screenshot... the problem with this one is that we've now got three techs all being employed in this title... but I'd say in the screenie below the 6800 is the worst and the 5700XT is the cleanest? Also, does this invalidate the benchmark if it's showing different things? I'd say it does...

Bryan4.JPG




Bryan1.JPG



Obviously tons of discussion could ensue here, and I'm not trying to start a war between the two as I think they're both amazing cards. I'm not saying one card is better than the other here, BUT with all the tech these cards have, to be seeing differences as blatant as this... I'm not convinced we're always seeing a fair comparison when benchmarking? I've just looked closely at Horizon Zero Dawn and here are my two examples where I feel the 3080 just isn't as defined as the 6800XT... I see more detail in the 6800XT screen with my own eye? However, the position of the frame is ever so slightly different; could that cause this in the first picture below? Same with the second picture, it's slightly out of sync... so could that be the explanation?

Bryan.JPG


Bryan2.JPG
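On the "slightly out of sync" question: one way to rule misalignment in or out (again a hypothetical sketch, not anything from the video) is to brute-force small shifts of one capture against the other and look at the residual at the best shift. If a few pixels of offset explain the difference, the residual collapses to near zero; if it stays large, there's a real picture difference:

```python
import numpy as np

def best_shift(a: np.ndarray, b: np.ndarray, max_shift: int = 4):
    """Brute-force search for the (dy, dx) shift of `b` within
    +/- max_shift pixels that minimises the mean absolute difference
    to `a`. Returns (dy, dx, residual)."""
    best = (0, 0, np.inf)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(b, dy, axis=0), dx, axis=1)
            mad = np.abs(a.astype(np.int16) - shifted.astype(np.int16)).mean()
            if mad < best[2]:
                best = (dy, dx, float(mad))
    return best

# Synthetic test: the same image, shifted 3 pixels right.
rng = np.random.default_rng(1)
scene = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
offset = np.roll(scene, 3, axis=1)

dy, dx, mad = best_shift(scene, offset)
print(dy, dx, mad)  # recovers the 3px offset with residual 0.0
```

So if someone grabbed matching frames from the two cards, a check like this would tell you whether "slightly out of sync" is the whole story or not.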


So have a watch of the video, see what you guys think, and maybe do some checks yourself... anyhow, thought I'd raise this again and have a discussion on it...
 
You do realise that video is showing the difference in the cards' built-in streaming encoders and has nothing to do with what you will see on your own screen playing the game?
 
You do realise that video is showing the difference in the cards' built-in streaming encoders and has nothing to do with what you will see on your own screen playing the game?
Yes I do, which is what I said in the original post: that could be one of the defining reasons. However, like I also said, more importantly it's whether we SHOULD be including PQ in benchmarks so we ARE comparing apples to apples.

These were just examples. DIRT 5 isn't: that is a visible difference in the benchmark on AMD and nVidia cards, from how I'm reading this review, which he seemed to highlight when he captured the game... i.e. yes, using the decoder there is less detail, however he states that the benchmark is showing differently, hence he's removing it from his benchmarks... Also, he states he's using a Corsair capture card for the SOTTR demo and the other stuff to ensure there are no differences... but I thought I could see differences, hence this thread. So if that's the case, wouldn't they be using the same encoder? i.e. not one internal and specific to either AMD or nVidia? Maybe I've got the wrong end of the stick, but to me, if he's using a separate decoder that shouldn't affect the captures?

At 2:40, where he explains how he is capturing, there should be NO DIFFERENCE in what the end user sees??? Am I missing something? All I'm saying is that, based on my understanding there, I can see an image difference which could be for multiple reasons... I'm just asking whether we think there is a difference. We've known in the old days that AMD and nVidia did render things differently... remember in 2008 and the nVidia fiasco in 3DMark? Not saying that's actually purposely happening now in any shape or form, however there could be "optimisations" where one output is in fact superior or inferior to the other... and I think with the tech we're now getting to a point where this isn't just fps vs fps... we need to look at image quality nowadays!
 
The only difference I can see is in Dirt 5, which is a brand new AMD-sponsored game. I'm pretty sure nVidia will fix the wonky shadows in their next driver release and then the games will look the same.
 
Watching the very first couple of minutes of the video, he does mention that the encoder on the 6800 XT does poorly even compared to the previous 5700 XT, and Dirt 5 was used to show this. LTT mentioned this too: the 6000 series encoder is just completely pants.

However, for the rest of the video he uses a capture card on a different PC with a CPU encoder. Everything from that point onwards is pretty much the same to the eye, and when he returns to Dirt 5 he says he plans to remove the game, as it's too inconsistent between runs even on the same GPU to properly use for comparison.
 