NVIDIA 4000 Series

Yeah, on reflection I think I'm going to go with Hub's conclusions over yours...

:D
You can go with whoever's conclusions you like, but the numbers are not lying. It is a fact that a 3080 at 1440p or 4K High almost matches (we are talking 3 fps here) a 6800xt at 1440p or 4K Medium. Of course you are going to deflect the point because you don't really care about the facts, only about proving your preconceived notion, but the numbers are out there, man. You cannot disagree with facts, I'm afraid.

If you want to take TLOU as the be-all and end-all of GPU comparisons, the 3080 user will have a better time than the 6800xt user. Fact.
 
What difference does it make whether AMD was already on 16GB? I care about the results, not the spec sheets. A 3080 at 1440p High in TLOU1 matches a 6800xt at 1440p Medium. Those are the facts. So it aged much better while still having substantially better RT performance and a better upscaler.

I originally wanted a 6800xt as I preferred AMD back then (I'd had AMD GPUs since the 3870) and it was also slightly cheaper (RT and DLSS weren't as prevalent back then, so they weren't a big factor at the time). But in the end I'm so glad AMD didn't offer them for MSRP in the UK and forced me to go with the 3080 instead, given how much better the card has aged overall, especially around RT and DLSS compared to RDNA 2, DLSS more so. FSR is still an inferior option in every way possible; it's crazy to think that AMD still haven't caught up on this after all this time. To be fair, their solution is good, but as per usual for them, the way they have handled it, i.e. chuck it over the fence and do as you please, hasn't helped. That and Nvidia GPUs just work better with the G-Sync module and HDR. Literally the only reasons to have gone with RDNA 2 are if you never wanted to use RT, or only played COD as your main game, or somehow managed to get it for cheaper than the competing Nvidia GPUs.
 
I originally wanted a 6800xt as I preferred AMD back then (I'd had AMD GPUs since the 3870) and it was also slightly cheaper (RT and DLSS weren't as prevalent back then, so they weren't a big factor at the time). But in the end I'm so glad AMD didn't offer them for MSRP in the UK and forced me to go with the 3080 instead, given how much better the card has aged overall, especially around RT and DLSS compared to RDNA 2, DLSS more so. FSR is still an inferior option in every way possible; it's crazy to think that AMD still haven't caught up on this after all this time. To be fair, their solution is good, but as per usual for them, the way they have handled it, i.e. chuck it over the fence and do as you please, hasn't helped. That and Nvidia GPUs just work better with the G-Sync module and HDR. Literally the only reasons to have gone with RDNA 2 are if you never wanted to use RT, or only played COD as your main game, or somehow managed to get it for cheaper than the competing Nvidia GPUs.
I'm very biased against FSR because I hate sharpening with a passion; it makes me dizzy / want to throw up after using it for 10 minutes, so I can't really rely on my personal opinion on the matter. But even among reviewers, the prevailing view is that DLSS > FSR, so it's not just me.
 
What I find hilarious, though, is that when I said AMD CPUs (especially those 6-core Zen 4 parts) struggle in TLOU compared to 2-year-old i5s, the answer I got was "this game is buggy and shouldn't be used as a comparison". But here we are on the Nvidia thread, basically sh**** on Nvidia cards, despite the fact that they offer a better experience than their AMD counterparts. That is just nuts.
 
I'm very biased against FSR because I hate sharpening with a passion; it makes me dizzy / want to throw up after using it for 10 minutes, so I can't really rely on my personal opinion on the matter. But even among reviewers, the prevailing view is that DLSS > FSR, so it's not just me.

100%

Largely why I never use FSR even if I need the extra performance (where DLSS hasn't been included because AMD have blocked it), as not only does a sharpened image look awful, it also increases the visibility of all the temporal artifacts.

Seems AMD fans like the over-sharpened look though, as some even whack Radeon Image Sharpening up to 80+% in the control panel :o
 
100%

Largely why I never use FSR even if I need the extra performance (where DLSS hasn't been included because AMD have blocked it), as not only does a sharpened image look awful, it also increases the visibility of all the temporal artifacts.

Seems AMD fans like the over-sharpened look though, as some even whack Radeon Image Sharpening up to 80+% in the control panel :o
Yeah, I've seen people do that with RIS. But again, that's an issue of personal preference, so I don't want to call it better or worse. It's not even that I dislike sharpening as such; it just makes me nauseous.
 
LMAO, that's the best you can do? Primary school level...

:cry:

brexit-sinking.gif
 
You can go with whoever's conclusions you like, but the numbers are not lying. It is a fact that a 3080 at 1440p or 4K High almost matches (we are talking 3 fps here) a 6800xt at 1440p or 4K Medium. Of course you are going to deflect the point because you don't really care about the facts, only about proving your preconceived notion, but the numbers are out there, man. You cannot disagree with facts, I'm afraid.

If you want to take TLOU as the be-all and end-all of GPU comparisons, the 3080 user will have a better time than the 6800xt user. Fact.

Yeah, when faced with a professional, world-renowned tech site with millions of views a year (as well as years of evidence regarding VRAM), or a serial coper with obvious biases, I think I know whose opinion I'll trust, thanks!

:cry:
 
What I find hilarious, though, is that when I said AMD CPUs (especially those 6-core Zen 4 parts) struggle in TLOU compared to 2-year-old i5s, the answer I got was "this game is buggy and shouldn't be used as a comparison". But here we are on the Nvidia thread, basically sh**** on Nvidia cards, despite the fact that they offer a better experience than their AMD counterparts. That is just nuts.

Standard for here; remember, AMD are the good guys and white knights of the PC gaming scene....

Just look at Intel and CPU security flaws: remember when Intel had a ton, that thread exploded and there would be a new thread for every security flaw found. Then recently, when AMD had a **** ton discovered, there was no mention of it on here, so I created a thread and there was hardly any activity, and what activity there was treated it as a "non-issue".... :cry:

Personally, when/if I upgrade CPU and mobo, I'm going back to Intel; they just have fewer issues and provide a better overall experience. Don't get me wrong, the 5600X and especially the 5800X3D are great, but that's when there aren't issues with CPU cores not being properly utilised. Sadly there have been too many games recently where my 5800X3D isn't getting properly utilised, e.g. Spider-Man.

Yeah, I've seen people do that with RIS. But again, that's an issue of personal preference, so I don't want to call it better or worse. It's not even that I dislike sharpening as such; it just makes me nauseous.

Oh yeah, it's completely subjective at the end of the day, but I just find it amusing: those people go on about ghosting/blur with DLSS/FSR, yet they don't notice the shimmering, aliasing etc., which are vastly increased when sharpening is applied and which IMO are far more distracting/noticeable in motion.
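(Side note to illustrate rather than assert the mechanism: below is a minimal sketch of a generic unsharp-mask sharpener, not AMD's actual RIS/CAS algorithm, showing how any sharpening pass boosts small frame-to-frame differences like shimmer. It assumes NumPy and SciPy are available; the frame data is made up.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp(img: np.ndarray, amount: float = 0.8, sigma: float = 1.0) -> np.ndarray:
    """Generic unsharp mask: add back an amplified copy of the image's high frequencies."""
    return img + amount * (img - gaussian_filter(img, sigma))

rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))      # a made-up "frame"
frame_b = frame_a.copy()
frame_b[32, 32] += 0.05             # tiny single-pixel flicker between frames (shimmer)

diff_raw = np.abs(frame_b - frame_a).max()
diff_sharp = np.abs(unsharp(frame_b) - unsharp(frame_a)).max()
print(diff_raw, diff_sharp)         # the flicker is noticeably larger after sharpening
```

The sharpened pair of frames differ by more than the raw pair, which is exactly the "shimmer gets worse in motion" effect.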

Reminds me of another example of "it's cool to **** on Nvidia": DLSS, when it had/has an ever so slight amount of blur/ghosting, yet no one bats an eyelid when the default/native TAA has even worse ghosting, or better yet when FSR has worse ghosting than DLSS in the majority of games :p

@Darujhistan

You never did answer this....

Proven correct in what? That broken/unoptimised games are causing problems on PCs of all specs? The only reason GPUs with 16GB+ don't experience the issues to the same extent, such as in TLOU, is purely because they are able to avoid the actual core issue/fault here, and it all comes down to consoles having a very different hardware setup to PC now, so if the PC version doesn't get the right treatment, of course we're going to see issues. Unless, of course, you again think that these games (Hogwarts, Forbidden and TLOU) are perfectly well optimised for PC and there is nothing to fix/improve?
 
You can go with whoever's conclusions you like, but the numbers are not lying. It is a fact that a 3080 at 1440p or 4K High almost matches (we are talking 3 fps here) a 6800xt at 1440p or 4K Medium. Of course you are going to deflect the point because you don't really care about the facts, only about proving your preconceived notion, but the numbers are out there, man. You cannot disagree with facts, I'm afraid.

If you want to take TLOU as the be-all and end-all of GPU comparisons, the 3080 user will have a better time than the 6800xt user. Fact.

The 3080 is better than the 6800xt at lower presets but drops below at Ultra settings. Why would that be?
 
Yeah, on reflection I think I'm going to go with Hub's conclusions over yours...

:D
You might want to revise that line of thinking, as your 6700XT can just barely maintain a 60fps average and dips below 60fps at 1080p ultra settings in The Last of Us.

So by your and your friends' standards the 6700XT is now, or will very soon be, obsolete. As you said it's a lie that a card will run out of grunt before VRAM becomes an issue, maybe it just needs more VRAM.
 
Yeah, when faced with a professional, world-renowned tech site with millions of views a year (as well as years of evidence regarding VRAM), or a serial coper with obvious biases, I think I know whose opinion I'll trust, thanks!

:cry:
You can trust whoever you want, but the numbers do not change. The 3080 does in fact perform similarly at High settings to the 6800xt at Medium settings. That is a FACT. PERIOD. Your opinion, or anyone else's, is completely and utterly irrelevant. The fact that you are not addressing the actual point but instead making arguments from authority absolutely proves you know you are wrong; you just don't want to admit it.

It's also utterly hilarious that you claim Hardware Unboxed are the experts, but then you disagree with me when I say DLSS > FSR. You know that's also their opinion, right? But I guess they are experts only when it suits you :D What a biased individual.
 
You can trust whoever you want, but the numbers do not change. The 3080 does in fact perform similarly at High settings to the 6800xt at Medium settings. That is a FACT. PERIOD. Your opinion, or anyone else's, is completely and utterly irrelevant. The fact that you are not addressing the actual point but instead making arguments from authority absolutely proves you know you are wrong; you just don't want to admit it.

It's also utterly hilarious that you claim Hardware Unboxed are the experts, but then you disagree with me when I say DLSS > FSR. You know that's also their opinion, right? But I guess they are experts only when it suits you :D What a biased individual.


The 3080 is better than the 6800xt at lower presets but drops below at Ultra settings. Why would that be?

:cry:
 
The 3080 is better than the 6800xt at lower presets but drops below at Ultra settings. Why would that be?

:cry:
Because it runs out of VRAM. Why does it matter? In the same way, the 6800xt runs out of grunt (but I guess that's fine because AMD), so it runs worse on lower presets, you know, the presets you would actually use on 3-year-old cards. Why don't you complain about that?

I mean, you haven't discovered America here; I've said it multiple times in the past. On settings that you won't actually play at because your framerate will be low (at 1440p ultra both cards drop to even below 60 fps), the 6800xt IS better. So sure, if you enjoy 55 to 65 fps, don't like RT and have no use for DLSS, 6800xt > 3080. I don't like playing at 55 to 65 fps, so the 3080 at HIGH settings gets the same performance as the 6800xt at MEDIUM. Therefore it offers a better experience by my standards. Go dodge the point again so you don't have to admit that this is Nvidia fine wine at its greatest. It matches its competition at higher quality presets, FACT.
 
Let me put it in pictures; below is Cyberpunk WITHOUT RT at 1080p and 4K respectively.


At 1080p the 6800XT has better lows and similar averages.

1080p.jpg
Then at 4K the 6800xt... runs out of VRAM and we have a whopping 20% swing in the 1% lows: from being 7% ahead at 1080p to being 13% behind at 4K. Maybe AMD should have put 24GB of VRAM on their 6800xt, right? But this is fine, it's an AMD card after all :D

4k.jpg
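(Just to spell out the arithmetic behind that swing figure, a quick sketch; the fps numbers below are hypothetical, picked so the gaps land near the quoted +7% / -13%, and are not taken from the charts.)

```python
def relative_gap(amd_fps: float, nvidia_fps: float) -> float:
    """Percentage by which the 6800xt leads (positive) or trails (negative) the 3080."""
    return (amd_fps / nvidia_fps - 1.0) * 100.0

# Hypothetical 1% low figures, for illustration only:
gap_1080p = relative_gap(amd_fps=107.0, nvidia_fps=100.0)   # ~ +7%
gap_4k    = relative_gap(amd_fps=43.5,  nvidia_fps=50.0)    # ~ -13%

swing = gap_1080p - gap_4k                                   # ~ 20 percentage points
print(f"{gap_1080p:+.0f}% at 1080p, {gap_4k:+.0f}% at 4K, swing of {swing:.0f} points")
```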
 
The 3080 is better than the 6800xt at lower presets but drops below at Ultra settings. Why would that be?

I haven't looked too much into the 3080 10GB for TLOU (it's been playing very well for me personally since the last patch and since I stopped switching settings mid-game), but from a quick look it seems that the GPUs with equivalent grunt but more VRAM are also having to drop settings and/or use a higher DLSS/FSR preset to hit the FPS target I would want:

[benchmark screenshots]
But a question for you and @Darujhistan: why is the game crashing on all PCs and performing relatively poorly regardless of VRAM/spec? And why is it that the 8GB GPUs are ******** the bed even at low resolutions such as 1080p? Surely 8GB should be enough for this res? Could it just be that the game has not been properly optimised for PC configurations and/or there are bugs/issues that need fixing by the game developers?

Here's a hint or several:

[screenshots]
From TPU:

The AMD Radeon driver also crashed several times during testing, and I've seen numerous reports of DLSS causing additional crashes, when enabled. The developers have already pushed out several game updates, and it seems that they are serious about fixing these problems quickly. Let's hope they can resolve them soon, so that people can enjoy this masterpiece of a game.

But the game's not at fault, right? Just need to drop another ££££ to get a slightly less problematic experience, right?

:cry:
 
So in other words, The Last of Us will be yet another game that comes out in a bad state and will get patched, and the performance will become more reasonable, yet it won't be acknowledged by the people that troll these threads. See Hogwarts getting patched and the 3080 beating some of AMD's best cards.

The Last of Us will have to be fixed, as AMD are giving the game away free with every card from the 6500XT 4GB ( :p) and up, and it looks pretty poor at the moment to be giving away a game for free that runs so badly on some of these cards.
 