Are RX 6800 XTs not wanted?

Look at the 4K results and I stand by my statement.

I didn't say anything about a 3090. I was on about a 3080 @ 4K - it keeps up with, and sometimes surpasses, a 6900 XT... like I said.

Aren't they all AMD-favoured games that you cherry-picked?

So you didn't watch any review and ignored every single one of the outcomes, insisting that your own opinion is correct. OK then. LTT, GN, JayZ and HUB (along with pretty much all the others except the bought-and-paid-for techtubers) say the same thing - it depends on the game as to which highest-end card wins, as the 3090 and 6900 XT go blow for blow.
 
Not to mention when you add dlss into the mix.....

FSR is great with the two highest quality settings at 4K but, sadly, it's not getting into the right games: 1. games people care about; 2. games where it is really needed (not AMD's fault, though, if said games are Nvidia-sponsored); 3. it requires the base game's AA/image quality to be very good, otherwise it will just amplify poorly implemented AA artifacts and issues.

Aside from the fact that FSR is still very young on the market (soon Intel's solution will have the same problem) and that it took Nvidia around three years to get where they are now - it's still very subjective. Aside from Cyberpunk 2077 (which I've only just started playing, as only now does it feel playable), I have zero games that need DLSS on my 6800 XT to run properly at 1440p ultrawide (so not far off 4K). CP2077 could really use FSR too (CAS is nice but certainly not the same quality) - though going down from Ultra to Very High kept it above 60 FPS at all times, with minimal loss of visual fidelity (as in, I can't see a difference without a magnifying glass on still frames).
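For reference, AMD's published FSR 1.0 quality modes each render internally at a fixed fraction of the output resolution before upscaling. A minimal Python sketch of that arithmetic (the mode names and the rounding are illustrative choices here, not AMD's API):

```python
# Per-axis scale factors AMD documents for FSR 1.0 quality modes.
FSR_SCALE = {
    "ultra_quality": 1.3,
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution FSR renders at, for a given output size and mode."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# "Quality" mode at 4K renders internally at 1440p:
print(render_resolution(3840, 2160, "quality"))  # (2560, 1440)
```

So "Quality" at 4K is effectively a 1440p render upscaled, which is why the two highest modes hold up well at that output resolution.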

People tend to forget that "Ultra" settings in games are added "for the future", not for current-gen hardware - that's nothing new, it's been like that for many years now. They kill FPS with hardly any visual difference in many games, just so one can come back in a few years on a new GPU and say "Nice, I can finally run it on Ultra!" - and then stop playing after two minutes, as even that Ultra usually looks nothing like "Very High" in modern AAA games of that time. :)

EDIT: RT is a different matter, but I simply don't care about it in its current state - maybe in 10-20 years, when it matures, games stop resembling highly polished mirrors, and GPU performance catches up with its requirements.
 
I have a 3080, but if I hadn't, I wouldn't even consider a 6800 XT. When dropping this kind of money on a GPU, I want it to do more than just gaming. Nvidia has lots of features that AMD doesn't, and being marginally faster in non-ray-traced games is not going to make up for that.
 
DLSS is the same (the DXR-enabled list is substantially shorter) - the list of games supporting it might be relatively long, but when 90% of them are indie titles, hardly anyone cares.

It just so happens that a lot of those indie games are very good and extremely demanding (because of ray tracing), though. For example, I'm currently playing The Ascent and it's probably my GOTY at the moment; ray tracing really adds to the visuals. There are also a lot of big triple-A games with DLSS, and as shown by the latest Microsoft and Sony game shows and Nvidia's YouTube channel, a lot of upcoming games are going to push graphics hard and utilise ray tracing. So if PC gamers want the best visual experience at 4K/60, 4K/120 Hz, 1440p/144 Hz etc., we need DLSS, FSR and the like added to games.

Aside from the fact that FSR is still very young on the market (soon Intel's solution will have the same problem) and that it took Nvidia around three years to get where they are now - it's still very subjective. Aside from Cyberpunk 2077 (which I've only just started playing, as only now does it feel playable), I have zero games that need DLSS on my 6800 XT to run properly at 1440p ultrawide (so not far off 4K). CP2077 could really use FSR too (CAS is nice but certainly not the same quality) - though going down from Ultra to Very High kept it above 60 FPS at all times, with minimal loss of visual fidelity (as in, I can't see a difference without a magnifying glass on still frames).

People tend to forget that "Ultra" settings in games are added "for the future", not for current-gen hardware - that's nothing new, it's been like that for many years now. They kill FPS with hardly any visual difference in many games, just so one can come back in a few years on a new GPU and say "Nice, I can finally run it on Ultra!" - and then stop playing after two minutes, as even that Ultra usually looks nothing like "Very High" in modern AAA games of that time. :)

EDIT: RT is a different matter, but I simply don't care about it in its current state - maybe in 10-20 years, when it matures, games stop resembling highly polished mirrors, and GPU performance catches up with its requirements.

People always say this - "but so-and-so has been at it for such-and-such a time" - but at the end of the day, as a consumer who just wants to enjoy games now with the best overall IQ and good performance, I and most other "gamers" don't really care who has been at it the longest; we just care about the here and now. AMD's, Intel's or whoever's version may win out in however many months or years, but what good is that if a game comes out now, where I want to whack the settings up and enjoy it right now?
 
It just so happens that a lot of those indie games are very good and extremely demanding (because of ray tracing), though. For example, I'm currently playing The Ascent and it's probably my GOTY at the moment; ray tracing really adds to the visuals. There are also a lot of big triple-A games with DLSS, and as shown by the latest Microsoft and Sony game shows and Nvidia's YouTube channel, a lot of upcoming games are going to push graphics hard and utilise ray tracing. So if PC gamers want the best visual experience at 4K/60, 4K/120 Hz, 1440p/144 Hz etc., we need DLSS, FSR and the like added to games.

So you're saying that for high-refresh-rate gaming at 1440p, you HAVE to upscale from 900p. Nope, that's a huge kludge for hardware that can't do it. I game at 6K downscaled to 4K.
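For scale, here is the back-of-envelope pixel arithmetic behind the two approaches mentioned above, assuming "900p" means 1600x900 and "6K" means 5760x3240 (both assumptions, as the post doesn't specify exact resolutions):

```python
def pixels(w, h):
    """Total pixels the GPU has to shade per frame at a given render size."""
    return w * h

native_1440p = pixels(2560, 1440)   # 3,686,400 px
from_900p    = pixels(1600, 900)    # 1,440,000 px rendered, then upscaled to 1440p
six_k        = pixels(5760, 3240)   # 18,662,400 px rendered, then downscaled to 4K
four_k       = pixels(3840, 2160)   # 8,294,400 px

print(from_900p / native_1440p)  # 0.390625: upscaling shades ~39% of native 1440p
print(six_k / four_k)            # 2.25: downscaling shades 2.25x the pixels of 4K
```

Which is the crux of the disagreement: upscaling trades shaded pixels for frame rate, while 6K-to-4K supersampling spends over twice the pixel budget of native 4K on image quality.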
 
So you didn't watch any review and ignored every single one of the outcomes, insisting that your own opinion is correct. OK then. LTT, GN, JayZ and HUB (along with pretty much all the others except the bought-and-paid-for techtubers) say the same thing - it depends on the game as to which highest-end card wins, as the 3090 and 6900 XT go blow for blow.

I never mentioned a 3090. You did. Then you pulled out a review from that bloke who raged at Nvidia - and I think some of those others have before as well. JayZ - I mean that video where he puts 1.5 volts, or whatever it was, through a new Ryzen 2 CPU and then rages because it doesn't work, when it was never meant to run at that voltage. So there went his trustworthiness as a source. Linus has just become too big and has loads of channels now, none of which are much cop - except Anthony, he's good; the rest of the crew like spending the weekend getting the taste of Linus's barse out of their mouths. GN - 500 mph Steve.

@4K Nvidia is king even without DLSS; the 3080 pulls the pants down on a 6900 XT in some games.

That's what I said.

I did cheekily make it about 4k.

At the end of the day, there are tonnes of 6800 XTs in stock. So if they are so good and beat Nvidia, why aren't folk buying them and leaving Nvidia stock on the shelves? Because you have to 'fettle' AMD cards - I've been gaming for 35+ years and have had many of both. AMD always require a fair amount of muckaboutery.

People all want 4K gaming with a high-Hz panel - OLED the icing on the cake. And for that, you'll want Nvidia.

Mic drop :D
 
People always say this - "but so-and-so has been at it for such-and-such a time" - but at the end of the day, as a consumer who just wants to enjoy games now with the best overall IQ and good performance, I and most other "gamers" don't really care who has been at it the longest; we just care about the here and now. AMD's, Intel's or whoever's version may win out in however many months or years, but what good is that if a game comes out now, where I want to whack the settings up and enjoy it right now?

That's all fair, and a good consumer's approach - you choose and buy things that are useful for you NOW, not in some unknown future, as anything can be different by then. Still, it always takes time for new tech to spread through the market, and so far FSR is already growing considerably faster than DLSS initially did - it might not take three years to become widespread after all.

That said, DLSS and RT are reserved for the tiny minority of gamers who actually have cards supporting either tech. Taking into consideration the whole gaming market, where the actual kings are mobile and consoles and PC gamers are a tiny fraction - DLSS has no chance of dominating, because the hardware for it simply doesn't exist there. That is, unless Nvidia decides to open it up and let it work on other hardware, using DP4a for example (which I am sure they could do even now, but choose not to).
Even within the PC market, DLSS-capable hardware is a tiny minority and will be for quite a while, even if AMD stopped producing GPUs tomorrow and Nvidia had 100% of the market to themselves. Because of that, devs won't make games with it in mind - they add it as an add-on (if it's easy and cheap to add), but it won't influence the design of the game. Which is why, aside from CP2077 (an Nvidia-sponsored title), I have had zero need to use it in any other game so far - and I don't expect that to change anytime soon.
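For context, DP4a is a packed dot-product instruction supported on many GPUs without tensor cores: it multiplies four pairs of 8-bit integer lanes and accumulates the sum into a 32-bit integer. A rough Python sketch of the idea (illustrative only - not the real ISA semantics, and overflow behaviour is ignored):

```python
def dp4a(a, b, c):
    """Emulate the DP4a concept: dot product of four signed 8-bit lanes
    from a and b, accumulated onto the 32-bit integer c."""
    assert len(a) == 4 and len(b) == 4
    for lane in list(a) + list(b):
        assert -128 <= lane <= 127, "lanes must fit in int8"
    return c + sum(x * y for x, y in zip(a, b))

print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], 0))  # 70
```

This is the kind of low-precision multiply-accumulate that inference workloads lean on, which is why it comes up as a possible fallback path for running an upscaler's network on hardware without dedicated matrix units.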
 
At the end of the day, there are tonnes of 6800 XTs in stock. So if they are so good and beat Nvidia, why aren't folk buying them and leaving Nvidia stock on the shelves? Because you have to 'fettle' AMD cards - I've been gaming for 35+ years and have had many of both. AMD always require a fair amount of muckaboutery.

People all want 4K gaming with a high-Hz panel - OLED the icing on the cake. And for that, you'll want Nvidia.

Mic drop :D

Erm, you dropped the mic on yourself, considering what Gibbo himself said earlier - people DO buy them (I just have, when it was close to 10% cheaper a week or so ago), and they sell very well (even at a very high price). Them being in stock in large numbers just means AMD/Asus FINALLY fixed production/delivery issues for the UK, perhaps? :) To quote Gibbo: "One of our best sellers and we have 100's in stock and more incoming so fingers crossed won't be running out." Though at what price is another matter...
 
A wall of utter bo*****s

Yeah, no - utter rubbish. Sadly, you have been drinking the Kool-Aid. AMD are matching Nvidia (it depends on which sponsored title showcases which card, but in the remaining 90% of games they are evenly matched). Keep up with the shilling, it's funny. Oh, and no, you are wrong. Again.
 
Fettle AMD cards? Lol. What does that even mean?

Pure nonsense. This generation I've used a 6800, a 6800 XT and now I have a 6700 XT. All of them have worked perfectly. Amusingly, the only issue I did have was with a 3070 and getting it to work properly with my CX OLED.
 
Fettle AMD cards? Lol. What does that even mean?

Pure nonsense. This generation I've used a 6800, a 6800 XT and now I have a 6700 XT. All of them have worked perfectly. Amusingly, the only issue I did have was with a 3070 and getting it to work properly with my CX OLED.

Oh, that reminds me of fighting earlier this year to make an Nvidia card work properly with Samsung's super-ultrawide monitor - it would just give me a black screen every single time I tried to use the full resolution and refresh rate. It later turned out to be buggy Nvidia drivers, and we had to wait for a fix. By the time the fix came, we had got rid of that monitor (it was just way too hot to sit in front of for long with HDR on). Took them quite a while to fix it on the 3000 series.
 
I would happily buy a 6800 XT if it were closer to £850, which is where I deem the performance-to-cost over MSRP to be somewhat acceptable. But at basically £1000, and with no AMD FE equivalents in the UK, I will keep holding out for a 3080 FE drop, or wait until the 6800 XT drops a bit lower, before upgrading my 2060S.
 
I would happily buy a 6800 XT if it were closer to £850, which is where I deem the performance-to-cost over MSRP to be somewhat acceptable. But at basically £1000, and with no AMD FE equivalents in the UK, I will keep holding out for a 3080 FE drop, or wait until the 6800 XT drops a bit lower, before upgrading my 2060S.

You are a bit late, as literally last week they (Asus TUF 6800 XT OC) were just a bit above your price limit and below £900 - which is when I got mine. Now the price is rising again, though.
 
I have a 3080, but if I hadn't, I wouldn't even consider a 6800 XT. When dropping this kind of money on a GPU, I want it to do more than just gaming. Nvidia has lots of features that AMD doesn't, and being marginally faster in non-ray-traced games is not going to make up for that.

Out of curiosity - which other features do you use outside of gaming? I don't mean to criticise; I'm just genuinely curious what else people do on these cards besides gaming.
 
Out of curiosity - which other features do you use outside of gaming? I don't mean to criticise; I'm just genuinely curious what else people do on these cards besides gaming.
I use Blender. CUDA and OptiX are Nvidia-only features, and sure, you can render with OpenCL on AMD cards, but that is slow af. Lots of productivity software uses CUDA.
 
You are a bit late as literally last week they were (Asus TUF 6800XT OC) just a bit above your price limit and below £900 - which is when I got mine. Now the price is growing up again, though.

Yeah, I had given up on getting a GPU, so I stopped checking stock across various places for quite a few months. I only just started looking again because one of my monitors died, so I bought a 3440x1440 screen, and I'm not sure my 2060S can handle it. I probably would have just bought it for £900 too.
 
I use Blender. CUDA and OptiX are Nvidia-only features, and sure, you can render with OpenCL on AMD cards, but that is slow af. Lots of productivity software uses CUDA.

Fair enough. I played a bit with OptiX on a 2070S and that worked very well indeed in Blender. CUDA is also more useful than OpenCL (even with "translators" in mind). Though most gamers will never use either of those (nor the AI voice filtering etc.), so aside from specific work uses, they wouldn't care.
 
AMD had one series that caused actual (proven) issues for many users - the 5700 series. I had one myself for a while and finally sold it after six months or so. I swapped between three different AIB models (XFX, Sapphire and Gigabyte) and all of them were very unstable - black screens etc. Each of those cards runs just fine in many other machines, so no issues with the cards or drivers there. Odd thing, especially as a 290 before, and now a 6800 and 6800 XT, along with a 3060 Ti and 2070S, work with zero issues in the same machine.

That said, aside from that one unfortunate series, I have had no serious issues with AMD or Nvidia drivers for a long time now, though I still remember YouTube crashing constantly on Nvidia drivers (1070 Ti card) if I had it open in more than one or two tabs at once (even if only one tab was playing).

Fiji > many driver issues
Polaris > many driver issues
Vega 64 > many driver issues
Radeon VII (Vega-based) > many driver issues
RDNA1 (RX 5700) > huge number of driver issues

RDNA2 is a step in the right direction, but many have simply lost trust in AMD and don't want to take the risk.

People have voted with their wallets; more 3000-series cards have been sold overall, for good reason.
 