RX 6800 good enough?

You're not going to run CP2077 even at medium settings and reach 4K 60 with a 6800 (more like low-40s fps at best). And the things you say to turn down for performance gains barely cost anything in the first place (chromatic aberration et al., all cheap post-processing effects in the <=1 fps range of perf hit).

I see you are moving the goalposts to imply 60FPS is a minimum requirement at 4K now, even though a 3080 is not doing that either?

I'm telling you right now that I did run it at 4K in CP2077 with 40-ish FPS lows and more like 50 FPS average by turning most of those effects off and setting shadows to medium. The 3080 runs it barely any faster unless you enable DLSS. Mainstream reviews would tell me I should get 47 fps minimums at ultra 4K on my 3080 FE in WDL, yet my own experience in very demanding scenes IS mid-30s FPS minimums. I can see and feel it dropping out of my Freesync range with ultra, and that is well below the 47 the reviews claim. Reviewers play a short stint or even run a canned benchmark and post some results that more often than not fail to match reality. It is why you sometimes see drastically different results on the same game from different reviews.

Even Digital Foundry found they got a 35% increase in performance in CP2077 just by optimising the settings from best to worst impact compared to IQ.

Do you understand how condescending and arrogant you sound claiming those of us with experience of the 6800 are wrong? Oh, and you are wrong about motion blur, bloom and DoF being pointless gains; disabling them all will give you close to ~8% extra FPS on the 1% lows. Unless you are claiming 8% is nothing? The difference between shadows ultra and high in WDL is ~25%, and the IQ difference is hardly noticeable unless you stop to compare.

So yes, 4K is playable in modern games on these GPUs with some compromises on settings. That is as true on my 3080 as it was on my 6800.
 
Again, you are wrong. I have an RX 6800, and in the games I play at max or near-max settings I play at 4K/60. Quoting a review doesn't hold water, especially since the results vary so much across all of them. RT is also a gimmick and I leave it off. Do you actually own the latest generation of GPU?

I do in fact have a 6800 myself.

I'm telling you right now that I did run it at 4K in CP2077 with 40-ish FPS lows and more like 55 FPS average by turning most of those effects off and setting shadows to medium. The 3080 runs it barely any faster unless you enable DLSS. Mainstream reviews would tell me I should get 47 fps minimums at ultra 4K on my 3080 FE in WDL, yet my own experience in very demanding scenes IS mid-30s FPS minimums. I can see and feel it dropping out of my Freesync range with ultra, and that is well below the 47 the reviews claim. Reviewers play a short stint or even run a canned benchmark and post some results that more often than not fail to match reality. It is why you sometimes see drastically different results on the same game from different reviews.

You're arguing with yourself atm because my initial post has nothing of what you're replying to, so I will only address the CP2077 part. You're not gonna get 4K 60 with a 6800 no matter what you turn off. Maybe if you mod it to pre-PS3 era graphics, sure, but it would be completely stupid. Best it can do at medium is 44 avg. And don't take anyone's word for it, here are hundreds of independent users benchmarking it on various settings & hardware combos (https://www.computerbase.de/2020-12/cyberpunk-2077-community-benchmarks/).

[ComputerBase community benchmark chart]
 
I honestly don't believe you have a 6800, or you just don't know how to optimise it (or games). DF published an optimisation guide that, with mostly high settings in non-RT, allowed me to reach ~35% extra performance on the 6800. That, along with a 10% OC, brought it from low 30s minimum FPS to low 40s minimum FPS. The average also jumped from mid 40s to 50-55 FPS in most cases.
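
Just to sanity-check those numbers, here's a rough back-of-the-envelope sketch (assuming a hypothetical low-30s baseline and that the two gains stack multiplicatively, which real games rarely do perfectly):

```python
# Rough sanity check of the claimed gains, assuming the settings
# optimisation (~35%) and the overclock (~10%) stack multiplicatively
# and scale perfectly -- real games rarely scale this cleanly.
baseline_min = 32          # "low 30s" minimum FPS before tuning (assumed)
settings_gain = 1.35       # ~35% from the DF-optimised settings
oc_gain = 1.10             # ~10% overclock

after_settings = baseline_min * settings_gain
after_oc = after_settings * oc_gain

print(f"settings only: {after_settings:.0f} FPS")   # ~43 FPS
print(f"settings + OC: {after_oc:.0f} FPS")         # ~48 FPS upper bound
```

Even with imperfect scaling, that lands right around the low-to-mid 40s minimums described above.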

This is using a mix of medium/high/ultra settings with the usual suspects such as DoF etc disabled. So you can show early release-day performance reviews all you want; it doesn't change what I and others have already said we experienced.

Edit: Here is a guy posting his 6800 XT performance at 4K after doing the DF optimisations. He is getting 50-60+ FPS in most places. Take ~15% off for a 6800 and you are right in the ballpark of what I experienced, even better if you undervolt and OC that 6800.

 
Reading comprehension is a problem for you, I see. As I said, for the third time, I game at 4K/60 IN THE GAMES I PLAY. I care nothing for your BS. Also I don't believe you have a 6800 given the level of bitching you are doing.

His own community benchmark link contradicts his own statements and proves my claim that reviews have such high variance it makes them almost invalid for real-world cases. If you look at the 4K medium settings chart you can see 6800 XTs ranging from 50-58 FPS, while we are to believe the ~15% slower 6800 is only going to get 43 at best. That would make it up to 25%-30% slower than a 6800 XT. The maths just doesn't add up and doesn't match the real-world experience most here are posting as evidence.
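
To make that maths explicit, here's a rough sketch (the 50-58 FPS XT range and the 43 FPS figure are the numbers quoted above; the ~15% gap is the usual 6800-vs-6800 XT rule of thumb, not a measured value):

```python
# If the 6800 really is ~15% slower than a 6800 XT, what should it score
# against the 50-58 FPS community results quoted above?
xt_low, xt_high = 50, 58       # 6800 XT range from the 4K medium chart
expected_gap = 0.15            # rule-of-thumb 6800 vs 6800 XT deficit
claimed_6800 = 43              # the "43 at best" figure being disputed

expected_low = xt_low * (1 - expected_gap)    # ~42.5 FPS
expected_high = xt_high * (1 - expected_gap)  # ~49.3 FPS
implied_gap = 1 - claimed_6800 / xt_high      # ~26% vs the fastest XT result

print(f"expected 6800 range: {expected_low:.1f} - {expected_high:.1f} FPS")
print(f"implied deficit of the 43 FPS figure: {implied_gap:.0%}")
```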

All I can assume is that those saying the 6800 is not a capable 4K card are not using VRR monitors. 4K gaming is pointless without VRR in my experience. Strangely, no review site ever tests how games perform and feel with VRR in their GPU reviews. At 4K the 6800 is a capable card if you have a decent VRR monitor, because with some minor compromises it will maintain 40-60 FPS. If you don't accept compromises or don't have a VRR monitor, then pretty much no GPU available at the moment is capable of 60Hz 4K ultra settings in modern games.
 
Everyone's bickering about 4K60. I'd rather play at ultrawide 120Hz. Much more immersive experience.

If you're gaming at 4K on anything 27" or smaller then well done, you've wasted your money. If you do game at 4K and can't max everything out, just turn the resolution scale down to 80 or 90%. The difference in image quality is minimal and the gains are often huge.
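
For a sense of why dropping the scale a little gains so much, here's a quick sketch of the pixel maths (assuming a standard 3840x2160 output and a slider that scales both axes, which is how most in-game resolution scalers work):

```python
# Rendered pixel count at a given resolution scale, assuming the scale
# is applied to both axes.
native_w, native_h = 3840, 2160

for scale in (1.0, 0.9, 0.8):
    w, h = int(native_w * scale), int(native_h * scale)
    ratio = (w * h) / (native_w * native_h)
    print(f"{scale:.0%}: {w}x{h} -> {ratio:.0%} of native pixels")

# 90% scale renders only ~81% of the pixels, 80% only ~64%,
# which is roughly where the "often huge" gains come from.
```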

Refresh rate trumps resolution every time for me.
 
I have a 32" 4K 60Hz Freesync monitor and a 1080p 144Hz Freesync and I prefer the 4K hands down. Though as you say it's all down to personal preference.
 
You can't discount demanding games from counting as 4K gaming and then go ahead and say "yeah, it's a 4K card". Let's talk numbers and specific games (see below).


You're not disagreeing with me, you're disagreeing with reality. This is the performance of the card:

(1% low / Avg FPS)
Godfall: No (42/51) | WDL: No (37/42) | ACV: No (38/48) | Dirt 5: Yes (62/69) | Death Stranding: Yes (85/93) | MSFS2020: Heck no (27/29) | Gears 5: Yes (52/63) | HZD: Yes-ish (51/60) | ACOd: Yes-ish (43/63) | Anno 1800: No (41/44) | Black Mesa: No (43/48) | BL3: No (43/46) | Control: No (33/37) | Detroit BH: No (52/58) | GR:BP: No (29/33)

Cyberpunk is an extreme example, and it is not the be-all and end-all of what should determine a GPU purchase. I bet you have not tried enabling FidelityFX either. By the way, the 6800 gets 60+ FPS in Godfall at Epic settings 4K according to this.


Disagree! RT in Control and CP2077 at 4K looks bloody brilliant. That's why I got Ampere, not Navi ;)

CP2077 is playable with ray tracing effects on AMD's 6800 non-XT in ultrawide at high settings with FidelityFX on. Control is nothing more than a VFX showcase game which I don't think many of us care much about.
 
This shows the 6800 is still very capable at 4k:


Thank you for posting and showing the reality for potential 6800 buyers. The problem I see is that those proclaiming it a poor 4K card are focusing on CP2077 as if it is the ultimate goal for all GPUs. Even CP2077 is playable at 4K with mostly high settings (no RT of course), but the focus seems to be "if it isn't maxed out at 60 FPS it's a fail". Even the RTX 3080 or RTX 3090 cannot run CP2077 maxed out at 4K with DLSS and RT off.
 
Thank you for posting and showing the reality for potential 6800 buyers. The problem I see is that those proclaiming it a poor 4K card are focusing on CP2077 as if it is the ultimate goal for all GPUs. Even CP2077 is playable at 4K with mostly high settings (no RT of course), but the focus seems to be "if it isn't maxed out at 60 FPS it's a fail". Even the RTX 3080 or RTX 3090 cannot run CP2077 maxed out at 4K with DLSS and RT off.

Cyberpunk is the new Crysis, we all had this argument in 2007 :D
 
Well, they're releasing a lot of really decent 4K / 144Hz / IPS panels now.

The problem is nothing can run modern "AAA" games at that refresh rate, and the issue with 4K is that nothing ever will. As soon as a card comes out that allows locked max-settings 4K60, within 6 months that card can't do it anymore. 4K 144Hz is a pipe dream in the latest releases. Devs are pushing graphics tech faster than AMD/Nvidia can release cards. MCM may be the answer, but I feel that 4K60 will always be that dream people are chasing and never quite getting there, as devs just find another power-hogging technique or setting they can add or turn up.

I suppose that's why we are enthusiasts though.
 
I have a 32" 4K 60Hz Freesync monitor and a 1080p 144Hz Freesync and I prefer the 4K hands down. Though as you say it's all down to personal preference.

I probably agree that at 32" and above 4K is definitely preferable, but going from 144 to 60 feels horrible to me. Like you said, it's personal preference. I personally have a 34" 120Hz 3440x1440 ultrawide, which for me is the screen real-estate vs pixel-density sweet spot. I prefer the additional horizontal FOV over the additional vertical on a 16:9 32". I have no idea how some people use 40" 16:9 monitors. You'd be dead by the time you look around the screen.
 
Why are they so expensive on OcUK? I can literally put in a pre-order for 599 on another website I found. Yeah, they don't have any at the moment (probably because OcUK have them all), but the price is fixed and it won't go up. I just don't understand the justification for some of the ridiculous pricing I'm seeing.
 
The problem is nothing can run modern "AAA" games at that refresh rate, and the issue with 4K is that nothing ever will. As soon as a card comes out that allows locked max-settings 4K60, within 6 months that card can't do it anymore. 4K 144Hz is a pipe dream in the latest releases. Devs are pushing graphics tech faster than AMD/Nvidia can release cards. MCM may be the answer, but I feel that 4K60 will always be that dream people are chasing and never quite getting there, as devs just find another power-hogging technique or setting they can add or turn up.

I suppose that's why we are enthusiasts though.

As a 4K owner ever since 4K was a thing, I can confirm this to be accurate. I use 4K because my main reason for a PC is gaming, with significant Photoshop thrown in, so 4K at a decent size is a must for texture creation. Prior to VRR you needed to get as close as possible to 60 FPS for smooth gameplay, or you had to put up with tearing and/or stuttering.
  • When the 980 Ti was top dog it could not push Witcher 3 anywhere near 60 FPS average, even with reduced settings.
  • When the GTX 1080 was the top GPU, Deus Ex: Mankind Divided brought it to its knees. The 1080 Ti helped a bit, but the same game was not close to 60 FPS average on ultra settings.
  • The 2080 Ti was massively overpriced but allegedly the first true 4K 60Hz GPU. Except you now had ray tracing to contend with (when some actual games were eventually released).
  • The RTX 3080 and even the 3090 can't push CP2077 with RT, or some other recent non-RT titles, to 60 FPS at 4K ultra.
So there is always some new game out there that destroys the "top end" GPUs at 4K.

When driving 4K in the latest and most demanding games, it has always been 100% necessary to reduce settings to achieve playable FPS. Honestly, the biggest improvement to 4K gaming is not GPU power but the introduction of VRR. The 4K 32" Freesync screen I use has a 33-60Hz VRR range. Without VRR even 45-50 FPS can be unplayable; with VRR even mid 30s can be playable.

So for me the next biggest thing in 4K is not higher refresh (it would still be nice of course), but HDR.
 
Thank you for posting and showing the reality for potential 6800 buyers. The problem I see is that those proclaiming it a poor 4K card are focusing on CP2077 as if it is the ultimate goal for all GPUs. Even CP2077 is playable at 4K with mostly high settings (no RT of course), but the focus seems to be "if it isn't maxed out at 60 FPS it's a fail". Even the RTX 3080 or RTX 3090 cannot run CP2077 maxed out at 4K with DLSS and RT off.

It's the game cache: it's fine at lower resolutions, but it starts to choke performance at 4K or higher. The card is still capable at 4K, it's just less optimal when the cache can't feed the shaders, so the card falls back to the GDDR6, which has too little bandwidth due to its mid-range bus.
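
To put rough numbers on that, here's a quick back-of-the-envelope sketch using the commonly quoted specs (256-bit bus and 16 Gbps GDDR6 on the 6800, 320-bit and 19 Gbps GDDR6X on a 3080; treat these as assumptions rather than gospel):

```python
# Raw memory bandwidth = bus width (bits) * data rate (Gbps) / 8 bits per byte.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits * data_rate_gbps / 8  # GB/s

rx6800 = bandwidth_gb_s(256, 16)   # ~512 GB/s, backed by a large on-die cache
rtx3080 = bandwidth_gb_s(320, 19)  # ~760 GB/s, no comparable on-die cache

print(f"RX 6800 GDDR6:   {rx6800:.0f} GB/s")
print(f"RTX 3080 GDDR6X: {rtx3080:.0f} GB/s")
# At 4K the working set is more likely to spill out of the on-die cache,
# so the 6800 leans on that ~512 GB/s more often than it does at 1440p.
```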
 
I have a Sapphire 6800 XT Nitro OC and I play Cyberpunk at 1440p; it just feels much smoother and better. It looks amazing too. 4K felt OK, but it's just a better experience for me at 1440p. Don't mind at all.
 