
Are RX 6800 XT not wanted?

Kei

Soldato
Joined
24 Oct 2008
Posts
2,750
Location
South Wales
Vega64 > Many driver issues
I would beg to differ on this one. I've had my Vega 56 since release and not had a single issue with it. Sure, some people will have issues with anything, as hardware combinations are all different. The GTX 780 I had prior was great, but I had one of the most irritating driver issues I have ever encountered, which caused endless black screens: Intel X79 plus a Windows 10 update and Kepler did not get on when virtualization was enabled in the BIOS. Once I figured that out, after nearly 6 months of no gaming or hardware acceleration, it worked great again. I'm still using the 780 as a CUDA accelerator 8 years after buying it.

I've had little in the way of trouble with either AMD/ATi or Nvidia in the last 25 years of gaming. With current prices, even though I can easily afford them, neither AMD nor Nvidia are getting my money. I am not paying £500+ for a peasant-spec card.
 
Associate
Joined
31 Oct 2009
Posts
854
Location
in the tower
I would beg to differ on this one. I've had my Vega 56 since release and not had a single issue with it. Sure, some people will have issues with anything, as hardware combinations are all different. The GTX 780 I had prior was great, but I had one of the most irritating driver issues I have ever encountered, which caused endless black screens: Intel X79 plus a Windows 10 update and Kepler did not get on when virtualization was enabled in the BIOS. Once I figured that out, after nearly 6 months of no gaming or hardware acceleration, it worked great again. I'm still using the 780 as a CUDA accelerator 8 years after buying it.

I've had little in the way of trouble with either AMD/ATi or Nvidia in the last 25 years of gaming. With current prices, even though I can easily afford them, neither AMD nor Nvidia are getting my money. I am not paying £500+ for a peasant-spec card.

I sold my Vega for the price of a 6700 XT, so the swap cost me nothing. I would sell the Vega while prices are high, as they are sought after by miners.
 
Caporegime
Joined
12 Jul 2007
Posts
40,520
Location
United Kingdom
If you like spending hours running benchmarks, tweaking everything off in Windows, overclocking every last thing so the PC is only just stable enough to reach the end of the benchmark etc - just to get a bigger number in some thread - you crack on. Then turn it all back down again to play games!

I have used them just to make sure my card is in the ballpark of similar cards when getting a new card.

Though I'd rather watch paint stay wet than entertain 5 minutes of my life doing that. Each to their own - I don't doubt it'll beat mine. It should do for what it costs over mine! You've probably spent hours tweaking and running multiple benchmarks (couldn't think of anything more boring myself) so you can say you have the biggest number in benchmark threads!? I really can't see the appeal of it. But it's not what I was on about. I'm on about 4k game benchmarks.

But let's get back to the crux of what I originally wrote concerning gaming at 4k. Yes, the 6900 XT and the 3090 will beat my 3080 - but for the extra £££ they cost, the 3080 can still beat a 6900 XT at 4k in a game or two. And you said originally that the 6800 XT ON THE WHOLE beats a 3080. What I said concerns 4k. On the WHOLE, at 4k, the 3080 is better.

Witcher 3 @ 4k - the 3080 has the 6900 XT's pants down. Shadow of the Tomb Raider - yes, we can all cherry pick and that's what I was doing, just as you are cherry picking synthetic benchmarks you spend hours on just to raise a number; it's beyond me that there is any fun in that.

I understood what you said because of where you work - and I purposely cherry picked 4k because I knew there was a game or two where the 3080 beat the 6900 XT - that was all.

It was funny though seeing the fanbois go mad over it, ignore people and, in their rage, fail to comprehend that I was on about 4k.

And I never even mentioned DLSS or ray tracing :cry:

Happy benching :cool:
My Timespy overclock is only 25MHz higher than my max stable gaming overclock; that's the good thing about RDNA2 - much lower power usage brings much higher OC potential vs Ampere. It's not uncommon to get 400MHz+ core clock overclocks with RDNA2, plus there are no throttling memory temps to worry about.

The Witcher 3 is an old DX11 game, made by the same developer as Cyberpunk. Close ties with a certain vendor.

The same game that ran excessive tessellation and had GameWorks code (HBAO+ etc) hidden behind black-box DLLs. That's partly how the tessellation slider was born, to work around some of the shady practices.

Those tess sliders were eventually added into the Witcher 3 video menu at a much later date by the developers, as it was discovered you could massively increase performance for no loss in image quality when using AMD GPUs, but not on the competition. That got fixed though, funnily enough. It was all about the hair (Geralt), just as it was the cape with Batman: grind down that performance if it hurts the competition more.

It’s no surprise at all that a 3080 is faster at 4K than the 6900 XT in this scenario, in this game.
For the most part though, the 6900 XT is faster than the 3080, and going by the game and synthetic benchmark threads on this very forum, at times it is significantly faster when both are overclocked.

Sure, you can degrade image quality and introduce blur with image reconstruction, then enable RT to give the 3080 a win in a few sponsored titles; I won't argue with you on this point. In that scenario the 3080 would win at 4K more often than not in said games. I also won't argue about the 3090/6900 XT not offering worthwhile performance gains over RRP 3080/6800 XT.

By the way, the 4K Hardware Unboxed numbers that were mentioned earlier don't even have Smart Access Memory enabled, so that's 10% performance or thereabouts you can add on for the RX 6000 series. Not sure why they disable it when it's free performance and there's no downside to enabling it.
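Rough back-of-an-envelope on what that means - the baseline framerate below is a made-up example rather than a figure from the Hardware Unboxed review, and the 10% is my rough estimate, not a measured number:

Code:
# Hypothetical example: adjust a 4K review average for an assumed ~10% SAM uplift.
# Both numbers are placeholders for illustration, not measured results.
review_fps = 92.0                 # made-up 4K average, tested without SAM
sam_uplift = 0.10                 # the rough "10% or thereabouts" estimate
with_sam = review_fps * (1 + sam_uplift)
print(f"{review_fps:.0f} fps -> ~{with_sam:.0f} fps with SAM enabled")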

Please post your Shadow of the Tomb Raider benchmark scores (as you referenced that game in your response), seeing as we have a benchmark thread for it on OcUK. I think you'll find a 6900 XT stacks up a bit better against a 3080 there than it does in the Witcher 3 at 4K.
 
Soldato
Joined
24 Aug 2009
Posts
2,931
The 6800 XT is a great performer. I upgraded from an RTX 3070 to a 6800 XT due to going 34" ultrawide, and the performance bump has been fantastic.

Don't get me wrong, I only play Escape from Tarkov, a game that's poorly optimised, but I will take as much FPS as I can get.

Looking at benchmarks, the 6800 XT wins more than the 3080, but like people said it depends on the game and which GPU it favours too.

I think AMD have nailed it the past year or so with their GPUs and CPUs.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
So you can't back up your claims then. Ok son.


You can't even quote people correctly -

This is an internet forum where people can write and have an opinion on what they like. You seem to think you can change it to suit whatever your agenda is, then get easily offended if someone writes something you don't like.

What can't I back up? You still don't get that it was 4k I was on about. You keep adding nothing other than vitriol.

Have a lie down - take a chill pill, use ignore if the 'lil 'ol internet hurts your feelings so bad. At least Matt has a conversation about it.

You're like a petulant child. At least you'll be back at school next week.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
My Timespy overclock is only 25MHz higher than my max stable gaming overclock; that's the good thing about RDNA2 - much lower power usage brings much higher OC potential vs Ampere. It's not uncommon to get 400MHz+ core clock overclocks with RDNA2, plus there are no throttling memory temps to worry about.

The Witcher 3 is an old DX11 game, made by the same developer as Cyberpunk. Close ties with a certain vendor.

The same game that ran excessive tessellation and had GameWorks code (HBAO+ etc) hidden behind black-box DLLs. That's partly how the tessellation slider was born, to work around some of the shady practices.

Those tess sliders were eventually added into the Witcher 3 video menu at a much later date by the developers, as it was discovered you could massively increase performance for no loss in image quality when using AMD GPUs, but not on the competition. That got fixed though, funnily enough. It was all about the hair (Geralt), just as it was the cape with Batman: grind down that performance if it hurts the competition more.

It's no surprise at all that a 3080 is faster at 4K than the 6900 XT in this scenario, in this game.
For the most part though, the 6900 XT is faster than the 3080, and going by the game and synthetic benchmark threads on this very forum, at times it is significantly faster when both are overclocked.

Sure, you can degrade image quality and introduce blur with image reconstruction, then enable RT to give the 3080 a win in a few sponsored titles; I won't argue with you on this point. In that scenario the 3080 would win at 4K more often than not in said games. I also won't argue about the 3090/6900 XT not offering worthwhile performance gains over RRP 3080/6800 XT.

By the way, the 4K Hardware Unboxed numbers that were mentioned earlier don't even have Smart Access Memory enabled, so that's 10% performance or thereabouts you can add on for the RX 6000 series. Not sure why they disable it when it's free performance and there's no downside to enabling it.

Please post your Shadow of the Tomb Raider benchmark scores (as you referenced that game in your response), seeing as we have a benchmark thread for it on OcUK. I think you'll find a 6900 XT stacks up a bit better against a 3080 there than it does in the Witcher 3 at 4K.


As I said - I was just cheekily cherry picking where a 3080 beats a 6900 XT. Witcher 3 may be an old game, but some reviewers still use it. There was no RT used. I know it favours Nvidia - that and TR I purposely picked. It doesn't really matter if it's an old game.
I have been facetious, yes, purposely pulling out cherry-picked stuff like others have. It created banter, but some folk have become so offended that they spend so much of their time getting so mad.

Tomb Raider - I'd have to download the game. I looked at a Guru3D game review for the 3080 Ti - that is all.

At the end of the day - when playing any game at any resolution - no one would be able to notice whether they were using a 3080/3090/6800 XT or a 6900 XT. At any resolution. The difference in FPS would be unnoticeable to the eye.

Must be people who bench and want the biggest number, thinking their cards are MUCH faster than someone else's, when in fact very few could pick it out in a blind test when using graphics cards for their intended purpose. I'm sure you've had more than one Merc in the search of the best of that range, no?


So the statement you gave as an AMD employee will always be horrendously biased, and you even knock down your competitor by saying that RT has something to do with image quality. It doesn't - that's DLSS - RT is to do with reflections and light - Control does a great job with it - it's in its infancy, but still, because AMD don't have it then it must be rubbish. Finishing what you say with: anything other than your view is fanaticism and mindshare - you'll be a politician next.

You are an AMD employee and it is in your best interests that your input puts AMD in the best light and knocks Nvidia, I get that. I just took the chance of a jibe in retort, and the response from pro-AMD folk was more in the name of mindshare and fanaticism. Still, I guess the mindshare and fanaticism are more prominent concerning AMD, as the Nvidia-positive people didn't become abusive with what they wrote - but were still met with vitriol.

6800 XTs are faster than 3080s on the whole, unless you enable Ray Tracing, a feature that provides very little (some would say if any) improvement to image quality in current games. However, mindshare and fanaticism are harder obstacles to overcome.


So the OP still stands - if these 6800 XTs are so good, why are they all on the shelf at OcUK?

As we know, from a gamer's point of view there is no discernible difference you'll feel at your monitor - you just won't notice it without an fps counter - and even then, in most games it's single-digit fps between the top-end cards.

Must be that most people are under the Nvidia mindshare and fanaticism spell?

I dunno, people going mad at some numbers that make no difference in the real world of gaming. Could you see the difference between an average of 110/123 or 195/215? Choose any res and any game.


So, if the cards are no different performance-wise, what does one offer that the other can't? Decent DLSS (if DLSS was crap, AMD wouldn't have attempted FSR) and RT compared to AMD's offerings. Maybe that's why Nvidia cards are more sought after.

Prepare for the AMD fanbois' rage at what I wrote, as I'm obviously under the influence of mindshare and fanaticism. Yay!
 
Associate
Joined
23 Oct 2019
Posts
484
After the way AMD have treated the UK GPU market these past 8 or 9 months, they deserve any criticism that comes their way; they can't really compete with a 3080 sold for 650 quid. 6800 XTs don't sell at 900-1000 because that's still a blatant rip-off. I was actually looking for an AMD card last year, but it's clear they don't care about UK customers.
 
Associate
Joined
7 Jul 2019
Posts
40
After the way AMD have treated the UK GPU market these past 8 or 9 months, they deserve any criticism that comes their way; they can't really compete with a 3080 sold for 650 quid. 6800 XTs don't sell at 900-1000 because that's still a blatant rip-off. I was actually looking for an AMD card last year, but it's clear they don't care about UK customers.

I mean, as a business decision I get it. They have limited stock of MBA cards, it's probably a hassle with Brexit setting up to sell them within the UK, and why bother when you can just ship them to the EU and sell them all anyway? I think under "normal" circumstances we wouldn't really care about that if we could pick up AIB models for the usual increases over MSRP; it's just a big issue at the moment as you cannot pick up any AIB cards for a good price, so MBA and FE cards look way more appealing to people than usual - and at the moment you can only buy Nvidia's in the UK.

I would say I am red/green team agnostic and simply look at my gaming needs, extra features and price, generation to generation, to decide what to buy. But as I can only get a 3080 (ideally), 3080 Ti or 3090 at MSRP, it really skews my decision in favour of Nvidia - and that's not even including DLSS or RT, which for me are not needed but a nice bonus. If I could have got an MSRP 6800 XT, team red is probably what I would have gone for.
 
Caporegime
Joined
12 Jul 2007
Posts
40,520
Location
United Kingdom
As I said - I was just cheekily cherry picking where a 3080 beats a 6900 XT. Witcher 3 may be an old game, but some reviewers still use it.
By some you mean Digital Foundry? Yes, that is not at all surprising. Using a game that is in no way indicative of current game performance on the whole as a way to measure performance using the latest CPUs and GPUs.
There was no RT used. I know it favours Nvidia - that and TR I purposely picked. It doesn't really matter if it's an old game.
Agreed, I never said it does use RT though. Just thought it would be worthwhile to mention the developer as you specifically cited this game in your post.

However, it does matter if it's an old game (see the reason above) and my statement that you quoted does not take into account certain older games like this using legacy APIs and heavily favouring certain GPUs.

I have been facetious, yes, purposely pulling out cherry-picked stuff like others have. It created banter, but some folk have become so offended that they spend so much of their time getting so mad.
Well, no offence taken here. :)

Tomb Raider - I'd have to download the game. I looked at a Guru3D game review for the 3080 Ti - that is all.
Okay, only mentioned this as you cited that game. It favours Nvidia, but the 3080 is not faster than the 6900 XT at 4K in this title unless you enable RT. Also, RT was tacked onto the game well after it was released.

At the end of the day - when playing any game at any resolution - no one would be able to notice whether they were using a 3080/3090/6800 XT or a 6900 XT. At any resolution. The difference in FPS would be unnoticeable to the eye.

We've always agreed here and I've never really argued otherwise.

Must be people who bench and want the biggest number, thinking their cards are MUCH faster than someone else's, when in fact very few could pick it out in a blind test when using graphics cards for their intended purpose. I'm sure you've had more than one Merc in the search of the best of that range, no?
It's a hobby for some, each to their own. I spend 99% of my time playing games and just a small amount of time running benchmarks. Once you've got the best score what else is there to do?

So the statement you gave as an AMD employee will always be horrendously biased, and you even knock down your competitor by saying that RT has something to do with image quality. It doesn't - that's DLSS - RT is to do with reflections and light - Control does a great job with it - it's in its infancy, but still, because AMD don't have it then it must be rubbish. Finishing what you say with: anything other than your view is fanaticism and mindshare - you'll be a politician next.
I could have used a better word than fanaticism.

My point was that RT offers very little improvement to image quality for a high FPS cost. Some people feel the same way, some don't. It's very subjective.

I'm not sure how you came to that conclusion about a 'competitor' based on what I posted. Fact is that image reconstruction does reduce image quality. People either don't want to believe it or are blissfully ignorant of the various well-known drawbacks. Of course there are some positives too, but native is king - for now.

You are an AMD employee and it is in your best interests that your input puts AMD in the best light and knocks Nvidia, I get that.
No, I'm just an enthusiast of PC hardware and I use what I know. I gain nothing from being critical or positive towards anything. This is a huge misconception because I used to be the AMD community rep here. It's essentially just used as a retort to dismiss my opinion if I speak negatively about ray tracing or image reconstruction.
I just took the chance of a jibe in retort, and the response from pro-AMD folk was more in the name of mindshare and fanaticism. Still, I guess the mindshare and fanaticism are more prominent concerning AMD, as the Nvidia-positive people didn't become abusive with what they wrote - but were still met with vitriol.
Six of one, half a dozen of the other tbh, as always with these types of things. My fanaticism comment was unnecessary, but the mindshare part was valid.

People are entitled to have different opinions; that's healthy. The problem comes when people can't accept someone else's opinion, even if it is wrong in their eyes - and there was definitely some of that when I gave my initial opinion.

Which I still stand by, but there will always be corner cases and examples where my opinion is wrong as discussed above.

So the OP still stands - if these 6800 XTs are so good, why are they all on the shelf at OcUK?
As we know, from a gamer's point of view there is no discernible difference you'll feel at your monitor - you just won't notice it without an fps counter - and even then, in most games it's single-digit fps between the top-end cards.
Must be that most people are under the Nvidia mindshare and fanaticism spell?
I have no insight into stock levels and sales at OcUK or elsewhere. Looking at the financial results thread and the results that have been publicly posted at TechPowerUp, it appears that RDNA2 has been very well received.

I dunno, people going mad at some numbers that make no difference in the real world of gaming. Could you see the difference between an average of 110/123 or 195/215? Choose any res and any game.
Without an FPS meter and as long as the FPS are within a certain range I'm sure most of us couldn't tell the difference when the numbers are close enough.
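To put those example averages into frame times - this is just the arithmetic on the numbers quoted above, nothing measured:

Code:
# Convert the quoted average framerates into frame times (ms per frame)
# to show how small the real-world gap is. Purely illustrative arithmetic.
for low, high in [(110, 123), (195, 215)]:
    t_low = 1000.0 / low     # ms per frame at the lower average
    t_high = 1000.0 / high   # ms per frame at the higher average
    print(f"{low} vs {high} fps: {t_low:.1f} ms vs {t_high:.1f} ms per frame "
          f"(~{t_low - t_high:.1f} ms difference)")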
 
Last edited:
Associate
Joined
7 Apr 2017
Posts
1,762
With Nvidia cards you can still get FE models at MSRP. On principle I'd not touch an AMD card at the ridiculous prices they are asking. They are also barely faster, and FSR lacks supported games.

It's a hard sell when you factor everything in.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
My point was that RT offers very little improvement to image quality for a high FPS cost. Some people feel the same way, some don't. It's very subjective.


I'm sure you mean DLSS and not RT? RT adds reflections from surfaces - yes, at a high fps cost. DLSS is a resampling of the image, and some implementations don't do it well. Death Stranding does it well and, at the end of the day, it's adjustable. I'm surprised at the slating RT and DLSS get when they're only there to help those with lower-tier cards where required. Higher FPS due to DLSS in a first-person shooter will be more welcome to a player who's shooting at people, not scrutinising minuscule differences in IQ while standing still. If it's much worse, turn it off, like many of the game parameters. Some game settings are there to enhance low resolutions - so at high resolutions they are low or off. Fact remains that Nvidia's implementation of both DLSS and RT at the moment is better than AMD's.

I don't know how wowed people expect to be by adding RT screen-space reflections. No one walks down the high street wowing at their own reflection and everything else in the shop windows - because they are used to it. RT just adds another natural physical parameter and is more realistic concerning the physical interaction of light on a surface (and, in part, sound). Same as PhysX does with destructible environments et al. RT is just another step towards including all the physical interactions of the real world in the game world. Better reflections working off more surfaces.

In a natural environment, maybe only water reflects, so there's not much visual impact. In Control, which is indoors with many reflective surfaces, it has more visual impact. Still not the be-all and end-all, I know. But it's nice to have that interaction of light off reflective or very smooth surfaces more than once. Playing games like Far Cry, with lots of earthy outdoor areas, there aren't many natural reflective surfaces, so you can't get a reflection off sand or concrete (unless polished). Look at that shiny tree! Urban games more so, in the built environment and particularly futuristic ones - much more opportunity for the artist to specify reflective surfaces. I think a lot of people expected to be wowed by it when looking at a game with few reflective surfaces - thus a minimal RT impact. In an indoor game with lots of glass and reflective surfaces it's more apparent. I never expected it to blow the original image away - RT is just a process for bouncing light off more than one surface, and indeed sound. It just includes more physics from the real world in a game.

So to see the first implementations of another physical parameter added to game graphics, namely RT, met with such negativity and called crap etc is just very odd - people that don't understand physics and what RT is trying to do are just ignorant. People need to watch a Pixar film where everything is ray traced (though not in real time, as the viewer has no control of movement) - that's why the interactions look more realistic. It took a long time to get it into film, and it will come to games fully eventually. RT won't knock your socks off unless you've never experienced reflections before, like a baby with a mirror.
I can't understand what people's expectations of real-time reflections were, in terms of expecting to be blown away by it. That's why the shiny balls are used as a demo and not a forest.
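For anyone wondering what "bouncing light off a surface" actually boils down to per ray, it's the standard reflection formula - the sketch below is just that textbook maths with made-up vectors, not anything from a real engine; the cost comes from doing it for millions of rays, every frame, recursively:

Code:
# Minimal sketch of one mirror-like bounce: reflect an incoming ray direction
# about a surface normal using r = d - 2*(d . n)*n. Illustrative vectors only.
def reflect(d, n):
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

incoming = (0.707, -0.707, 0.0)   # ray heading down and to the right
floor_normal = (0.0, 1.0, 0.0)    # flat floor, normal pointing straight up
print(reflect(incoming, floor_normal))   # roughly (0.707, 0.707, 0.0) - bounced back up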

And I didn't mean you at all when talking about being offended, you always like the bants - hence why you post sometimes knowingly on that thin line.

It was those that added nothing except vitriol towards others that I was addressing.

Now, how many 6900XT mercs have you had?
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
One thing against the Nvidia cards is the GDDR6X memory. It runs hot and there is no real-world difference (AFAIK) in performance on a graphics card (yes, it is meant to be twice as fast), but I've never seen a measurable uplift from 6X when used on a graphics card.

But GDDR6X is also VERY expensive - so what's the deal with AMD charging what they are when they only have cheaper GDDR6 on their cards?

That saving could surely have made them much more competitive in the pricing arena. Maybe the GPU is more expensive to manufacture, or maybe they're just getting that market share number up with sales figures. I do wonder why AMD's prices are sky-high ATM.
 
Caporegime
Joined
12 Jul 2007
Posts
40,520
Location
United Kingdom
I'm sure you mean DLSS and not RT? RT adds reflections from surfaces - yes, at a high fps cost. DLSS is a resampling of the image, and some implementations don't do it well. Death Stranding does it well and, at the end of the day, it's adjustable. I'm surprised at the slating RT and DLSS get when they're only there to help those with lower-tier cards where required. Higher FPS due to DLSS in a first-person shooter will be more welcome to a player who's shooting at people, not scrutinising minuscule differences in IQ while standing still. If it's much worse, turn it off, like many of the game parameters. Some game settings are there to enhance low resolutions - so at high resolutions they are low or off. Fact remains that Nvidia's implementation of both DLSS and RT at the moment is better than AMD's.
No, I definitely mean RT. The various RT effects (reflections, shadows etc) often look worse, or only marginally better than the alternative, for the performance cost. That goes for the AMD titles with RT too, though at least in those the framerate doesn't get completely devastated. So your framerate takes a 125% performance hit (this percentage was taken from The Ascent, using an RTX 3080 with RT on at native vs RT on with DLSS) for very little improvement to overall image quality. Take a look at the RTX thread where I circled a highlight on one of the images posted that shows you the minuscule improvement to image quality that costs so much performance.
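For clarity on how a "hit" can be more than 100% - it's the uplift you forgo, measured relative to the RT-native framerate. The framerates below are placeholders just to show the arithmetic, not the actual results from The Ascent:

Code:
# Illustrative only: how a >100% "performance hit" figure can be derived.
# These framerates are placeholders, not real measurements from The Ascent.
fps_rt_native = 40.0    # hypothetical: RT on, rendering at native resolution
fps_rt_dlss = 90.0      # hypothetical: RT on with DLSS image reconstruction
hit = (fps_rt_dlss - fps_rt_native) / fps_rt_native * 100
print(f"RT with DLSS is ~{hit:.0f}% faster than RT at native")   # ~125%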

As soon as RT makes significant improvements to image quality without requiring image reconstruction, I'll be on board. Or image reconstruction has to get better; otherwise I'll just dial those settings off, play at native and enjoy my high FPS.

Btw, any player worth his salt would never use image reconstruction in a competitive first person/twitch shooter, where lowest latency is king and image blurring etc is frowned upon.
And I didn't mean you at all when talking about being offended, you always like the bants - hence why you post sometimes knowingly on that thin line.
I take it back, I'm now offended that you didn't think I was offended. :p
Now, how many 6900XT mercs have you had?
3. :D

One was bought as a temporary B-grade GPU, so I was only ever going to keep it till I could get one brand new with warranty. Sold at cost on the MM.

Got a brand new one a month or two later. Then Djay offered me his Merc at asking as his was a better bin.

Given the GPU market at the time, I was able to buy his at cost and move mine on at cost (all on the MM to grateful gamers btw) with no loss to me other than £20-odd on postage, so I thought it was worthwhile.
 
Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
Btw, any player worth his salt would never use image reconstruction in a competitive first person/twitch shooter, where lowest latency is king and image blurring etc is frowned upon.

I dunno, what with crossplay and so many in a server with massive pings, I think most people have forgotten about latency. Bring back the 120Hz servers with auto-kick for pings >100 (or 60 for Europe). Competitive will be LAN or invite only - the rest is just dumbed-down, get-on-with-it servers owned by game franchises. Battlefield 4 was the last game I saw where you selected the server frequency right up to 120Hz, requiring players to be more local. Great when the game was popular, but hard to find once it lost popularity.

Many try and play the game on the other side of the world - a laggy ping gives the advantage of being harder to hit. On euro servers, anyone with a 70-80 ping I can't hit vs my <20 ping. Nemesis ping. Asia playing on euro servers in PUBG, where it's low TTK - 'kin nightmare. Ruins the game for all. The only decent servers are adminned, keeping the overall latency as close to LAN as possible.

I play Insurgency: Sandstorm as my shooter, and the amount of people that play with 200+ pings because they don't set the game region is mind-boggling - rubberbanding all over the place on screen - then they call the game rubbish because they can't hit anything, while making the server lag as it has to resolve to the highest ping. When you tell them they'd have a better game on a server nearer them, they argue that ping/latency doesn't make any difference. Or as 'Muricans say, "Doesn't matter, doesn't matter."

I think low-latency ONLINE gaming went when lots of private servers went. Many will think they are playing competitively yet know nothing about latency, IQ, fps or monitor refresh - and seeing as 1080p is still the res for 90% of gamers, I'm unsure how much latency adding in DLSS will actually introduce.

Dunno how much image adjustment at the machine end would add to server latency - nothing, I'd guess. It would just be harder to see detail. Though competitive gamers turn most gfx settings to low, so it's a bit of a moot point. Great image quality at native res on one hand - then all turned off because they want a competitive edge with low IQ.


PS - link to that RTX thread please, as I'm interested - I have looked, honest, but can't find it. Cheers
 
Last edited: