
Why Did Nvidia Blow Their Performance Advantage?

It's a good job by AMD. They have definitely over-delivered this time and really impressed beyond expectations.

I don't think Nvidia expected it to be honest. No one did.

It's excellent for the community to have two rivals trading blows.

However, the key will be consistency. Nvidia have that history of being consistent; AMD do not, so that's the next hurdle for AMD: can they do it next time round, and the time after that, etc.?

This is why I am wary of buying into an AMD ecosystem with a FreeSync monitor. AMD do not have the historical record behind them when it comes to pushing the GPU boundaries every generation. If I am going to be tied in, I'd rather be tied into the Nvidia ecosystem, as they deliver every generation.

I mean, the AMD people have been waiting a long time for a GPU to upgrade to that is at the top of the gaming charts.


With FreeSync monitors, though, you aren't tied into AMD. Nvidia cards work fine on FreeSync monitors, and 'G-Sync Compatible' FreeSync monitors are simply ones that pass all the VRR tests set out by Nvidia. You are only tied to Nvidia with a G-Sync monitor. Even many early FreeSync monitors work with Nvidia cards.
 
All for that, except the status quo means the last part you mentioned has been the opposite: gouging, scalping and scarcity mean the costs are 10-20% above what they need to be. I agree though, in the main, that this gen is better than the last gen for prices.

Anyone paying scalp prices is just an idiot as far as I’m concerned. Even AIBs are at it, with £700-900 cards when they are only £649 from Nvidia.
 
Anyone paying scalp prices is just an idiot as far as I’m concerned. Even AIBs are at it, with £700-900 cards when they are only £649 from Nvidia.

Yep, when the FE cards set the base price, you don't mind maybe a few percent markup if they have better coolers and features like an extra DisplayPort. Thing is, this gen the FE/reference coolers are probably better than most AIB attempts, so why pay £50-100+ more?
 
AMD blew their load on the 6000 series and still have no answer for deep learning, ray tracing and, largely, productivity.

Yet AMD are asking prices similar to Nvidia's. Why on earth would you go with AMD, objectively?
Indeed: look beyond raster performance and Nvidia have a much more comprehensive and forward-looking solution. AMD are going to need another big jump with the next generation to stay in touch. Nvidia's architecture is good but suffers on Samsung's fab; there are clock speed and power consumption benefits still on the table for Nvidia.
 
Indeed: look beyond raster performance and Nvidia have a much more comprehensive and forward-looking solution. AMD are going to need another big jump with the next generation to stay in touch. Nvidia's architecture is good but suffers on Samsung's fab; there are clock speed and power consumption benefits still on the table for Nvidia.


Raster performance is still king by far currently; by the time ray tracing is mainstream and supported in the majority of games, these cards will be a distant memory.
 
This, basically.

AMD are on par or a bit better at lower-resolution rasterization.

At 4K or ray traced, Nvidia are fairly significantly ahead.
What are you both watching? The 6800 XT is well ahead of the 3080 other than at 4K. With SAM, the 6800 XT is almost a 3090 at 1440p, and matches or edges out the 3080 in some games at 4K.
 
The FE is £649 and has a decent cooler.

The AMD card, from what I’ve read, has a pretty garbage stock cooler in comparison.

That is pure falsehood. The 6800 XT runs cooler than the 3080 FE. It is pegged at 70°C in games where the FE runs at 80°C (under the same conditions), and the fans in the 6800 XT are quieter too. It is the dividend of a lower power requirement and a larger heatsink design.
 
What are you both watching? The 6800 XT is well ahead of the 3080 other than at 4K. With SAM, the 6800 XT is almost a 3090 at 1440p, and matches or edges out the 3080 in some games at 4K.

Starting to wonder as well. In some games Nvidia win at 4K, in some AMD win. Yet the narrative on here is that Nvidia is better at 4K somehow, when many reviews show wins and losses for each. :confused:
 
At 4K without SAM, the 6800 XT is on par with the 3080. With SAM it is ahead, albeit by a small margin. Anyone saying otherwise is making it up and just doing a bit of Nvidia fanboy propaganda.
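As an aside, SAM is AMD's branding for PCIe Resizable BAR. On Linux you can sanity-check whether it is actually active by parsing the GPU's PCI `resource` file in sysfs: with Resizable BAR enabled, BAR0 spans the full VRAM rather than the traditional 256 MiB window. A minimal sketch; the PCI address shown is a made-up example, not a real path on your machine:

```python
import pathlib

def bar_sizes(resource_text):
    """Parse a Linux PCI 'resource' file: one 'start end flags' hex triple
    per line; a populated BAR's size is end - start + 1."""
    sizes = []
    for line in resource_text.strip().splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        sizes.append(end - start + 1 if end > start else 0)
    return sizes

# Hypothetical PCI address -- find yours with `lspci | grep VGA`.
dev = pathlib.Path("/sys/bus/pci/devices/0000:0a:00.0/resource")
if dev.exists():
    sizes = bar_sizes(dev.read_text())
    # On a 6800 XT with SAM active this should report ~16 GiB, not 0.25 GiB.
    print(f"BAR0: {sizes[0] / 2**30:.2f} GiB")
```

The `resource` file format (start address, end address, flags per BAR) is stable kernel ABI, so the parser itself needs no extra dependencies.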
 
What are you both watching? The 6800 XT is well ahead of the 3080 other than at 4K. With SAM, the 6800 XT is almost a 3090 at 1440p, and matches or edges out the 3080 in some games at 4K.

I'm really glad that AMD are competitive again, but you don't need to start relying on falsehoods. I've seen more benches than I could count, and the general consensus is that the 3080 and 6800 XT are a wash at pure raster in the 1440p middle ground, leaning towards the 3080 at 4K and the 6800 XT at 1080p.

At 1440p the results seem to diverge more by test conditions than anything else, which is about as much of a tie as you can get, and not something that anyone who buys these cards to play games with (rather than point at an extra millimetre on a bar chart and shout about it like they've won the World Cup) should concern themselves over.

 
I'm really glad that AMD are competitive again, but you don't need to start relying on falsehoods. I've seen more benches than I could count, and the general consensus is that the 3080 and 6800 XT are a wash at pure raster in the 1440p middle ground, leaning towards the 3080 at 4K and the 6800 XT at 1080p.

At 1440p the results seem to diverge more by test conditions than anything else, which is about as much of a tie as you can get, and not something that anyone who buys these cards to play games with (rather than point at an extra millimetre on a bar chart and shout about it like they've won the World Cup) should concern themselves over.

That’s interesting. I can also bring up similar charts from reputable reviewers that show the 6800 XT ahead.
 
That’s interesting. I can also bring up similar charts from reputable reviewers that show the 6800 XT ahead.

I'm sure you can, but that just serves my point that the differences mainly come from test conditions. Those were all the written benches I could find that had multi-game framerate averages (which is itself a bit of a pseudo-metric, as one anomalous game can completely throw them out of whack).

My point is that anyone trying to shout that either the 3080 or the 6800 XT is "winning" at 1440p raster only is either misinformed or just trying to score points and not being honest. Under those conditions they are about as close in performance as two completely different architectures on different nodes can be; it's uncanny, really.
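The "one anomalous game throws the average out of whack" point is easy to demonstrate. A toy sketch (the FPS numbers are made up purely for illustration) of why a single outlier inflates a plain arithmetic average, and why the geometric mean is the more robust way to summarize a multi-game suite:

```python
from math import prod

# Per-game average FPS for two hypothetical cards across a small suite.
fps_a = [120, 110, 95, 100, 105]   # card A: consistent results
fps_b = [115, 108, 98, 102, 310]   # card B: one anomalous outlier game

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # nth root of the product; far less sensitive to a single extreme value
    return prod(xs) ** (1 / len(xs))

# The outlier drags B's plain average way up...
print(arithmetic_mean(fps_a), arithmetic_mean(fps_b))
# ...while the geometric mean keeps the comparison much closer.
print(geometric_mean(fps_a), geometric_mean(fps_b))
```

This is essentially why a reviewer's "average FPS across N games" chart can flip depending on which games made it into the suite.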
 
I think it is more that AMD haven't been living up to their performance potential the last 2-3 generations - this was always coming once they put a bit more resources into it and got rid of some people who were holding things back, IMO.

Generally AMD and nVidia have been more or less at performance parity over the years.

There is also the factor that 7nm right now is expensive, so nVidia have given up a slight edge there with Samsung 8nm, but that won't be the case next round - with both likely on very similar footing silicon-wise, where nVidia has generally had the designs and resources to eke out a little more.
This
 
Yeah, good luck with that. The difference between the 3080 and 3090 is hardly mind-blowing (around 10-15%), so are they going to put out a card in between those performance levels (about 7.5% either way)? They can't even supply enough of the 30-series cards they've already launched, so any notion of a Ti variant is currently just fantasy.

If they do not release a 3080 Ti, how would they compete with the 6900 XT at £999? The supply issues are not as important to NVIDIA as retaining the performance crown.
 
At 4K without SAM, the 6800 XT is on par with the 3080. With SAM it is ahead, albeit by a small margin. Anyone saying otherwise is making it up and just doing a bit of Nvidia fanboy propaganda.
AMD are certainly very close, but I’m afraid your comments in a few threads now are starting to sound very Trump-like.

The reviews tell it like it is.
https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/35.html

https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-6800-xt-review/21/

https://www.techradar.com/uk/reviews/amd-radeon-rx-6800-xt


https://www.google.co.uk/amp/s/www.techspot.com/amp/review/2144-amd-radeon-6800-xt/
 
I think it is more that AMD haven't been living up to their performance potential the last 2-3 generations - this was always coming once they put a bit more resources into it and got rid of some people who were holding things back, IMO.

Generally AMD and nVidia have been more or less at performance parity over the years.

There is also the factor that 7nm right now is expensive, so nVidia have given up a slight edge there with Samsung 8nm, but that won't be the case next round - with both likely on very similar footing silicon-wise, where nVidia has generally had the designs and resources to eke out a little more.
Looking at transistors per mm², 7nm is only 5.4% better than Samsung's 8nm (comparing the 6800 XT to the RTX 3080), so I'm not sure the process lead is as big an advantage as we're led to believe.

I think it just comes down to costs at the end of the day. I would imagine if Nvidia tried building the…

Indeed: look beyond raster performance and Nvidia have a much more comprehensive and forward-looking solution. AMD are going to need another big jump with the next generation to stay in touch. Nvidia's architecture is good but suffers on Samsung's fab; there are clock speed and power consumption benefits still on the table for Nvidia.

GA102 only has 5.4% fewer transistors per mm² than Navi 21 (RTX 3080 vs 6800 XT), so I don't think the process node is that big of a deal. I suspect if you put GA102 on TSMC's 7nm process it would still top out at over 600mm² (assuming you could get the yields to make it cost-effective).
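For reference, the density claim is a straightforward division, and the percentage you get depends entirely on which transistor counts and die areas you plug in. A quick sketch using the die specs commonly cited at the time (treat the exact figures as assumptions):

```python
# Commonly cited die specs (approximate public figures, not official).
ga102 = {"transistors": 28.3e9, "area_mm2": 628.4}   # RTX 3080/3090 die, Samsung 8nm
navi21 = {"transistors": 26.8e9, "area_mm2": 520.0}  # RX 6800 XT/6900 XT die, TSMC 7nm

def density_mtr_per_mm2(chip):
    # Millions of transistors per mm^2.
    return chip["transistors"] / chip["area_mm2"] / 1e6

d_ga102 = density_mtr_per_mm2(ga102)
d_navi21 = density_mtr_per_mm2(navi21)
print(f"GA102:  {d_ga102:.1f} MTr/mm^2")
print(f"Navi21: {d_navi21:.1f} MTr/mm^2")
print(f"GA102 is {(1 - d_ga102 / d_navi21) * 100:.1f}% less dense")
```

With these particular figures the gap comes out larger than 5.4%, which illustrates how sensitive the comparison is to the source numbers; density alone also ignores that logic, SRAM and analog blocks shrink very differently between nodes.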
 
There was no performance advantage.

It's a game of transistor counts and fabrication sizes, which is determined by general technological advancements.

There may be no magic involved, but a substantial investment nevertheless has to be made.
 
It managed to make all Nvidia 3000-series owners claim they play every game at 4K with ray tracing enabled. And they will also play future games like that, even when we are talking about multiplayer games where FPS is king. :)
 
It managed to make all Nvidia 3000-series owners claim they play every game at 4K with ray tracing enabled. And they will also play future games like that, even when we are talking about multiplayer games where FPS is king. :)

Careful, you are going to trigger many Nvidia fans on this forum and get a lot of fingers pointed at you, calling you an AMD fanboi or shill, etc.
 