
Why Did Nvidia Blow Their Performance Advantage?

It also depends how you look at it. Right now Nvidia's ray tracing performance is over double what AMD have, so has Nvidia really lost their crown?

If all you care about is raster load then there might be an argument, but that is by choice: Nvidia chose to dedicate a large part of their die to RT and Tensor cores, and they chose to make their cards on Samsung's 8nm instead of the double-transistor-density 7nm that AMD needs just to catch up.
 
Agreed with most. They focused mainly on RTX/DLSS, where they are still well ahead in big titles. So they are still very relevant, especially considering that the FE is £649 and has a decent cooler.

The AMD card, from what I've read, has a pretty garbage stock cooler in comparison.

The only thing letting them down is the 8nm process, which is pretty power-inefficient compared to RDNA 2.
 
Nvidia have been sitting way out in front for the last few gens, so why have they let that massive performance gap more or less disappear overnight?

Did they simply underestimate AMD's ability to come back from literally nowhere?

Have they pushed up profits and margins to the point that they are relying on us snapping up anything available to purchase no matter what?

Or did they just get arrogant and cocky?

All of the above.

They obviously didn't learn from seeing Intel get rumbled where it hurts.

One thing that may turn out to be a nice acquisition is ARM: if you've seen the M1 from Apple, they may have some killer combo in the future if that gets traction.
 
Don't think so Grim, they just got caught napping and picked a poor node when they went with Samsung. Is ray tracing the next iteration? :p

Not necessarily. The 5700 XT only had 2,560 shaders and matched the 2070/2070 Super, so it was pretty obvious an AMD card with RDNA 2 improvements and a 4,000-5,000 shader count could match or beat a 3080.

They probably knew it wouldn't match them on RTX/DLSS, though.

If anything it means we get faster and cheaper cards on both sides.
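The back-of-the-envelope reasoning in the post above can be sketched as a quick calculation. This is only a rough heuristic, not a benchmark: the shader count and uplift figures are illustrative assumptions, and real performance doesn't scale perfectly linearly.

```python
def scaled_performance(base_perf, base_shaders, new_shaders, per_shader_uplift=1.0):
    """Naively estimate performance by scaling shader count,
    times an architectural per-shader uplift factor."""
    return base_perf * (new_shaders / base_shaders) * per_shader_uplift

# 5700 XT: 2560 shaders, call its performance 100 "points" (~2070 Super level).
# A hypothetical RDNA 2 card with ~4600 shaders and a 10% per-shader gain:
estimate = scaled_performance(100, 2560, 4608, per_shader_uplift=1.1)
print(round(estimate))  # ~198, i.e. roughly double a 2070 Super
```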
 
Just going to throw this out here, but why are we assuming that Nvidia blew their lead? In isolation, Ampere is an impressive performance increase over Turing.

Did Nvidia blow their lead, or did a focussed, non-cash-strapped AMD simply make a momentous comeback?

If this is AMD making a comeback, what else can they achieve in future with this new focus? Well, we'll have to wait and see.
 
I think Nvidia released an excellent chip in the 3080, and credit to AMD, who very nearly matched it.

It would take Nvidia releasing a 4080 Ti tomorrow to reinstate the performance gulf that once was. Can't we just be happy that there is finally some competition in the sector?
 
Not necessarily. The 5700 XT only had 2,560 shaders and matched the 2070/2070 Super, so it was pretty obvious an AMD card with RDNA 2 improvements and a 4,000-5,000 shader count could match or beat a 3080.

They probably knew it wouldn't match them on RTX/DLSS, though.

If anything it means we get faster and cheaper cards on both sides.

All for that, except the status quo means the last part you mentioned has been the opposite: gouging, scalping and scarcity mean prices are 10-20% above where they need to be. Agree though, in the main, that this gen is better than the last gen for prices.
 
When you're out in front for so long you get complacent, same as Intel did.

Came in to say this. Plus, in the absence of any meaningful competition, why keep pushing for bigger leaps forward in performance when they can keep releasing minor improvements and charge whatever they want? AMD would probably do the same if the situation were reversed, so ongoing parity (or near it) is great for us customers.
 
Saying that Nvidia "blew their lead" is a very reductive mindset. Pure rasterisation is a time-honoured, conservative approach to real-time rendering. It does the job, but essentially you're stuck using limited techniques. There's a reason graphical leaps have stagnated so much and you can play a five-year-old game and it'll look almost as good as a brand new one.

We're reaching the limit of pure rasterisation and all its tricks; there's a reason all high-end CGI has been using global illumination for decades now. Ray tracing, global illumination and eventually path tracing are necessary for the next step, and AI reconstruction techniques like DLSS will be required to get us there.

AMD have gone all out on the traditional methods, and if you're only interested in pure rasterised performance per pound (and watt) then they'll have you covered. Nvidia have taken a more future-facing approach, and for people who want to get in on the ground floor, the tech has matured to the point where it's possible without massive sacrifices (unlike the 2xxx series).

It's a great time to be buying a card whichever mindset you subscribe to, if only we could actually buy them.
Very good post imo.
 
They didn't, though. Once they release the 3080 Ti in Jan 2021 they'll retain the performance crown.

I don't subscribe to that solution mindset on this, as they will probably be equally sought after and so almost as hard to buy. Then you'll have the guys selling their 3080s to 'upgrade' because they don't want to feel inferior again so quickly, and with the scalping and retailer gouging it's again going to cost way more than the FE version.
 
They didn't, though. Once they release the 3080 Ti in Jan 2021 they'll retain the performance crown.

Yeah, good luck with that. The difference between the 3080 and 3090 is hardly mind-blowing (around 10-15%), so are they gonna put out a card in between those performance levels (about 7.5%)? They can't even supply enough of the 30-series cards they've already launched, so any notion of a Ti variant is currently just fantasy.
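For what it's worth, the 7.5% quoted above is just half of the upper end of that 10-15% gap, i.e. where a card sitting exactly halfway between a 3080 and a 3090 would land. A trivial illustration, assuming the quoted gap:

```python
gap = 0.15                       # upper end of the quoted 3080 -> 3090 gap
halfway_uplift = gap / 2         # a card halfway between the two
print(f"+{halfway_uplift:.1%} over a 3080")  # prints "+7.5% over a 3080"
```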
 
To me it seems Nvidia is more focused on datacentres and will repurpose that architecture to an extent for gaming. Heck, in the GA102 white paper the first thing they talk about is datacentre applications alongside the gaming uses for GA102. This has paid dividends for them, as they now make more money in the datacentre arena than in gaming, so it makes absolute sense to focus their priority there, especially having had no competition up until now in the desktop market, where their offerings were holding the performance crown anyway. Hardware did a good review showing how it can be hard to keep all those cores fed on the 3xxx, which is why Ampere comes into its own as you increase resolution. Would also throw in that at current prices it's not quite a whitewash anyway; in the case of the 6900 XT and 3080, I'd argue the latter offers some compelling features, and I agree largely with what hambucker said.

With all that said, as an enthusiast (I would like to think), it's absolutely great news that AMD has come out with something amazing in their own right. It is fantastic to see AMD come from so far behind to offer something as competitive as they have, and implement features like SAM, which Nvidia is now following up on.

They didn't, though. Once they release the 3080 Ti in Jan 2021 they'll retain the performance crown.

They will be hard to buy. Performance numbers of the Ti model are not really an unknown quantity: take a look at the current 3090 and knock 1-2% off.

Yeah, good luck with that. The difference between the 3080 and 3090 is hardly mind-blowing (around 10-15%), so are they gonna put out a card in between those performance levels (about 7.5%)? They can't even supply enough of the 30-series cards they've already launched, so any notion of a Ti variant is currently just fantasy.

Don't think it will be as hard as people think. At this point the 3080 Ti will just use the 3090 cores. You can see 3090 sales slowing, partly because FE models stay available for hours when they go up. Looking around, you can also grab a 3090 right now from retailers including OCUK, who have three SKUs with 10+ available. For the retailers who list backlogs, the backlogs on some models are non-existent and orders are getting delivered. Sure, some SKUs are on back order, but that's going down by a lot. Eventually there will be a point where everyone willing to splash out so much on a 3090, which offers marginal performance over a 3080, will have one, and Nvidia will have complete dies that, rather than going into a 3080, go into a higher-margin 3080 Ti. Just my two cents.
 
AMD blew their load on the 6000 series and still have no answer on deep learning, RT and, largely, productivity.

Yet AMD are asking prices similar to Nvidia's. Why on earth would you go with AMD, objectively?

The 6000 series is a gaming card; Vega was pulling double duty as a gaming/prosumer card, so obviously the 6000 series isn't as good at the things Vega was. And how do they have no answer to RT when the 6000 series supports it? The speed isn't as good currently, but that can improve over time, and it's not like there is an assload of games currently out supporting the feature. Said it before and I'll say it again: RT is niche at best, despite being a feature on the Nvidia cards for two years now. It will gain traction eventually, but right now, compared to traditional rendering, it's a fraction of a fraction of a percent in terms of games that support it versus games that don't.
 
Yeah good luck with that, the difference between 3080 and 3090 is hardly mind blowing (around 10-15%), so are they gonna put out a card in between those performance levels (about 7.5%)? They can't even supply enough of the 3 series they've already launched so any notions of a ti variant is currently just fantasy.
If I may, the performance difference vis-à-vis the 3080 and the 6800 XT, and by extension the 6900 XT, is more than a mere 7.5%.

Nvidia have invested heavily in productivity and deep learning, including DLSS. In fact, Nvidia are still in the early but very promising days with DLSS.

AMD did well here but let's not kid ourselves.
 
The 6000 series is a gaming card; Vega was pulling double duty as a gaming/prosumer card, so obviously the 6000 series isn't as good at the things Vega was. And how do they have no answer to RT when the 6000 series supports it? The speed isn't as good currently, but that can improve over time, and it's not like there is an assload of games currently out supporting the feature. Said it before and I'll say it again: RT is niche at best, despite being a feature on the Nvidia cards for two years now. It will gain traction eventually, but right now, compared to traditional rendering, it's a fraction of a fraction of a percent in terms of games that support it versus games that don't.

That it is. I'm not saying it's a bad card, far from it.

Regarding ray tracing, I'm not looking at support, I'm looking at tangible results.
 
Don't worry, Intel will be along around February to sell us a card that's within 5% of the 3080 and 6800 XT, at £675 MSRP. Long live 'competition'!
 