Why Did Nvidia Blow Their Performance Advantage?

Nvidia have been sitting way out in front for the last few gens, so why have they let that massive performance gap more or less disappear overnight?

Did they simply underestimate AMD's ability to come back from literally nowhere?

Have they pushed up profits and margins to the point that they are relying on us snapping up anything available to purchase no matter what?

Or did they just get arrogant and cocky?

I'm feeling their response to SAM with Resizable BAR mirrors what AMD said about G-Sync: it's in the PCIe spec and they can do it too (see the sketch at the end of this post).

So where has it all gone wrong with Nvidia?

Or is it just business as usual with Nvidia, nothing to see?
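
On the SAM/Resizable BAR point: Resizable BAR genuinely is a standard PCIe extended capability (ID 0x0015) rather than anything vendor-specific, which is why both sides can offer it. As a toy illustration only - my own sketch, assuming Linux sysfs, root access and a placeholder bus address, not anything from AMD or Nvidia - you can walk a card's extended capability list and see whether it advertises the capability:

```cpp
// Toy sketch: walk a PCI device's extended capability list via Linux sysfs
// and report whether the Resizable BAR capability (ID 0x0015) is present.
// Assumes root access; the bus address below is a placeholder.
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <fstream>
#include <iterator>
#include <vector>

int main() {
    // Placeholder address - substitute your own GPU's (see `lspci`).
    const char* path = "/sys/bus/pci/devices/0000:01:00.0/config";
    std::ifstream cfg(path, std::ios::binary);
    std::vector<char> buf((std::istreambuf_iterator<char>(cfg)),
                          std::istreambuf_iterator<char>());
    if (buf.size() <= 0x100) {  // extended config space needs a 4 KiB read (root)
        std::fprintf(stderr, "no extended config space visible (run as root?)\n");
        return 1;
    }
    // Extended capabilities form a linked list starting at offset 0x100;
    // each 32-bit header is [31:20] next offset, [19:16] version, [15:0] ID.
    size_t off = 0x100;
    for (int steps = 0; off && off + 4 <= buf.size() && steps < 64; ++steps) {
        uint32_t hdr;
        std::memcpy(&hdr, &buf[off], sizeof hdr);
        if ((hdr & 0xFFFF) == 0x0015) {  // 0x0015 = Resizable BAR
            std::printf("Resizable BAR capability found at 0x%zx\n", off);
            return 0;
        }
        off = (hdr >> 20) & 0xFFC;       // next pointer, DWORD aligned
    }
    std::printf("Resizable BAR capability not advertised\n");
    return 0;
}
```

Recent versions of `lspci -vv` decode the same capability, so that's the zero-effort way to check; the point is just that it's plain PCIe plumbing, not secret sauce.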
 
Honestly I think the consoles hold PC/NVIDIA back. The last gen of consoles really skimped on the hardware. There is a lot of money tied up in console development between Xbox and Sony. It's only recently that devs have even realised the value of selling their product to the PC. The number of people with 4K60-capable PCs is also tiny. There isn't profit in purely focusing on making the best 4K version of a game yet.
 
I suspect they gambled, or were complacent enough as leaders to think they didn't really need to try that hard, and could therefore choose cheaper parts and max out margins. If they hadn't been met with RDNA2, I'd have expected NV to have peddled Ampere at much higher prices and succeeded at that goal.
 
Or you could give AMD some credit.

It was clear long ago that GCN couldn't do it anymore and they abandoned the enthusiast tier to work on RDNA... Now it's here and they've proved they can start from scratch and still get back to the top.
 
I think it is more that AMD haven't been living up to their performance potential the last 2-3 generations - this was always coming once they put a bit more resources into it and got rid of some people who were holding things back, IMO.

Generally AMD and nVidia have been more or less on performance parity over the years.

There is also the factor that 7nm right now is expensive, so nVidia have given up a slight edge there with Samsung 8nm. That won't be the case next round - with both likely on a very similar footing silicon-wise, where nVidia have generally had the designs and resources to eke out a little more.
 
Nvidia have 3 systems working as one on their GPUs: CUDA, RT and Tensor cores. I don't think they did blow their performance advantage, as we can already see how far ahead of AMD they are in RT, ML and 4K+ rasterisation benchmarks, while on a poorer node.
 
This, basically.

AMD are on par or a bit better at lower-resolution rasterisation.

At 4K or with ray tracing, Nvidia are fairly significantly ahead.
 
Good job, AMD. They have definitely over-delivered this time and really impressed beyond expectations.

I don't think Nvidia expected it to be honest. No one did.

It's excellent for the community to have two rivals trading blows.

However, the key will be consistency. Nvidia have that history of being consistent; AMD do not, so that's the next hurdle for AMD: can they do it next time round, and the time after that, etc...

This is why I am wary of buying into an AMD ecosystem with a FreeSync monitor. AMD do not have the historical record behind them when it comes to pushing the GPU boundaries every generation. If I am going to be tied in, I'd rather be tied into the Nvidia ecosystem, as they deliver every generation.

I mean, AMD people have been waiting a long time for a GPU at the top of the gaming charts to upgrade to.
 
saying that nvidia "blew their lead" is a very reductive mindset. pure rasterisation is a time-honoured, conservative approach to real-time rendering. it does the job, but essentially you're stuck using limited techniques. there's a reason graphical leaps have stagnated so much and you can play a five year old game and it'll look almost as good as a brand new one.

we're reaching the limit of pure rasterisation and all its tricks; there's a reason all high-end CGI has been using global illumination for decades now. ray tracing, global illumination and eventually path tracing are necessary for the next step, and AI reconstruction techniques like DLSS will be required to get us there (toy sketch at the end of this post).

AMD have gone all out on the traditional methods, and if you're only interested in pure rasterised performance per pound (and watt) then they'll have you covered. Nvidia have taken a more future-facing approach, and for people who want to get in on the ground floor, the tech has matured to the point where it's possible without massive sacrifices (unlike the 2xxx series).

it's a great time to be buying a card whichever mindset you subscribe to, if only we could actually buy them.
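
to put a rough number on why reconstruction matters: raster shading is one cheap local evaluation per pixel, while a path tracer has to average lots of random samples to estimate light bouncing around the scene, and the noise falls off slowly as you add rays. here's a toy CPU sketch (entirely made up for illustration, nothing like a real renderer, every constant an assumption):

```cpp
// Toy contrast between raster-style and path-traced-style shading of a
// single surface point. Illustrative only - not any vendor's pipeline.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <random>

struct Vec { double x, y, z; };
static double dot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// raster-style: one analytic direct-light (N.L) evaluation per pixel
double rasterShade(Vec n, Vec lightDir) {
    return std::max(0.0, dot(n, lightDir));
}

// path-traced-style: Monte Carlo average over random hemisphere directions,
// standing in for light arriving indirectly from the whole scene
double pathTraceShade(Vec n, int samples, std::mt19937& rng) {
    const double kPi = 3.14159265358979323846;
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < samples; ++i) {
        // uniform sample on the hemisphere around n = (0, 0, 1)
        double z = u(rng);
        double r = std::sqrt(std::max(0.0, 1.0 - z * z));
        double phi = 2.0 * kPi * u(rng);
        Vec d{ r * std::cos(phi), r * std::sin(phi), z };
        // pretend incoming light is 1.0 from everywhere (a white sky);
        // a real estimator would trace the ray and divide by the sample pdf
        sum += dot(n, d);
    }
    return sum / samples;  // noisy estimate; error shrinks only as 1/sqrt(N)
}

int main() {
    Vec n{0, 0, 1};      // surface facing straight up
    Vec light{0, 0, 1};  // light directly overhead
    std::mt19937 rng(42);
    std::printf("raster, 1 evaluation:    %.4f\n", rasterShade(n, light));
    for (int s : {16, 256, 4096})
        std::printf("path traced, %4d rays: %.4f\n", s, pathTraceShade(n, s, rng));
    return 0;
}
```

the raster line costs one evaluation and is rock solid; the path-traced estimate needs thousands of rays per pixel before the noise settles, which is exactly the gap denoisers and reconstruction techniques like DLSS are there to close.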
 
Because they are not using TSMC 7nm.
 
I don't really see that they have blown their advantage, more that they allowed the raster aspect to come second to the development of RT/Tensor and their integration into a single package. That jump into hardware RT comes at a price somewhere: nVidia paid the price in rasterisation with the 2xxx series, allowing AMD to catch up, and AMD are paying with the 68xx series in their RT not matching nVidia's.

In terms of RT, nVidia still have what looks like a notable performance advantage. Given the gradual trend towards RT, the cost of dropping rasterisation performance for a generation could well be worth it long term.
 