
Why Did Nvidia Blow Their Performance Advantage?

It's not a great way of looking at performance.

A large amount of the additional performance every generation comes from shrinking the transistors on the chips, as this typically leads to lower power draw, higher clock speeds and more physical transistors to do calculations with for any given chip area. Nvidia/AMD do not actually fabricate the chips themselves; that's done by the likes of TSMC, Samsung and other chip fabricators, who do the R&D into getting the node size down, while Nvidia/AMD simply design what is etched into the wafer. Generally speaking they're both limited to the same silicon as each other, and while each can have its own optimizations and tricks in software, fundamentally they both have the same performance ceiling every generation, which is the number of transistors available to do calculations with.
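Very roughly, and ignoring that modern node names are marketing labels rather than literal dimensions, transistor density scales with the inverse square of the linear feature size. A quick back-of-the-envelope sketch in Python (the node numbers below are only illustrative, not a real process model):

```python
# Rough illustration (not a real process model): how a linear shrink
# translates into transistor budget for a fixed die area.
# Node "names" (e.g. 12nm, 7nm) are treated as literal feature sizes here,
# which is an oversimplification -- real density depends on the cell library.

def relative_density(old_nm: float, new_nm: float) -> float:
    """Transistors per mm^2 scale roughly with 1 / (feature size)^2."""
    return (old_nm / new_nm) ** 2

# Hypothetical example: same die area, moving from a "12nm"-class node
# to a "7nm"-class node.
scale = relative_density(12.0, 7.0)
print(f"~{scale:.1f}x more transistors in the same area")  # ~2.9x
```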

A lot of the performance gap being closed this generation has come at the expense of Nvidia spending fewer transistors on improving rasterization performance and more on RT and Tensor cores. These take up physical space on the GPU itself, so they're trading rasterization performance for cores that accelerate other, more specific tasks. The RT cores handle ray tracing math, and the Tensor cores run machine learning algorithms to do things like DLSS.
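For what it's worth, the operation a Tensor core accelerates is essentially a small fused matrix multiply-accumulate (D = A·B + C on small low-precision tiles), which is what the neural network layers behind something like DLSS are built from. A minimal NumPy sketch of the math only (not how the hardware actually schedules or tiles it):

```python
import numpy as np

# Minimal sketch of the fused matrix multiply-accumulate (MMA) a tensor
# core performs on small tiles: D = A @ B + C.
# Real hardware works on fixed tile sizes (e.g. 4x4 FP16 inputs with a
# wider accumulator); this shows just the math, not the scheduling.

def mma_tile(A: np.ndarray, B: np.ndarray, C: np.ndarray) -> np.ndarray:
    """One tile of D = A*B + C: low-precision inputs, wider accumulation."""
    return A.astype(np.float32) @ B.astype(np.float32) + C

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)).astype(np.float16)
B = rng.standard_normal((4, 4)).astype(np.float16)
C = np.zeros((4, 4), dtype=np.float32)
D = mma_tile(A, B, C)
print(D.shape)  # (4, 4) -- a neural network layer is many such tiles
```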

While AMD do have good rasterization performance vs Nvidia this generation, they've been quiet about both ray tracing and resolution upscaling. Now that third-party reviewers have done benchmarks, we know their RT performance is poor, which should surprise no one; they do have RT acceleration units inside each CU, but they're probably spending far less of their transistor budget on them. And there's no Tensor core equivalent, so any kind of upscaling is bound to be poorer and to eat further into rasterization performance. They have Super Resolution in the works, but we've not seen or heard anything from it yet; my guess is that it won't compare well to DLSS in either performance or quality, but we'll see.
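A purely illustrative toy model of why shader-based upscaling eats into rasterization: if the upscale pass runs on the same shader ALUs as rendering, its cost adds straight onto the frame time, whereas work running on dedicated units can mostly overlap with rendering. All numbers below are made up:

```python
# Toy frame-time model (all numbers are made up, purely illustrative).
# Rendering at a lower internal resolution saves shader time, but a
# shader-based upscale pass spends part of that saving again because it
# runs on the same ALUs; upscaling on dedicated units largely overlaps.

def frame_time_ms(raster_ms: float, upscale_ms: float, dedicated: bool,
                  overlap: float = 0.8) -> float:
    """Serial cost on shared ALUs; mostly-hidden cost on dedicated units."""
    if dedicated:
        return raster_ms + upscale_ms * (1.0 - overlap)
    return raster_ms + upscale_ms

internal_1440p = 14.0   # hypothetical ms per frame rendering internally at 1440p
upscale_pass = 3.0      # hypothetical cost of the upscale pass itself

print(frame_time_ms(internal_1440p, upscale_pass, dedicated=False))  # 17.0
print(frame_time_ms(internal_1440p, upscale_pass, dedicated=True))   # 14.6
```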
 
Though it's hard to envision just being able to go into a shop and pick any of these cards up for MSRP for a long time.

That is basically what will drive price, not really feature sets. The 3080, 6800 and 6800 XT are all highly capable cards, so whichever gets in stock will get lots of sales.

The 3070 offers the least value IMO. 8GB of VRAM for £500 isn't appealing, and its RT performance is poor anyway.
 
I'm really glad that AMD are competitive again, but you don't need to start relying on falsehoods. I've seen more benches than I can count, and the general consensus is that the 3080 and 6800 XT are a wash at pure raster in the 1440p middle ground, leaning towards the 3080 at 4K and the 6800 XT at 1080p.

At 1440p the results seem to diverge more by test conditions than anything else, which is about as much of a tie as you can get, and not something that anyone who buys these cards to play games with (rather than to point at an extra millimetre on a bar chart and shout about it like they've won the World Cup) should concern themselves over.

[Benchmark charts]


What about 4K (probably the most popular resolution for these GPUs)?
 
The 6800 XT's performance drops to RTX 3070/2080 Ti levels or worse in games with heavy use of ray tracing, and it doesn't have DLSS to help it.


I'd take RTX3080 over the 6800XT.

Not that I can buy either right now.
 
Is there not a danger that PC gamers will be driven away altogether by these extremely poorly handled 'launches'? For instance, I have not been able to get a suitable upgrade for my system for some time now, and I am starting to question whether I still want to spend the kind of money they (retailers and manufacturers) want for the new tech. A new Ryzen CPU and a 6800 XT/3080 is an awful lot of money; the Xbox is attractively priced at £449, though I can't get one of them either, for now! Why are computer parts retailers scalping their own customers as well? I have to admit I am getting sick of what's happening just now with PC stuff and am actually considering chucking it all in altogether. There are plenty of other things to spend money on these days!
 
I get the impression that a lot of nVidia's focus is in non-gaming applications.

That their GPUs can also be used to game is something of a bonus, these days :p

Tensor this, CUDA that.
 
Complacency, it's as simple as that. Fairly normal thing to happen in the business and technology world. Companies don't stay on top forever, see 3dfx and Nokia.

I am pretty hopeful AMD will try to gain market share after Christmas; they only have 20% or so. They will likely wait for old stock of 5000 series cards to clear and for the initial wave of demand for the new cards to subside, and then we might see some better deals on new cards, I hope. It's also going to be really hard to justify a gaming PC if the new consoles do essentially the same for half the money.
 
A lot here bigged up RT in Turing as making it future-proof but were first in line to buy Ampere. It was the same with the tessellation advantage with Fermi, and yet some of the most vocal defenders on here quickly upgraded to Kepler. Suddenly power consumption isn't important any more when it was a big advantage before. The same goes for VRAM.

Very true. It's interesting that the 10 series owners who held back are not so full of praise for the Turing/Ampere launches, as they must feel the same. It does seem to point at a fanboy sect within each brand's ownership that trumpets the narrative at any cost, as it's all they know.
 
The 6800 XT's performance drops to RTX 3070/2080 Ti levels or worse in games with heavy use of ray tracing, and it doesn't have DLSS to help it.


I'd take RTX3080 over the 6800XT.

Not that I can buy either right now.
I think there are still driver optimisations yet to be done by AMD for 4K and ray tracing, as the performance drop is much bigger than warranted. For instance, look at the game below: the 6800 XT destroys the 3080 at 1440p and 1080p, but performance falls off a cliff at 4K. It will be interesting to see how the 6900 XT competes with the RTX 3080 at 4K, as that is the resolution these cards should be used at.
[Benchmark chart for the game referenced above]
 
Basically, an architecture path takes a long time to design and go down; sometimes you don't find out until you're committed that your next-gen super sauce has issues or isn't going to last as long as you thought.

When company A catches up to and/or surpasses company B, it's never just because company A did well; it's also because company B made a mistake or had a problem. To put it in perspective:

  • AMD didn't catch Intel in the early 00s just because the Athlon XP and its successors were good; it was also because Pentium 4/D was such a failure that Intel had no choice but to go back to the Pentium III and fork development in a new direction for the "Core" architecture.
  • ATi didn't catch Nvidia in the early 00s just because the Radeon 9000 series and its successors were good; it was also because the GeForce 4 series was just a lineup of enhanced GeForce 2/3 cards designed as a stopgap for GeForce FX, and because FX turned out to be a disappointment.
  • Intel didn't pull ahead of AMD a decade ago just because Core i3/5/7 were good; it was also because the AMD FX architecture was such a failure that AMD had no choice but to abandon it after two generations and go back to the drawing board.

Nvidia gambled big that investing resources in features like RTX, DLSS, etc. would be such a massive game changer that AMD not having them would make customers view AMD's products as flat-out inferior regardless of what the FPS charts said (similar to how Nvidia's hardware T&L and 32-bit colour support advantage over 3dfx turned the #1 brand into a third-tier budget choice almost overnight despite it being competitive on performance).
 
What about 4K (probably the most popular resolution for these GPUs)?
There are actually a lot more prospective customers for these cards with 100Hz+ 1080p/1440p screens than there are 4K60 ones.

The general trend over the past few years has been early adopter 4K users dropping to lower resolution monitors to gain smoother gameplay at the expense of a few pixels.
 
I agree on the recent pricing by AMD, but it seems AMD, Intel and Nvidia pricing is opportunistically high to see if it will stick. With all these products selling out, it seems enough people don't care! :(
This is my thinking. TBH, I always said that as a main hobby it was quite cheap to maintain a top-end machine: £20 per week gives me roughly £1,500 in an 18-month cycle, to go with any funds raised from selling the previous high-end parts, which in the past left me with plenty left over.
I always knew that I would be prepared to pay more of that 18-month pot if prices went up, and I guess there are a lot of people in the same boat.
Do I like it? No. Do I still buy without remorse at the new higher prices? Yes.
Unfortunately, I think the mining craze and these shortages have shown that people are still prepared to buy, so I'm guessing these "RRP" prices are going to stick.
Lockdown further compounded the issue and has shown people will pay a lot more than RRP, but outside of the current situation I couldn't see prices above RRP selling, so I expect the current RRP prices to stick for the long term.
 