(Computerbase.de) Fine wine drivers and performance over time - 6800XT vs 3080

Computerbase have a new article about what performance AMD and Nvidia have been able to get out of their drivers since the release of the 6800XT and the 3080:
https://www.computerbase.de/2022-03/geforce-radeon-treiber-benchmark-test/
(https://www-computerbase-de.transla...uto&_x_tr_tl=en&_x_tr_hl=en-US&_x_tr_pto=wapp)

Some highlights:
[charts: rasteriser average FPS and percentile FPS, driver gains since launch]
So for average FPS in rasterisation, AMD managed to squeeze out another 10% compared to the launch drivers vs 3% for Nvidia. Minimum FPS (rasteriser, percentile FPS) shows the same pattern.
RT min FPS:
[charts: ray tracing percentile FPS, driver gains since launch]
There the figures are +15% and +6%.
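To put those percentages into fps terms, here's a quick back-of-the-envelope sketch in Python. The launch-day frame rates below are made-up placeholder numbers, not from the article; only the percentage gains are the ones quoted above.

```python
# Rough sketch of what the reported driver gains mean in fps terms.
# Launch fps values are hypothetical; the gain percentages are from the charts above.
launch_fps = {"6800 XT raster": 90.0, "3080 raster": 95.0,
              "6800 XT RT": 45.0, "3080 RT": 60.0}
driver_gain = {"6800 XT raster": 0.10, "3080 raster": 0.03,
               "6800 XT RT": 0.15, "3080 RT": 0.06}

for card, fps in launch_fps.items():
    new_fps = fps * (1 + driver_gain[card])
    print(f"{card}: {fps:.0f} fps at launch -> {new_fps:.1f} fps on current drivers")
```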
 
I thought the graphs show performance improvement as a percentage since the driver release, so what are you referring to in your general sweeping statement? It actually shows the AMD drivers gave the biggest improvement since release.
Yes, isn't that what I said?

Compared to launch drivers, AMD managed to gain 10% from their driver improvements while Nvidia managed to gain another 3%.

Guess two screenshots beside each other would be better:
[image: the two driver-gain charts side by side]
 
Nvidia still better. The new 3080 12GB actually even beats the 6900XT in rasterization.
Power usage of the 12GB 3080 is pretty poor though; TPU had it at near 400W. That doesn't matter to some people (although ironically it was a big talking point in previous generations).

It would be expensive with prices as crazy as they are, but what interests me more is taking a high-end card (so GA102 or Navi 21) and clocking it lower to hit the perf/watt sweet spot. If it wasn't for the crazy prices, taking a 300W card and running it under 200W while only losing 5-10% performance would appeal to me.
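For what it's worth, on the Nvidia side the blunt way to do that is just dropping the board power limit. A minimal sketch, assuming the NVML Python bindings (the nvidia-ml-py package) and admin rights; the 200W target is just the figure above, and whether a given card accepts it depends on the limits its vBIOS exposes. AMD cards would need a different route (the driver's tuning panel, or sysfs on Linux), not shown here.

```python
# Sketch: cap a GeForce card's board power using the NVML Python bindings
# (pip install nvidia-ml-py). Needs admin/root privileges.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"current limit {current_mw/1000:.0f} W, allowed {min_mw/1000:.0f}-{max_mw/1000:.0f} W")

target_mw = 200_000                                     # the ~200 W figure from the post
if min_mw <= target_mw <= max_mw:
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print("limit set to 200 W")
else:
    print("200 W is outside what this card/driver will accept")

pynvml.nvmlShutdown()
```

Most people chasing the perf/watt sweet spot go further and undervolt via the frequency/voltage curve (e.g. in MSI Afterburner or AMD's tuning panel), which usually costs less performance than a straight power cap.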
 
Tells me AMD screwed up the release driver?

I remember FineWine(tm) also used to mean AMD cards doing better in games that hadn't even been released yet, thanks to a more forward-looking architecture.

It also had something to do with not having the budget to release separate gaming and compute chips. Now that they have a larger budget and design RDNA and CDNA separately I imagine this will be less of a thing.

It was in response to post #2, from the forum member who has it quoted up there in big letters.
Sorry, I thought it was my opening post. It's nice that this forum previews the images, but it can make it hard to read what you're typing. Or at least that's my excuse for any typos!
 
As above, Nvidia get less of an improvement because the drivers they have are better to start with, so there is less to be gained through driver optimisations.

That's how I read it.
They are by far the bigger company for GPUs (80% vs 20% or worse for AMD), and they do polish their drivers more before a release. On the other hand, with an 80% market share they largely have to compete with themselves for new sales. That doesn't necessarily mean building in planned obsolescence, but it can be a factor. Some of Nvidia's generations aged badly mainly because they skimped on VRAM or bandwidth. This generation, 16GB on the 6800 and 12GB on the 6700 was more generous on the AMD side (while 8GB on the 3070 and 10GB on the 3080 was far less so), but against that AMD really went low on bus width and VRAM with the 6600 and 6500 cards.

With an 80% market share and more game sponsorships, Nvidia are also more likely to have release-day drivers, despite the consoles. Sponsorship and pushing certain techs have been used and abused to distort the market for years though.

Have to say that the Nvidia control panel is pretty poor compared to AMD's though.
 
This thread does seem to have triggered a lot of Nvidia defenders! That wasn't the intention.

No one will even care that much outside of owners of said cards who are stubborn or can't afford RDNA3 / Lovelace.
Well, I've been of the opposite view for years - I want and expect my cards to last and don't feel compelled to upgrade to the latest all the time.

I remember when Nvidia sold all those millions of parts with bad solder (arguably their biggest mistake ever, but they got off very lightly aside from losing Apple as a customer for good). While I was pretty upset about it, the vibe I got from the Nvidia defenders was that cards like my 8800GT were already over two years old at that time and that all real hardware "fans" would have upgraded ages ago; ergo, it wasn't a real issue. A very strange attitude.

AMD RDNA2 is already really crap where it matters right now: ray tracing. It's DOA even if drivers can gain another 10%.
Despite scoring the 3050 at MSRP, I haven't actually tried any ray tracing yet.

Thing is, I play a lot of older modded games like Skyrim, and the modlists now mostly come with, and are designed around, game-engine mods like ENB. They tend to turn up the effects like crazy, so I have to spend ages turning lots of them off (for instance I find depth of field nauseating, and other effects are OTT too). And after all that, everything is still too dark. I get that it's supposed to be more realistic, like how a cameraman would be very proud of good shots in the dark, but while I can appreciate it's technically hard, I watch films and play games to enjoy myself, not to squint at a screen which is 90% black.

Now, ray tracing doesn't have to look like someone let loose with a full movie effects unit, but I'm steering clear for now. So, for me, rasterisation performance is what matters the most.
 