
(Computerbase.de) Fine wine drivers and performance over time - 6800XT vs 3080

Soldato · Joined: 31 Dec 2010 · Posts: 2,682 · Location: Sussex
Computerbase have a new article about what performance AMD and Nvidia have been able to get out of their drivers since the release of the 6800XT and the 3080:
https://www.computerbase.de/2022-03/geforce-radeon-treiber-benchmark-test/
(https://www-computerbase-de.transla...uto&_x_tr_tl=en&_x_tr_hl=en-US&_x_tr_pto=wapp)

Some highlights:
sKPYwFP.png
cukf9tN.png
So for average FPS at raster, AMD managed to squeeze out another 10% compared to launch drivers vs. 3% for Nvidia. Min FPS (rasterizer, percentile FPS) shows the same picture.
RT min FPS:
BRsebDa.png
RFYroIC.png
The figures are +15% for AMD and +6% for Nvidia.
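To make the arithmetic explicit, the percentages are relative gains over launch-driver performance. A minimal sketch with made-up FPS numbers (not taken from the article):

```python
# Relative driver gain over launch performance.
# The FPS numbers below are hypothetical, not from the Computerbase article.
def driver_gain(launch_fps: float, current_fps: float) -> float:
    """Percentage improvement of the current driver over the launch driver."""
    return (current_fps / launch_fps - 1.0) * 100.0

# A card doing 100 fps on the launch driver and 110 fps today gained 10%.
print(round(driver_gain(100.0, 110.0), 1))  # AMD-style gain: 10.0
print(round(driver_gain(100.0, 103.0), 1))  # Nvidia-style gain: 3.0
```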
 
I thought the graphs show performance improvement as a percentage since driver release, so what are you referring to with your general sweeping statement? It actually shows the AMD driver gave the biggest improvement since release.

So for average FPS at raster, AMD managed to squeeze out another 10% compared to launch drivers vs. 3% for Nvidia.
 
Amusingly, though, this has always been the case.

Nvidia has a superior driver team, so most of the performance is there from the get-go, whereas we have to wait an unknown amount of time for AMD to catch up and deliver what the card was supposedly capable of.

Add RT on top and, well, it's not quite fine wine any more.

But RDNA3 may tell a different story with RT... well, one can hope.
 
Tells me AMD screwed up the release driver?

I remember finewine(tm) also used to mean AMD cards performing better in yet-unreleased games, thanks to a more advanced architecture.
 
I thought the graphs show performance improvement as a percentage since driver release, so what are you referring to with your general sweeping statement? It actually shows the AMD driver gave the biggest improvement since release.
Yes, isn't that what I said?

Compared to launch drivers, AMD managed to gain 10% from their driver improvements while Nvidia managed to gain another 3%.

I guess two screenshots side by side would be better:
4AHkvKn.png
 
Nvidia is still better. The new 3080 12GB actually beats the 6900 XT in rasterization.
Power usage of the 12GB 3080 is pretty poor though. TPU had it at nearly 400W. That doesn't matter to some people (although, ironically, in previous generations it was very hyped).

It would be expensive with prices as crazy as they are, but what interests me more is taking a high-end card (GA102 or Navi 21) and clocking it lower to hit the perf/watt sweet spot. If it weren't for the crazy prices, taking a 300W card and running it under 200W while only losing 5-10% performance would appeal to me.
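To put a number on that trade-off, here is a rough perf/watt sketch with made-up figures (the 7% performance loss is assumed, not measured):

```python
# Perf/watt when power-limiting a GPU. All numbers are illustrative;
# real results depend on the card's voltage/frequency curve.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

stock = perf_per_watt(100.0, 300.0)     # baseline: 100 fps at 300 W
limited = perf_per_watt(93.0, 200.0)    # assumed ~7% slower at 200 W
gain = (limited / stock - 1.0) * 100.0
print(f"Efficiency gain: {gain:.1f}%")  # close to +40% fps per watt
```

Losing a single-digit percentage of performance for a third less power is why the efficiency sweet spot sits well below stock power limits.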
 
Tells me AMD screwed up the release driver?

I remember finewine(tm) also used to mean AMD cards performing better in yet-unreleased games, thanks to a more advanced architecture.

It also had something to do with not having the budget to release separate gaming and compute chips. Now that they have a larger budget and design RDNA and CDNA separately I imagine this will be less of a thing.

It was in response to post #2, from the forum member who has it stuck in big letters.
Sorry, I thought it was my opening post. It's nice that this forum previews the images, but it can make it hard to read what you're typing. Or at least that's my excuse for any typos!
 
we have to wait an unknown amount of time for AMD to catch up and deliver what the card was supposedly capable of.

How is that a bad thing? It's great for consumers because the price is based on performance at launch. So when AMD fix the poorly optimised drivers after a few years, that's a free performance boost that you didn't originally pay for.

AMD loses out though, because they could have sold the card at a higher price if it had performed better.
 
Power usage of the 12GB 3080 is pretty poor though. TPU had it at nearly 400W. That doesn't matter to some people (although, ironically, in previous generations it was very hyped).

It would be expensive with prices as crazy as they are, but what interests me more is taking a high-end card (GA102 or Navi 21) and clocking it lower to hit the perf/watt sweet spot. If it weren't for the crazy prices, taking a 300W card and running it under 200W while only losing 5-10% performance would appeal to me.


The reason I have the 6800 is the performance per watt. The AMD software makes it really easy to make adjustments, and with Radeon Chill I can set an FPS target for each game, which, if I lower it, really helps with power usage.
 
How is that a bad thing? It's great for consumers because the price is based on performance at launch. So when AMD fix the poorly optimised drivers after a few years, that's a free performance boost that you didn't originally pay for.

AMD loses out though, because they could have sold the card at a higher price if it had performed better.
A few years of some slight improvements as games get harder and harder to run.

So essentially like gambling.

By the time the card was meant to shine, it's already outpaced and defeated by better cards released after it... gotta have that win though!
 
Computerbase have a new article about what performance AMD and Nvidia have been able to get out of their drivers since the release of the 6800XT and the 3080:
https://www.computerbase.de/2022-03/geforce-radeon-treiber-benchmark-test/
(https://www-computerbase-de.transla...uto&_x_tr_tl=en&_x_tr_hl=en-US&_x_tr_pto=wapp)

Some highlights:
sKPYwFP.png
cukf9tN.png
So for average FPS at raster, AMD managed to squeeze out another 10% compared to launch drivers vs. 3% for Nvidia. Min FPS (rasterizer, percentile FPS) shows the same picture.
RT min FPS:
BRsebDa.png
RFYroIC.png
The figures are +15% for AMD and +6% for Nvidia.


Hardware Unboxed already tested this across 50 games; there is no difference overall, no fine wine or advantage for AMD, yet.
 
Isn't this premature? AMD drivers used to take a couple of years to mature like fine wine.
No one will even care that much outside of owners of said cards who are stubborn or can't afford RDNA3 / Lovelace.

AMD RDNA2 is already really crap where it matters right now: ray tracing. It's DOA even if the drivers can gain another 10%.

Screenshot-2022-03-16-at-20-34-25-Nvidia-RTX-4000-GPUs-release-date-price-specs-and-benchmarks.png
 
No one will even care that much outside of owners of said cards who are stubborn or can't afford RDNA3 / Lovelace.

AMD RDNA2 is already really crap where it matters right now: ray tracing. It's DOA even if the drivers can gain another 10%.

Screenshot-2022-03-16-at-20-34-25-Nvidia-RTX-4000-GPUs-release-date-price-specs-and-benchmarks.png

Ironically, your statement is completely against your banner quote.
 
Ironically, your statement is completely against your banner quote.
It is?

Objectively assessing a GPU means placing more worth on material items than on my own self?

Interesting logic.

I guess if you can't beat me in a one-on-one debate, you bring something else up and misunderstand its meaning altogether.

Do you know what balance is?
 
As above, Nvidia get less of an improvement because their drivers are better to start with, so there is less to be gained through driver optimisations.

That's how I read it.
 
As above, Nvidia get less of an improvement because their drivers are better to start with, so there is less to be gained through driver optimisations.

That's how I read it.
They are by far the bigger company for GPUs (80% vs 20% or worse for AMD), and they do polish their drivers more before a release. On the other hand, with an 80% market share they have to compete with themselves for new sales. That doesn't necessarily mean building in planned obsolescence, but it can be a factor. Some of Nvidia's generations mainly aged badly because they skimped on VRAM or bandwidth. This generation, AMD was more generous with 16GB on the 6800 and 12GB on the 6700 (while Nvidia's 8GB on the 3070 and 10GB on the 3080 was far less so), but against that AMD really went low on bus width and VRAM with the 6600 and 6500 cards.

With an 80% market share and more game sponsorships, Nvidia are also more likely to have release-day drivers, despite the consoles. Sponsorship and pushing certain techs have been used and abused to distort the market for years, though.

I have to say that Nvidia's control panel is pretty poor compared to AMD's, though.
 