
** The AMD Navi Thread **

Right off the bat, they have changed the driver default settings:
a) Tessellation mode from the default "AMD optimised" to "Application Default". That alone drops performance on AMD cards, especially in "Shitworks" games, without affecting image quality.
b) Texture Filtering Quality from the default "Standard" to "High".
c) Surface Format Optimization from "ON" to "OFF". That setting greatly improves memory performance and memory usage, with no impact on graphics quality.
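As a rough back-of-the-envelope sketch of why this matters: small per-setting losses compound multiplicatively on frame rate. The deltas below are purely illustrative assumptions (only the ~1-2% texture filtering figure is mentioned later in the thread), not measured values:

```python
# Illustrative sketch: how small per-setting performance deltas compound.
# The individual deltas are assumptions for illustration, NOT measurements.
baseline_fps = 100.0

# Hypothetical fractional fps cost of each changed setting.
setting_deltas = {
    "tessellation: AMD optimised -> Application Default": 0.05,
    "texture filtering: Standard -> High": 0.015,  # thread cites ~1-2%
    "surface format optimization: ON -> OFF": 0.02,
}

fps = baseline_fps
for setting, loss in setting_deltas.items():
    fps *= (1.0 - loss)
    print(f"{setting}: {fps:.1f} fps")

total_loss_pct = (1.0 - fps / baseline_fps) * 100
print(f"combined loss: {total_loss_pct:.1f}%")
```

Even three individually "small" changes add up to a combined hit of around 8% in this toy example, which is more than enough to reorder cards in a review chart.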


In addition:
We used Afterburner to set every GeForce's power and temperature targets to maximum. With the power and temperature limits maxed out, the cards do not throttle and can reach and maintain their individual maximum clocks. This is particularly beneficial for almost all higher-power cards.
....

WattMan caused instability that was also noted by multiple posters on Reddit's r/AMD. Eventually we had to do a clean driver install and left WattMan unopened to maintain stability. We used Afterburner to downclock by 75MHz to simulate a reference RX 5700 XT, but we still had to increase the fan speed to 40% to eliminate the thermal throttling that occurred with the stock fan settings.
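For context on where the -75MHz figure comes from (the board clocks below are from public spec sheets as I recall them, so treat them as assumptions): the 50th Anniversary Edition card has a higher rated boost clock than the reference 5700 XT, and the offset bridges that gap:

```python
# Sketch of the downclock used to approximate a reference card.
# Clock figures are assumptions based on public spec sheets.
ae_boost_mhz = 1980         # 5700 XT 50th Anniversary Edition rated boost
reference_boost_mhz = 1905  # reference 5700 XT rated boost

offset = ae_boost_mhz - reference_boost_mhz
print(f"offset: -{offset} MHz")  # matches the -75 MHz Afterburner offset

pct = offset / ae_boost_mhz * 100
print(f"that is a {pct:.1f}% clock reduction")
```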


DX11 games are also missing.

[Benchmark screenshots attached: Untitled-2-4.jpg, Untitled-3-6.jpg, Untitled-4-6.jpg]
 
Quoting the post above:
"Off the bat, they have done changes to the driver default settings. […]"

I'm not sure why they would change the default settings; I thought a few of those results looked odd.
 
@Phate got a response from EK today about the Pascal aluminium block:
Unfortunately, the layouts seem to be the same, but they are not: the height of the components on the PCB differs from the Pascal build, so you need to be careful when choosing the block. Go only with a compatible one from this list.

Update: the price is just $20 more than the normal Acetal + backplate version and around $12 more than the acrylic one.

https://www.techpowerup.com/257414/ek-launches-ek-vector-special-edition-rx-5700-series-water-blocks
 
Since when were driver "optimisations" acceptable for benchmarks?

They are the default settings, just like Nvidia's default settings.
You cannot leave one vendor at defaults and reduce the settings for the other.

It's like me going into the Nvidia drivers and setting everything to high for no reason. Yet they left the NV drivers at default, except the power mode, which was changed from Balanced to Maximum Performance.
 
Since when were driver "optimisations" acceptable for benchmarks?

Since when wasn't any hardware optimisation acceptable?

All manufacturers optimise their hardware for day-to-day tasks out of the box.

What Nvidia does out of the box is different from what AMD does out of the box; that doesn't make either wrong.

All settings should be left at the defaults the manufacturer sees fit.

That goes for anything you buy.
 
Quoting the post above:
"Since when wasn't any hardware optimisation acceptable? […]"

In the old benchmark threads here it was stipulated that driver optimisations be removed.

They've always been considered "cheating": lowering IQ to give performance gains.
 
Quoting the post above:
"In the old benchmark threads here it was stipulated the driver optimisations were removed. […]"

Yet none of the settings they changed makes any discernible difference to IQ, while they can cause a big performance hit.

It is not as clear-cut as saying "everything should be set to the same value on both".

What counts as high in one application is not necessarily the same as high in another.
 
Quoting the post above:
"In the old benchmark threads here it was stipulated the driver optimisations were removed. […]"

Meanwhile, most of the features Nvidia has been releasing lately do exactly what you're describing: DLSS, adaptive shading.

But when AMD has optimisations in its driver, like Nvidia does, now it's cheating?
Keep in mind these do not affect image quality.
 
The idea is to disable all driver optimisations for BOTH vendors in order to get as apples-to-apples a comparison as possible. Whether it works that way in practice is not clear; after all, we don't know what the drivers really do in the background. So it's hard to say whether it works as intended.

Regardless, I have done tests myself with and without them on AMD GPUs, and the differences were more or less inconsequential. So I have no issue trusting BabelTR's numbers.
 
Quoting the post above:
"The idea is to disable all driver optimisations on BOTH vendors […]"

The only setting that really changes performance is tessellation.
 
Quoting the post above:
"The idea is to disable all driver optimisations on BOTH vendors […]"

You forget the obvious: those are the default settings AMD ships with, while he kept most of the equivalent options OFF on the Nvidia cards.
AMD's tessellation optimisation stops performance tanking (mainly in Gimpworks games, to boot), and Surface Format Optimization improves how the memory handles textures.
At their defaults, neither setting affects image quality. Yet this review changed them, and also raised Texture Filtering Quality to High (which has around a 1-2% impact).

Meanwhile he set almost everything in the Nvidia drivers to OFF, and moved the cards from the default "Balanced" power mode to "Performance". On top of that, in Afterburner he cranked up the power limits, voltages and the thermal throttling cap.
So basically he didn't run the Nvidia cards out of the box, but overclocked (relying on the boost curve). The only thing he did on the 5700 XT was set the fan to a 40% maximum, as the default setting is very low.

And this is obvious from the results. Look at The Division 2 performance at 2560x1440: the best 2070S result among reviews is 80fps (Hexus), and all the others have the 2070S doing between 75 and 78. This review shows 85!
That's with the same drivers and comparable setups (8700K @ 4.8GHz at Hexus and TB, 9900K @ 5GHz at Anandtech and Legitreviews). Want me to start writing up the other games and reviews?

He basically ran overclocked Nvidia cards against the 5700 XT AE (his 5700 XT results are the 5700 XT AE with a -75MHz clock).
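The size of the outlier described above is easy to quantify from the fps figures as quoted in the thread (which review is "best" is the poster's claim, not something verified here):

```python
# Quantify the gap between this review's 2070S result and other reviews'
# results for The Division 2 at 2560x1440 (figures as quoted in the thread).
review_fps = 85
other_reviews_fps = [80, 78, 75]  # Hexus best case, then the 75-78 range

for fps in other_reviews_fps:
    gap_pct = (review_fps - fps) / fps * 100
    print(f"vs {fps} fps: +{gap_pct:.1f}%")
```

A single review sitting 6-13% above every other outlet on the same game, drivers and a comparable CPU is exactly the kind of spread that methodology differences (power limits, driver settings) would produce.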
 