HDR benchmark thread

First off, it's still early days in the PC gaming HDR world. At the same time, reading this I didn't expect there to be a performance penalty for using HDR. AMD seems to be doing a better job at this moment.

Surprising results do not change the conclusion
When planning the article, the editors actually assumed the results would be boring - essentially identical - and that their expectations would simply be confirmed. It turned out to be far more interesting than expected, because in terms of performance it is by no means irrelevant whether HDR is used or not: depending on the graphics card, some games lose quite a bit of performance.

The performance loss differs from game to game
With HDR, a distinction currently has to be made between an AMD and an Nvidia graphics card. The Radeon RX Vega 64 used in the test actually loses no performance in most titles when switching from SDR to HDR. The worst cases are Destiny 2, with just under ten percent fewer FPS, and Far Cry 5, with a five percent loss. On average, the use of HDR on the Radeon is not noticeable at a two percent loss.

On an Nvidia graphics card, the performance loss is currently much higher. The GeForce GTX 1080 loses an average of 10 percent of its frames per second with High Dynamic Range, and in Destiny 2 as much as 20 percent. Unlike on an AMD accelerator, the loss of speed on an Nvidia graphics card is therefore quite noticeable. It also shifts the performance picture: while the two graphics cards are tied under SDR, AMD is ahead under HDR.

The reason for the behavior is unclear
Why Nvidia GPUs have to contend with greater losses than the AMD graphics cards is currently unclear - even Nvidia could not give the editors a reason. Nevertheless, the recommendation is to use HDR with a suitable monitor. Not many games support HDR yet, but most of them look much better with it. The difference is bigger than one or two higher detail levels, so you can reduce those and still end up with a better-looking game. However, you may need to reduce more details on a GeForce than on a Radeon.






https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/
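For a rough sanity check of the percentages quoted above, here is a minimal sketch of how the SDR-to-HDR loss works out; the FPS figures in it are made-up placeholders chosen to mirror the article's rough percentages, not ComputerBase's raw data:

[code]
# Relative FPS loss when switching from SDR to HDR.
# The sample numbers below are hypothetical, picked only to mirror the
# rough percentages in the article (not measured values).

def hdr_loss_percent(sdr_fps: float, hdr_fps: float) -> float:
    """Return the FPS loss in percent relative to the SDR result."""
    return (sdr_fps - hdr_fps) / sdr_fps * 100.0

samples = {
    "GTX 1080 (placeholder)":   (100.0, 80.0),  # roughly the 20% Destiny 2 case
    "RX Vega 64 (placeholder)": (100.0, 98.0),  # roughly the 2% Radeon average
}

for card, (sdr, hdr) in samples.items():
    print(f"{card}: {hdr_loss_percent(sdr, hdr):.1f}% slower with HDR")
[/code]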
 
Results from over a year ago (480 vs 1060) show similar behaviour. Why does Nvidia have such a performance hit from HDR?

[GameStar benchmark chart: RX 480 vs GTX 1060, SDR vs HDR]


https://www.gamestar.de/artikel/was...dr-auf-dem-pc-ausprobiert,3271227,seite2.html
 
Those Nvidia results are surprising to say the least. I am playing Destiny 2 right now on a Samsung HDR monitor with Freesync using a Vega 56 Strix at 1440p.

I noticed only a 6 fps loss on average at 1440p and 2-3 fps at 1080p using HDR, standing in the same spot, though it varies elsewhere. I did turn Freesync off to test this. This game is beautiful in HDR, and even though this monitor isn't the best for HDR, the peak brightness highlights can apparently hit 600 nits.

Don't know if that's accurate, but it gives me a headache and makes me squint during bright sparks etc.
 
Interesting results.

I won’t lie though, when I saw it was shankly who posted it, I knew it would be because it shows Nvidia not doing as well as AMD. Just like when D.P. makes a post you know chances are it will be pro nvidia or defending nvidia :p:D
 
Surprising results do not change the conclusion
When planning the article, the editors actually assumed the results would be boring - essentially identical - and that their expectations would simply be confirmed. It turned out to be far more interesting than expected, because in terms of performance it is by no means irrelevant whether HDR is used or not: depending on the graphics card, some games lose quite a bit of performance.

The performance loss differs from game to game
With HDR, a distinction currently has to be made between an AMD and an Nvidia graphics card. The Radeon RX Vega 64 used in the test actually loses no performance in most titles when switching from SDR to HDR. The worst cases are Destiny 2, with just under ten percent fewer FPS, and Far Cry 5, with a five percent loss. On average, the use of HDR on the Radeon is not noticeable at a two percent loss.

On an Nvidia graphics card, the performance loss is currently much higher. The GeForce GTX 1080 loses an average of 10 percent of its frames per second with High Dynamic Range, and in Destiny 2 as much as 20 percent. Unlike on an AMD accelerator, the loss of speed on an Nvidia graphics card is therefore quite noticeable. It also shifts the performance picture: while the two graphics cards are tied under SDR, AMD is ahead under HDR.

The reason for the behavior is unclear
Why Nvidia GPUs have to contend with greater losses than the AMD graphics cards is currently unclear - even Nvidia could not give the editors a reason. Nevertheless, the recommendation is to use HDR with a suitable monitor. Not many games support HDR yet, but most of them look much better with it. The difference is bigger than one or two higher detail levels, so you can reduce those and still end up with a better-looking game. However, you may need to reduce more details on a GeForce than on a Radeon.

https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/

I am not surprised at all. Quite normal and expected results.
What is surprising is where the Vega 64 loses all that gaming performance compared to the GTX 1080 Ti.
I would ask AMD to tell us the reason for that.
 
Interesting results.

I won’t lie though, when I saw it was shankly who posted it, I knew it would be because it shows Nvidia not doing as well as AMD. Just like when D.P. makes a post you know chances are it will be pro nvidia or defending nvidia :p:D

Are you suggesting some may be expressing views from a less than objective standpoint? That sounds a bit far fetched to me, that wouldn't happen here surely.
 
NP
----------------
Something to think about.

How abundant are Gsync monitors with HDR? Before you answer, read below:
IMHO, Nvidia ("R&D") knew of the potential performance issues with the use of HDR. Right now, monitors that support HDR aren't close to being at "full spec", and we won't likely see TRUE HDR panels from nearly all prospective manufacturers in the PC monitor space until late 2019. I'm willing to assume that Turing (if Nvidia actually does release a next-gen GPU and not a refresh) will fully alleviate this bottleneck in its GeForce "tech" by then.

If Turing is a true next-gen part, then Nvidia's successor to Gsync will be a next-gen part as well; the two will go hand in hand. Call it what you will: Gsync 2.0, Gsync 2, Gsync HDR. Whatever they call it, it will render current Gsync monitors obsolete. If you recently paid a premium for a Gsync monitor (and let's face it, you did), your current Gsync monitor has now been depreciated through forced obsolescence. It will no longer hold the value for the premium you paid for it.

Once you get the new, next gen Turing card you will also need to upgrade your monitor for Gsync 2.0 all over again.

This, if it turns out to be true, would make people run to consoles IMO. Sad
 
I am not surprised at all. Quite normal and expected results.
What is surprising is where the Vega 64 loses all that gaming performance compared to the GTX 1080 Ti.
I would ask AMD to tell us the reason for that.

That's something scratching my head also.
Looking at benchmarks and comparing my Vega 64 at 1724-1742 MHz (air-cooled) with my previous results from a GTX 1080 @ 2190 MHz (water-cooled): in half the graphical tests of a benchmark (like Time Spy or Fire Strike) the Vega is the same if not ~1% faster, while in the other half of the tests in the same benchmark there is a 10-11% drop in FPS. The CPU is of the same power across all benchmarks, ~1% faster than the CPU I had in 2016
(8600K @ 5 GHz with 4000 DDR4 RAM vs 4930K @ 4.4 GHz with 1066 DDR3 RAM).

There must be something in the drivers that AMD hasn't yet noticed is missing, maybe because nobody has pointed such discrepancies out to them. (I notified Matt about it a couple of weeks ago, and he found it peculiar too, so I'm passing my detailed results on to him.)
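If anyone wants to do the same comparison on their own numbers, a throwaway script like the one below makes the per-test differences easy to see; the sub-test names and FPS values in it are purely hypothetical, not my actual results:

[code]
# Compare per-sub-test results between two cards/runs and flag regressions.
# All names and numbers here are placeholders for illustration only.

gtx1080_fps = {"Graphics test 1": 70.0, "Graphics test 2": 62.0}
vega64_fps  = {"Graphics test 1": 70.5, "Graphics test 2": 55.5}

THRESHOLD = 5.0  # flag anything more than 5% slower than the baseline

for test, baseline in gtx1080_fps.items():
    delta = (vega64_fps[test] - baseline) / baseline * 100.0
    flag = "  <-- regression" if delta < -THRESHOLD else ""
    print(f"{test}: {delta:+.1f}% vs GTX 1080{flag}")
[/code]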
 
Is there an HDR solution yet that doesn't depend on "zones" or similar functionality, which becomes highly obvious after a while? I've been highly unimpressed by the demos I've seen so far: they look good superficially, but the illusion quickly breaks down once you start looking more discerningly.
 
Because of the very nature of high dynamic range, I think it somehow loads either the shaders or parts of the back-end of the GPU more. So a slight performance regression is normal until there is a dedicated unit to accelerate these calculations.
 
That's something scratching my head also.
Looking at benchmarks and comparing my Vega 64 at 1724-1742 MHz (air-cooled) with my previous results from a GTX 1080 @ 2190 MHz (water-cooled): in half the graphical tests of a benchmark (like Time Spy or Fire Strike) the Vega is the same if not ~1% faster, while in the other half of the tests in the same benchmark there is a 10-11% drop in FPS. The CPU is of the same power across all benchmarks, ~1% faster than the CPU I had in 2016
(8600K @ 5 GHz with 4000 DDR4 RAM vs 4930K @ 4.4 GHz with 1066 DDR3 RAM).

There must be something in the drivers that AMD hasn't yet noticed is missing, maybe because nobody has pointed such discrepancies out to them. (I notified Matt about it a couple of weeks ago, and he found it peculiar too, so I'm passing my detailed results on to him.)
Some astute yet persistent individuals on another forum found that updating Intel's RST fixed a performance-related issue in which the game streams data through the storage device (be it a mechanical HDD, SSD or M.2 drive). Even though some had an "older version" or a "newer version" from their respective motherboard manufacturer, what fixed it was using this version:
https://downloadcenter.intel.com/do...l-RST-User-Interface-and-Driver?product=35125

I want to add that there may still be another component here: a firmware update for your SSD/M.2, and possibly a chipset update as well, if your motherboard manufacturer provides them.


In any case, may I suggest that you try this RST update, reboot and run your tests again? Make sure that it is actually compatible with your chipset first.
 
Some astute yet persistent individuals on another forum found that updating Intel's RST fixed a performance-related issue in which the game streams data through the storage device (be it a mechanical HDD, SSD or M.2 drive). Even though some had an "older version" or a "newer version" from their respective motherboard manufacturer, what fixed it was using this version:
https://downloadcenter.intel.com/do...l-RST-User-Interface-and-Driver?product=35125

I want to add that there may still be another component here: a firmware update for your SSD/M.2, and possibly a chipset update as well, if your motherboard manufacturer provides them.


In any case, may I suggest that you try this RST update, reboot and run your tests again? Make sure that it is actually compatible with your chipset first.

I have the latest version of everything :) Including RST.
but thnx :)
 
I have the latest version of everything :) Including RST.
but thnx :)
About that... Intel doesn't want you using the latest version of RST, although they go about it in a roundabout way. That's what sparked the conversation.
The RST version being used to fix the issue is:
Version: 16.0.2.1086 Date: 2/21/2018

While the latest version of RST is:
Version: 16.5.1.1030 Date: 7/19/2018 https://downloadcenter.intel.com/do...l-RST-User-Interface-and-Driver?product=35125

However, for Version: 16.5.1.1030, Intel's website has a note which links you back to 16.0.2.1086.
:)
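If you want a quick way to check whether the build you have installed is older or newer than the 16.0.2.1086 build mentioned above, a small sketch like this does the comparison; the installed-version string is assumed to be copied in by hand from Device Manager or the RST UI, not read from the system:

[code]
# Compare an installed Intel RST version string against the build said to fix
# the streaming issue. The "installed" value below is an example, not a query
# of the local machine.

def parse_version(v: str) -> tuple:
    return tuple(int(part) for part in v.split("."))

FIX_VERSION = "16.0.2.1086"   # build reported to fix the issue
installed = "16.5.1.1030"     # example: the newer build Intel currently lists

if parse_version(installed) > parse_version(FIX_VERSION):
    print(f"{installed} is newer than {FIX_VERSION}")
elif parse_version(installed) == parse_version(FIX_VERSION):
    print(f"{installed} matches the recommended build")
else:
    print(f"{installed} is older than {FIX_VERSION}")
[/code]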
 
About that... Intel doesn't want you using the latest version of RST, although they go about it in a roundabout way. That's what sparked the conversation.
The RST version being used to fix the issue is:
Version: 16.0.2.1086 Date: 2/21/2018

While the latest version of RST is:
Version: 16.5.1.1030 Date: 7/19/2018 https://downloadcenter.intel.com/do...l-RST-User-Interface-and-Driver?product=35125

However, for Version: 16.5.1.1030, Intel's website has a note which links you back to 16.0.2.1086.
:)
Interesting. I used your initial link to repair & update. Let's see how it goes. I will let you know :)
 