
What do gamers actually think about Ray-Tracing?

Is the recording an mp4 file or something else on Radeon?

I have not used Windows Auto HDR so I'm unsure how it outputs the image. Is it some internal luminance boosting, or is it actually generating a 10-bit output? Because that's what proper HDR is doing: recorded videos should be 10-bit colour depth, which contains the extra data for the HDR range, as well as the associated metadata to tell whatever player is used to watch the video that this is HDR and to use that extra data to output as HDR.

It could be that Auto HDR doesn't do this, in which case it's for live viewing only and not recording, as it's not passing through the metadata and/or 10-bit colour.

You'll know which it is if you play the recorded video back locally in something like MPC-HC/MPC-BE with the MPC Video Renderer enabled, as that auto-triggers Windows HDR mode when playing HDR videos; viewing the codec properties of the video will show whether it has 10-bit colour.
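If you'd rather check from a script than a media player, here's a minimal sketch (assuming ffprobe from the ffmpeg package is on your PATH; the file name is just a placeholder) that prints the pixel format and colour tags of the first video stream. For a proper HDR recording you'd expect a 10-bit pixel format such as yuv420p10le, bt2020 primaries and the smpte2084 (PQ) transfer:

    import json
    import subprocess

    video = "recording.mp4"  # placeholder - point this at your own capture

    # Ask ffprobe for the first video stream's pixel format and colour metadata
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt,color_primaries,color_transfer,color_space",
         "-of", "json", video],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(result.stdout)["streams"][0]

    # 8-bit SDR captures typically report yuv420p with no BT.2020/PQ tags
    for key in ("pix_fmt", "color_primaries", "color_transfer", "color_space"):
        print(f"{key}: {stream.get(key, 'not tagged')}")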

YouTube can sometimes take days to process the HDR version of an upload too, assuming all the specs are met (10-bit and metadata). If you transcode using Handbrake, then you also have to select an encoder that supports 10-bit, as shown in the dropdown, and make sure the metadata passthrough checkbox is ticked.
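If you transcode outside Handbrake, a rough ffmpeg equivalent might look like the sketch below (file names, CRF and preset are placeholders, not a definitive recipe): it re-encodes to 10-bit HEVC and tags the stream as BT.2020/PQ so players and YouTube treat it as HDR rather than SDR:

    import subprocess

    src = "capture.mp4"        # placeholder input recording
    dst = "capture_hdr10.mp4"  # placeholder output

    # Re-encode to 10-bit HEVC with BT.2020 primaries and the PQ transfer,
    # so the output is flagged as HDR instead of falling back to SDR
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "libx265", "-preset", "slow", "-crf", "18",
         "-pix_fmt", "yuv420p10le",
         "-x265-params",
         "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc",
         "-c:a", "copy",  # leave the audio untouched
         dst],
        check=True,
    )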

None of your videos above show the HDR option on YouTube. I uploaded 2 videos in the last 24 hours, for example, and only one of them:


Shows HDR as one of the quality options; the other may well be ready tomorrow, even though 4K 60fps is available for both. It may even take days to be ready, such is the lottery of YouTube's processing queues for HDR. Thankfully, uploaded HDR videos are internally tonemapped by YouTube to SDR, so whilst the HDR processes, the tonemapped SDR version should still be excellent, just in SDR. The tonemapping will only be done correctly if the uploaded file meets all the specs of a source HDR video as well, otherwise you're just going to get a dulled video like you see in the local file playback.

Edit*

Just seen this from IGN Live, related to the thread topic no less:

 
@mrk that's a lot of good info. It's helped me figure it out.

It's .mp4

None of my recordings are 10-bit: not in AV1, not AVC, not HEVC, it's all 8-bit. I have even turned off Auto HDR and turned on HDR in Metro, which to my surprise produced a slightly better picture than Auto HDR; usually the in-game HDR is pap to my eye, but Metro looks good.
However, even doing that, the Radeon software still recorded in 8-bit.

So basically, if I'm rendering a 10-bit image, the AMD app just records it in 8-bit, which is not HDR. My screen is 10-bit; quite a few sub-£500 screens that are advertised as HDR are actually 8-bit, so you have to be careful and read the full data sheet. I don't know how they get away with calling an 8-bit screen HDR; they don't even bother to call it "HDR Ready". Remember that?
My screen is not high in nits but it does display a proper HDR image, and it is very different to SDR. I do wonder what these 8-bit 'HDR' screens actually look like... 8-bit vs 10-bit is the difference between 16.7 million colours and 1.07 billion colours. Or 1,070 million for a more contextual comparison.
What I would say to people who say "HDR is meh": oh? What's your bit depth? :D
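(Those figures are just 2 raised to the power of bits-per-channel times three channels; a quick sanity check:)

    # Total displayable colours = 2 ** (bits per channel * 3 channels)
    for bits in (8, 10):
        print(f"{bits}-bit: {2 ** (bits * 3):,} colours")
    # 8-bit:  16,777,216 colours   (~16.7 million)
    # 10-bit: 1,073,741,824 colours (~1.07 billion)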

Anyway... so that's disappointing, AMD, or... and I'm 90% clutching at straws here, maybe it's something with my Windows? Your HDR video is not being displayed in HDR on my desktop. It looks, well... like the second linked image below, and it looks exactly the same played back in MPC-BE with the MPC Video Renderer enabled. (Yes, I downloaded the video.) Looks like a fun game BTW.

I need more info. I wonder if I'll get an answer on r/amd or their Discord.

Your downloaded video file.

RQIxY0v.png


How it looks on my desktop through YouTube. *Vomits*

sD4VYGA.png


My Screen.

jhxOVsk.png
 
That could have something to do with it, although even a monitor that isn't strictly certified for HDR can display HDR content, even if it's at, say, 300 nits. I have a 180Hz Xiaomi 34" ultrawide here too that is just like that: it uses 8-bit+FRC to display a higher colour count, which is normally fine, it just won't be a certified output, so there's no guarantee that things will look good or be compliant.

The key thing to look for in the recorded videos, though, is that it's 10-bit, contains the BT.2020 colour profile (also used by HDR) and the file is in a container that supports it, such as .mp4.

The rest will be down to the recording tool, AMD's tool or the Nvidia App etc. Sounds like the initial issue on yours was with Windows Auto HDR; Windows HDR mode needs to be enabled along with the game's HDR mode in order for the tool to recognise HDR is in effect.
 
for everyone's sake, don't use auto hdr in windows ever if you value quality. It's bad on many levels.

If a game has a good hdr implementation, use native hdr. If not, use nvidia hdr.

As for hdr impact itself, it's very much a game of you get what you pay for. TVs like my s90c modded to 1900 nits are miles ahead of monitors (like my aw3423dw), and QD-OLED monitors are miles ahead of LCD-based monitors.
 
The HDR feature on my monitor just makes the screen really white and washed out. From my limited understanding when I was looking at cheap monitors, to get decent HDR you need a monitor with good brightness levels and many dimming zones.

My Acer monitor is HDR400 with no dimming zones, but for the £150 I paid, for 1440p/VRR/170Hz, it's great compared to my old 2014 Dell.
 
Yeah, for a proper HDR experience you really do want an OLED display, whether monitor or TV. The higher the nits, the greater the experience. The AW3423DW in its HDR Peak 1000 mode is a thing to behold for HDR gaming. Also, you must use the Windows HDR Calibration tool to profile the display first, otherwise Windows HDR won't know how much luminance to send to the display and things will look wrong.
 
Yeah, for a proper HDR experience you really do want an OLED display, whether monitor or TV. The higher the nits, the greater the experience. The AW3423DW in its HDR Peak 1000 mode is a thing to behold for HDR gaming. Also, you must use the Windows HDR Calibration tool to profile the display first, otherwise Windows HDR won't know how much luminance to send to the display and things will look wrong.
I've just discovered this HDR setting in Windows lol! :cry: And yes I agree, MS also tell you to run the HDR Calibration tool. I did, and it helped to rein in the brightness/luminance, as the first thing I thought when turning it on was that it was far too bright. Now this is enabled in Windows, do you have to turn it on in games as well? I think that is what you were also saying.
 
for everyone's sake, don't use auto hdr in windows ever if you value quality. It's bad on many levels.

If a game has a good hdr implementation, use native hdr. If not, use nvidia hdr.

As for hdr impact itself, it's very much a game of you get what you pay for. TVs like my s90c modded to 1900 nits are miles ahead of monitors (like my aw3423dw), and QD-OLED monitors are miles ahead of LCD-based monitors.

Nvidia hdr does not work in every game, unfortunately. Soulcalibur 6, for example. And that game looks so much better with auto hdr on than off.
 
I've just discovered this HDR setting in Windows lol! :cry: And yes I agree, MS also tell you to run the HDR Calibration tool. I did, and it helped to rein in the brightness/luminance, as the first thing I thought when turning it on was that it was far too bright. Now this is enabled in Windows, do you have to turn it on in games as well? I think that is what you were also saying.
So here's the thing: Windows is still crap at HDR when SDR content is being displayed, which is basically 99% of everything outside of watching an HDR video or playing an HDR game. So the best way to go about this is to only enable HDR mode in Windows when you are about to play an HDR game. If you are about to watch an HDR movie, use a player that does the auto-enabling for you, like MPC-BE with the built-in MPC Video Renderer; it will drop Windows back to SDR mode when the movie finishes.

To automate HDR mode for games you can use a free tool that I keep forgetting the name of but which has been posted on the forums a bunch of times, or just press WINKEY+ALT+B, which uses the Xbox Game Bar to toggle the feature via a keyboard shortcut, if you find right-clicking the desktop > Display settings and toggling HDR a bit of a faff.

And yes, you still need to enable HDR in the game you are playing once Windows HDR mode is enabled, otherwise the game has no idea you are on an HDR display. The game will remember this setting the next time you play, as long as Windows HDR is on before the game loads. The automation app will switch back to SDR when the game is exited; otherwise you will need to toggle back with WINKEY+ALT+B manually after you exit the game.
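If you'd rather script it than hunt down that tool, here's a minimal sketch of the idea (assumptions: Python with the third-party pyautogui package installed, and a hypothetical game path). It just fires the Win+Alt+B shortcut before and after the game runs; whether the Game Bar accepts synthetic input can vary, so treat it as a starting point:

    import subprocess
    import time

    import pyautogui  # third-party: pip install pyautogui

    GAME = r"C:\Games\MyGame\game.exe"  # hypothetical path - use your own

    def toggle_hdr():
        # Send the Xbox Game Bar HDR shortcut (Win+Alt+B) as synthetic input
        pyautogui.hotkey("win", "alt", "b")

    toggle_hdr()              # HDR on for the game
    time.sleep(2)             # give Windows a moment to switch modes
    subprocess.run([GAME])    # blocks until the game exits
    toggle_hdr()              # back to SDR for the desktop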

Maybe in Windows 12 MS will sort out HDR so that it can be left on full-time like on macOS, with SDR content displayed correctly in terms of colour tones and luminance, but who knows.
 
So here's the thing: Windows is still crap at HDR when SDR content is being displayed, which is basically 99% of everything outside of watching an HDR video or playing an HDR game. So the best way to go about this is to only enable HDR mode in Windows when you are about to play an HDR game. If you are about to watch an HDR movie, use a player that does the auto-enabling for you, like MPC-BE with the built-in MPC Video Renderer; it will drop Windows back to SDR mode when the movie finishes.

To automate HDR mode for games you can use a free tool that I keep forgetting the name of but which has been posted on the forums a bunch of times, or just press WINKEY+ALT+B, which uses the Xbox Game Bar to toggle the feature via a keyboard shortcut, if you find right-clicking the desktop > Display settings and toggling HDR a bit of a faff.

And yes, you still need to enable HDR in the game you are playing once Windows HDR mode is enabled, otherwise the game has no idea you are on an HDR display. The game will remember this setting the next time you play, as long as Windows HDR is on before the game loads. The automation app will switch back to SDR when the game is exited; otherwise you will need to toggle back with WINKEY+ALT+B manually after you exit the game.

Maybe in Windows 12 MS will sort out HDR so that it can be left on full-time like on macOS, with SDR content displayed correctly in terms of colour tones and luminance, but who knows.
In Windows 11 you can adjust the brightness of SDR content when HDR is enabled.
It looks OK to me colour-wise, but I'm seeing some banding which is not visible in SDR mode.
 
Yeah, the SDR slider is there, but as above, Windows is not good at displaying SDR content when HDR mode is enabled, unlike macOS, which is basically perfect. Hence you should not leave HDR mode enabled, and should only use it when you are playing an HDR game.
 
Do HDR calibration in Windows. This is to set the tone mapping for HDR applications.

Keep the display in SDR for all SDR viewing.

Use Win+Alt+B to enable HDR manually before viewing HDR content.

When done, use Win+Alt+B to switch back to SDR.

That's the only process if you want SDR and HDR accuracy. Otherwise it's subjective preference, not worth discussing.
 
It's absolutely relevant. Competition is good and they aren't providing enough of that.
Who forces them? Even in the best times, when AMD had clearly better GPUs, people preferred to buy NVIDIA. So people got exactly what they worked so hard for, for so many years. :)

Not all of us are brain-dead fanboys of one side or the other; some of us want competing companies so that we have more choice.
People often say so, but then they do as mentioned above, and the results we all see very clearly. There's not enough monies in it for AMD to try harder here, so they don't.
 
The way I see this going is that if AMD can get hold of the burgeoning handheld PC market and work their graphics cards up from there, then I can see them making money and gaining a bit of market share. If I was in AMD's position I'd forget the high end; it's clearly high margin for Nvidia, but the numbers are really in the 4070 series/7800XT/7900GRE and under market. AMD might be a bit more power hungry at the high end, but they are also quite performant and power-frugal at the lower end of the scale.
 
for everyone's sake, don't use auto hdr in windows ever if you value quality. It's bad on many levels.

If a game has a good hdr implementation, use native hdr. If not, use nvidia hdr.
Before RTX HDR, AutoHDR was good enough after calibration on my OLED. In quite a few games I directly compared them (again, not on stock settings) and the difference isn't that great. Neither is even close to proper native HDR in games, either. Yes, RTX HDR is a bit better than AutoHDR, but AutoHDR works in a few games where NVIDIA's doesn't, it's very usable, and unlike NVIDIA's it has no FPS hit (RTX HDR easily eats 10%+ of my FPS on a 4090).
 
The way I see this going is that if AMD can get hold of the burgeoning handheld PC market and work their graphics cards up from there, then I can see them making money and gaining a bit of market share.
Handhelds are a different market than PC or traditional consoles, though. And NVIDIA is already king there, as they're in every Switch, which all AMD-based handhelds combined will not outsell. The coming Switch 2 is also NVIDIA. I've tried to comb the internet for some numbers, but it seems all of the AMD-based handhelds have sales figures in, at most, the tens of thousands combined. The Switch is close to 120 million sold so far, and I expect the Switch 2 to also sell very well. AMD will make much more monies on laptops and consoles, most likely.

If I was in AMD's position I'd forget the high end; it's clearly high margin for Nvidia, but the numbers are really in the 4070 series/7800XT/7900GRE and under market. AMD might be a bit more power hungry at the high end, but they are also quite performant and power-frugal at the lower end of the scale.
This is what they're about to do: focus on mid-range GPUs with the coming generation, according to all the rumours I've seen so far. And it wouldn't be the first time. They also seem to be switching to annual releases, same as NVIDIA.
 