
**THE NVIDIA DRIVERS THREAD**

Looks like gamingtech (HDR enthusiast) somewhat rates the RTX HDR feature.


If Nvidia can get this to work pretty well in games, they'll have another must-have feature imo.
 
Meh.

Being able to turn that on could probably solve the noise issues.

And it would increase performance even further :)

If you look closely at that side-by-side comparison (around the 1:15 mark), there appears to be a fairly vast downgrade in the quality of the ray/path tracing: bounce lighting and shadow edges have a lot of artifacts not present in the older footage. This is likely where the boost in performance is coming from.

:o

Hopefully that's just due to ray reconstruction being turned off because it's incompatible with that driver and/or Windows build; with it back on, both performance and IQ would actually improve much more.
 
DLSS Performance at 4K is roughly the equivalent of Quality at 1440p, so it makes no difference there and maintains better 4K fps as a result.

This is normal practice when upscaling.
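As a rough sketch of the internal render resolutions involved (using the commonly cited DLSS scale factors; actual per-game values can differ, so treat these as approximations):

```python
# Commonly cited DLSS render-scale factors per quality mode (approximate;
# games can override these).
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width, height, mode):
    """Internal render resolution before DLSS upscales to the output res."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_res(2560, 1440, "Quality"))      # (1707, 960)
```

Note that 4K Performance (1920x1080 internal) actually renders slightly more pixels than 1440p Quality (roughly 1707x960), which supports the "equivalent or better" claim above.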
 

Yeah. But I would much rather 4K DLSS at Performance than 1440p DLSS at Quality.
 

I would say that's purely down to the individual game. Not all games load their highest-quality textures at 1440p, whereas they do at 4K, regardless of upscaling.

As a user of both, I can say there is little visual difference between the two in the games I've played and still play, and I've posted side-by-side comparisons of both in various threads; nobody has been able to point out noticeable differences in them. The min fps in the game bench shown will vary: even a single frame hitch dropping to 30fps will register as the min fps, rather than the one-off hitch being ignored, and it may not even be noticed depending on when it happens (e.g. right at the start of the bench run). It's the average fps to pay attention to.
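The min-fps caveat can be sketched numerically. With hypothetical frame-time data (a steady ~60 fps run with one hitch), a single spike sets the min fps outright, while the average and a "1% low" style metric barely move:

```python
# Hypothetical frame-time trace (ms): ~1000 frames at a steady ~60 fps
# with a single hitch to ~30 fps, as described above.
frame_times_ms = [16.7] * 999 + [33.3]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = sum(fps_per_frame) / len(fps_per_frame)
min_fps = min(fps_per_frame)

# "1% low": average of the slowest 1% of frames. One hitch barely moves
# it, whereas that same hitch defines min fps entirely.
slowest = sorted(fps_per_frame)[: len(fps_per_frame) // 100]
low_1pct = sum(slowest) / len(slowest)

print(f"avg: {avg_fps:.1f}, min: {min_fps:.1f}, 1% low: {low_1pct:.1f}")
```

This is why benchmark tools tend to report 1% and 0.1% lows rather than a raw minimum.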

5160x2160 DLSS-P:

3440x1440 DLSS-Q:

One is higher res, but when scaled to the same physical dimensions, can you actually see any differences between those two? Load both in two tabs, full-screen the browser, and see what differences you spot.

Genuine question. How powerful is the 7700x?
Apparently it's on par with, and similarly priced to, a 12700K, which is what I have, and that CPU isn't bottlenecking a 4090 at 1440p or above in ray-traced/path-traced games, as these are not CPU-bound settings. You'd probably expect to see some difference in the % lows, however, but not a huge deal.

As a comparison, here is my 5160x2160 benchmark in Cyberpunk vs the video posted earlier of the Compusemble bench with a 7700X at 3840x2160, with the same settings otherwise.

Their results:
B0i4RV8.png


My result:
dZvFwWr.png


Edit: updated screengrab from their video; I had the FG-off one on before and didn't realise.
 
My average fps is higher than their results with the beta bodge driver and Windows 11 insider build, and I'm running at above 4K res lol.

So given that my frame consistency appears to be better whilst rendering at a higher-than-4K res, I'd say the 7700X falls a bit behind the 12700K with path tracing etc. all maxed, but both are quite capable CPUs to feed a 4090, as you're still GPU-bound at these settings. I can't run the game at 3840x2160 for a more even comparison as I'm on an ultrawide, but this still gives you an idea.
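For context on how uneven that comparison is, a quick pixel-count check shows the ultrawide-plus resolution pushes about a third more pixels than standard 4K:

```python
# Pixel counts for the two render targets being compared.
ultrawide = 5160 * 2160   # 11,145,600 px
uhd_4k = 3840 * 2160      #  8,294,400 px

ratio = ultrawide / uhd_4k
print(f"5160x2160 renders {ratio:.2f}x the pixels of 3840x2160")
```

The same ratio applies to the internal DLSS render resolutions, since both scale from the output resolution by the same factor.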
Why are you using compressed jpeg images for the comparison?
 
Makes zero difference as an overview; my source screenshots are 100%-quality JPGs, so visually identical to a PNG version, just several MB smaller in file size. I don't know what Imgur uses for compression, but it's perfectly acceptable for a general comparison to determine texture sharpness, LOD etc.
 
Frame gen on for your run?
 
JPEG is lossy; even at 100% quality it will still lose detail, which matters for a comparison, and if Imgur adds its own JPEG compression on top, then detail has been lost twice.
 
As a sidenote, there is definitely something off about this current GRD/Studio driver in terms of display sleep/wakeup. I thought maybe it was the AW3423DW before, but that does not seem to be the case, since it's happened twice today. When Windows sends the display sleep command after 20 mins, the monitor goes to sleep, and it sometimes doesn't wake up as it goes into a deep power-down state; only power cycling the wall plug of the monitor brings it back.

I was working on my work laptop earlier and switched input back to my PC as I use the AW for both, and as soon as the display switched back to the PC, the AW shut down and would not come back on until it was power cycled. Windows made the device disconnected bong when it shut down. I had been using the work laptop for over 20 mins so naturally my PC had sent the command to sleep the AW which it then recognised when I manually switched input back to the PC.

I have disabled "turn off display after x" in Windows, so I will manually sleep the display when I need to until the next NV driver release.
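For anyone wanting to script that workaround rather than click through Settings, the display-off timeout can be zeroed with Windows' built-in powercfg from an elevated PowerShell prompt (0 means never; this only changes the active power plan, so it's easy to revert once a fixed driver ships):

```shell
# Disable the automatic "turn off display after X minutes" timeout
# for the active power plan, on both mains and battery power.
powercfg /change monitor-timeout-ac 0
powercfg /change monitor-timeout-dc 0
```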

Ah, good catch, I actually missed that; updated with the FG-on screengrab from their video. Factoring in the resolution difference and the beta driver fps uplift (whether from an optimised driver, reduced RT quality to boost fps, or otherwise), the fps difference is more obvious, yeah.

It's not a noteworthy or noticeable difference in this context. I've been working with image formats of varying quality all of my professional life.
 

Exactly my experience too, with the same monitor. I had to unplug the monitor at the power strip to sort the issue out, and it only started with this driver.
 