Looks like the denoiser is broken.
"Best HDR gaming monitor there is and you're wasting it"

I'll still play in SDR.
Meh.
Being able to turn that on could probably solve the noise issues.
If you look closely at that side-by-side comparison (look at the 1:15 mark), there appears to be a fairly significant downgrade in the quality of the ray/path tracing, as bounce lighting and shadow edges have a lot of artifacts not present in the older footage. This is likely where the boost in performance is coming from.
I just wonder if the 4090 might be a little bottlenecked?
DLSS Performance at 4K renders internally at 1920x1080, roughly the same as (in fact slightly above) the 1707x960 internal resolution of Quality at 1440p, so it makes no difference there and maintains better 4K fps as a result.
This is normal practice when upscaling.
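To make the internal-resolution point concrete, here is a quick sketch of the arithmetic, assuming the commonly published per-axis DLSS scale factors (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333); the exact rounding is my assumption, not something pulled from any game:

```python
# Approximate per-axis scale factors for each DLSS mode (assumed values).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# 4K Performance vs 1440p Quality: both land around the same internal res.
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_res(2560, 1440, "Quality"))      # (1707, 960)
```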
If you want, I can run some games on my 4090 that you also have, using my 13900K, and we can compare notes? Might be a useful point of reference for you.
Yeah, but I would much rather run 4K DLSS at Performance than 1440p DLSS at Quality.
"Genuine question. How powerful is the 7700x?"

Apparently it's on par with, and similarly priced to, a 12700K, which is what I have, and that CPU isn't bottlenecking a 4090 at 1440p or above in ray-traced/path-traced games, as these settings are not CPU-bound. You'd probably expect to see some difference in the % lows, but not a huge deal.
"Why are you using compressed jpeg images for the comparison?"

I would say that's purely down to the individual game. Not all games load high-quality textures at 1440p, whereas they do at 4K, regardless of upscaling.
As a user of both, I can say that there is little visual difference between the two in the games I've played and still play, and I've posted side-by-side comparisons of both in various threads and nobody has been able to point out noticeable differences in them. The min fps in the game bench shown will vary; even a single frame hitch dropping to 30fps will still register as the min fps, rather than ignoring the one hitch, which may not even be noticed depending on when it happens (like right at the start of the bench run, etc.). It's the average fps to pay attention to.
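A minimal sketch of that single-hitch point, using a made-up synthetic frame-time trace: one 33.3 ms frame in an otherwise steady run drags the absolute minimum to 30fps, while the average (and a percentile-based "1% low") stay near the steady-state figure.

```python
# Toy frame-time trace: ~117 fps steady, with one 33.3 ms hitch frame.
frame_times_ms = [8.5] * 999 + [33.3]

fps_sorted = sorted(1000 / t for t in frame_times_ms)

avg_fps = len(frame_times_ms) * 1000 / sum(frame_times_ms)
min_fps = fps_sorted[0]
# Crude "1% low": the frame rate just past the slowest 1% of frames.
one_percent_low = fps_sorted[len(fps_sorted) // 100]

print(f"avg: {avg_fps:.1f}  min: {min_fps:.1f}  1% low: {one_percent_low:.1f}")
# avg: 117.3  min: 30.0  1% low: 117.6 -- the hitch only dominates min fps.
```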
5160x2160 DLSS-P:
3440x1440 DLSS-Q:
Other than one being higher res, when both are scaled to the same physical dimensions, can you actually see any differences in those two? Like, load both in two tabs, then full-screen the browser and see what differences you spot.
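If you'd rather not eyeball it in browser tabs, here's one way to do the same "scale to the same size and compare" check with Pillow; the filenames are placeholders for the two screenshots above, and both shots share the same ~21.5:9 aspect ratio, so the resize is clean:

```python
# Downscale the 5K2K shot to the 3440x1440 shot's size (mimicking both being
# viewed at the same physical dimensions), then measure the pixel difference.
from PIL import Image, ImageChops, ImageStat

a = Image.open("5160x2160_dlss_p.png").convert("RGB")  # placeholder filename
b = Image.open("3440x1440_dlss_q.png").convert("RGB")  # placeholder filename

a = a.resize(b.size, Image.LANCZOS)
diff = ImageChops.difference(a, b)
print("mean abs per-channel difference:", ImageStat.Stat(diff).mean)
```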
As a comparison, here is my 5160x2160 benchmark in Cyberpunk versus the Compusemble bench video posted earlier, which was run on a 7700X at 3840x2160 with otherwise the same settings.
Their results:
My result:
My average fps is higher than their results with the beta bodge driver and Windows 11 insider build, and I'm running at above 4K res lol.
So given that my frame-rate consistency appears to be better whilst rendering at a higher-than-4K res, I'd say the 7700X falls a bit behind the 12700K with path tracing etc. maxed out, but both are quite capable CPUs to feed a 4090, as you're still GPU-bound on a 4090 at these settings. I can't run the game at 3840x2160 for a more even comparison as I'm on an ultrawide, but this still gives you an idea.
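As a rough way to account for the resolution gap between the two runs, you can scale by internal pixel count. This is strictly a back-of-envelope sketch (GPU-bound fps does not scale perfectly linearly with pixels, and the function name is mine):

```python
# Pixels actually rendered per frame with DLSS Performance (0.5 per axis).
def internal_pixels(width: int, height: int, dlss_scale: float = 0.5) -> int:
    return round(width * dlss_scale) * round(height * dlss_scale)

mine = internal_pixels(5160, 2160)    # ultrawide 5K2K run
theirs = internal_pixels(3840, 2160)  # standard 4K run

# ~34% more pixels per frame at a similar or better fps points to the GPU,
# not the CPU, setting the pace in both runs.
print(f"pixel ratio: {mine / theirs:.2f}")  # 1.34
```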
"JPEG is lossy, even at 100% quality it will still lose detail which is important for a comparison, and if imgur add their own JPEG compression then they have lost detail twice."

Makes zero difference as an overview: my source screenshots are 100%-quality JPGs, so visually identical to a PNG version, just several MB smaller in file size. I don't know what Imgur uses for compression, but it's perfectly acceptable for a general comparison to determine texture sharpness, LOD, etc.
"Frame gen on for your run?"

Ah, good catch, I actually missed that; updated with the FG-on screengrab from their video. Factoring in the resolution difference and the beta-driver fps uplift (whether from an optimised driver, reduced RT quality to boost fps, or otherwise), the fps difference is more obvious, yeah.
"JPEG is lossy, even at 100% quality it will still lose detail which is important for a comparison, and if imgur add their own JPEG compression then they have lost detail twice."

It's not a noteworthy or noticeable difference in this context. I've been working with image formats of varying quality all my professional life.
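For what it's worth, the "lossy even at 100%" claim is easy to demonstrate. A small sketch with Pillow (the image content and size are made up for illustration) that round-trips noisy pixels through JPEG at quality=100 and reports the worst per-pixel error:

```python
from PIL import Image, ImageChops
import io, random

# Build a 64x64 greyscale-noise image; noise is a worst case for JPEG.
random.seed(0)
src = Image.new("RGB", (64, 64))
src.putdata([(random.randrange(256),) * 3 for _ in range(64 * 64)])

buf = io.BytesIO()
src.save(buf, format="JPEG", quality=100)  # highest quality Pillow allows
decoded = Image.open(io.BytesIO(buf.getvalue()))

diff = ImageChops.difference(src, decoded)
# Typically non-zero: quality=100 shrinks the loss but does not remove it.
print("max per-channel error:", max(m for _, m in diff.getextrema()))
```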
As a side note, there is definitely something off about this current GRD/Studio driver in terms of display sleep/wake-up. I thought maybe it was the AW3423DW before, but this does not seem to be the case, since it's happened twice today. When Windows sends the display sleep command after 20 mins, the monitor goes to sleep, and it sometimes doesn't wake up, as it goes into a deep power-down state, and only power-cycling the monitor at the wall plug brings it back.