DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now


Cool.






100GB tech demo.......
 

Cool.






100GB tech demo.......



"NVIDIA also provided an overview of how Neural Rendering can be used, such as RTX Neural Texture Compression, which can create a neural representation of thousands of textures in a very short time, saving up to seven times the VRAM and RAM compared to regular compression tech, and RTX Neural Radiance Cache, which can infer theoretically infinite multi-bounce indirect lighting from just one or two bounces to optimize performance and visuals"

Save vram and storage
Optimized performance
Neural texture shenanigans
Neural radiance what's a does it thingy

Much performance, only 100gb
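
For what it's worth, a back-of-the-envelope sketch of what "saving up to seven times the VRAM" could amount to. Every number here is an assumption picked for illustration (BC7-style block compression at roughly 1 byte per texel, full mip chains, a thousand 4K textures), not anything NVIDIA has published:

```
# Back-of-the-envelope sketch - all figures are illustrative assumptions,
# not NVIDIA's: what "up to 7x VRAM savings vs regular compression" could
# mean for a pile of 4K textures block-compressed at ~1 byte per texel.
BYTES_PER_TEXEL = 1.0        # assumed BC7-style block compression rate
MIP_OVERHEAD = 4 / 3         # a full mip chain adds roughly a third
NEURAL_SAVINGS = 7.0         # the "up to" figure from the quote above

def texture_mib(width, height):
    """Approximate size of one block-compressed texture with mips, in MiB."""
    return width * height * BYTES_PER_TEXEL * MIP_OVERHEAD / 2**20

per_texture = texture_mib(4096, 4096)   # ~21 MiB each
count = 1000
print(f"{count} block-compressed 4K textures: ~{per_texture * count / 1024:.1f} GiB")
print(f"Same set at 7x neural savings:      ~{per_texture * count / NEURAL_SAVINGS / 1024:.1f} GiB")
```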
 
Is it just me, or is the DLSS version in Call of Duty crap? I can clearly see ghosting with birds and people parachuting in. But you can't swap the DLSS version, as it can get flagged as a cheat.
 
This may sound like a daft question but bear with me. When should you use DLSS and when should you not?

My new rig is a 14900KF running at 5GHz with an Asus Prime OC 5080. I have two 4K screens limited to 60Hz; I mostly play games like Assassin's Creed and various MMOs, and I favour quality over performance. I assume there is no benefit to me once the GPU can push more than 60fps...

Do I have any real reason to use DLSS? What about other technologies like frame generation, dynamic vibrance, Reflex, etc?
 
This may sound like a daft question but bear with me. When should you use DLSS and when should you not?

My new rig is a 14900KF running at 5GHz with an Asus Prime OC 5080. I have two 4K screens limited to 60Hz; I mostly play games like Assassin's Creed and various MMOs, and I favour quality over performance. I assume there is no benefit to me once the GPU can push more than 60fps...

Do I have any real reason to use DLSS? What about other technologies like frame generation, dynamic vibrance, Reflex, etc?

It honestly depends on the game: some games visually benefit from DLSS over the native AA implementation, on top of the performance increase.

The only way to know for sure is to experiment.
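
To put rough numbers on the trade-off, here's a quick sketch of the internal resolutions the usual DLSS presets render at before upscaling to a 4K output. The per-axis scale factors are the commonly cited defaults and games can override them, so treat the exact figures as assumptions:

```
# Rough sketch: approximate per-axis render scales commonly cited for the
# DLSS presets (games can override these - treat them as assumptions).
DLSS_SCALES = {
    "DLAA": 1.0,               # native-resolution AA, no upscaling
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, preset):
    """Resolution the GPU actually renders before the upscale to out_w x out_h."""
    scale = DLSS_SCALES[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"{preset:>17}: {w} x {h}")
# Quality at 4K renders roughly 2560x1440 internally, which is why it can look
# close to native while leaving a lot of GPU headroom (or lower power/heat).
```

Even when a game already holds 60fps, Quality (or DLAA at native resolution) can still be worth trying purely for the anti-aliasing, which loops back to the experiment-per-game point above.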
 
Unpopular opinion, there is no reason to show users resolutions in display options any more. Everything should be borderless fullscreen and performance tuned via scaling the internal render resolution e.g DLSS/FSR or some slider. Games shouldn't be able to take exclusive fullscreen control at all and legacy games that do should be mapped into a borderless window. Even to this day, there are still games that make a complete mess of video output with stupid mode switching, broken HDR (*cough* Elden Ring) and full on crashing (Mafia Remastered). The OS should be in control of video output. You'd think exclusive fullscreen going away in DX12 would be the end of this.
 
Unpopular opinion, there is no reason to show users resolutions in display options any more. Everything should be borderless fullscreen and performance tuned via scaling the internal render resolution e.g DLSS/FSR or some slider. Games shouldn't be able to take exclusive fullscreen control at all and legacy games that do should be mapped into a borderless window. Even to this day, there are still games that make a complete mess of video output with stupid mode switching, broken HDR (*cough* Elden Ring) and full on crashing (Mafia Remastered). The OS should be in control of video output. You'd think exclusive fullscreen going away in DX12 would be the end of this.

Taking options away is not a good thing, It's on devs to do better implementations and not outright remove several things.
 
Taking options away is not a good thing, It's on devs to do better implementations and not outright remove several things.
It absolutely is. I don't want game developers having any control over the video mode I have chosen. If my desktop is set to my native 4K 120Hz, I don't want some janky game forcing a 60Hz mode switch or some broken HDR output that requires me to switch off HDR in Windows. The whole job of the OS is to manage this kind of thing and protect from applications ******* things up.. which they do.
 
So if your computer can only generate 25fps at 4K, but you have the OS fixed at 120Hz for basic general use, it sounds like you would be happier running said game at 25fps?

Unless I'm mistaken, I can't follow that logic.
 
So if your computer can only generate 25fps at 4K, but you have the OS fixed at 120Hz for basic general use, it sounds like you would be happier running said game at 25fps?

Unless I'm mistaken, I can't follow that logic.
I'm not sure I follow. If I could only reach 25fps, I'd tune render resolution and settings to hit a rate that divides evenly into 120, e.g. 30fps (assuming I'm not using VRR). Or are you saying that the game should force the nearest evenly divisible mode... which in the case of most 4K TVs would be 50Hz?

I don't disagree that it's sometimes useful to run a lower refresh rate, e.g. using 100Hz when you can't quite maintain 120fps - but that decision should be for the *user* and not the game.

If I look at the EDID for my display, 100/120Hz are considered 'PC' modes and 60Hz is considered a TV mode, i.e. 4K 60Hz is treated as 'UHD'. PC modes have no processing (the TV is basically acting like a big monitor), ergo I always want to use 100 or 120Hz. Some games, e.g. Elden Ring, do not know how to do video output properly and force a 60Hz mode switch on startup (probably because the game is limited to 60fps). Mafia Remastered also forces 60Hz AND locks up if fullscreen is used - pretty impressive given it's an Unreal Engine game and they're not even responsible for video output. Resident Evil 2 & 3, and also Elden Ring, can't do HDR properly: they force a switch to an HDR video mode on startup as a hack, so if you have HDR turned on in Windows (like every other HDR game needs), HDR will be broken and washed out.

TL;DR: I can't think of a single good reason why games should have the ability to switch the video modes of my PC.
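
To put numbers on the "divides evenly" point, here's a quick sketch (purely illustrative) of the frame caps that fit a whole number of refresh cycles at a given refresh rate, i.e. the caps that avoid judder when VRR is off:

```
# Quick sketch: frame caps that divide evenly into a fixed refresh rate, so
# each frame is shown for a whole number of refresh cycles (no VRR assumed).
def even_caps(refresh_hz, min_fps=20):
    return [refresh_hz // n for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(even_caps(120))  # [120, 60, 40, 30, 24, 20]
print(even_caps(60))   # [60, 30, 20]
# 25fps doesn't fit 120Hz evenly, which is why the nearest evenly divisible
# TV mode for a 25fps game would be 50Hz (25 x 2).
```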
 
I'm not sure I follow. If I could only reach 25fps, I'd tune render resolution and settings to hit a rate that divides evenly into 120, e.g. 30fps (assuming I'm not using VRR). Or are you saying that the game should force the nearest evenly divisible mode... which in the case of most 4K TVs would be 50Hz?
My mistake, ignore me - I thought you were saying you would want the game to keep trying to hit 120Hz even if it had no chance of doing so.
 