Nvidia DSR

Just got a new 1440p monitor and have DSR enabled, and wondered if this has an impact on my FPS?

Is it any good, or should I just set my games to native 1440p?

Cheers.
 
Well, at the moment the standard setting is 4K. As I'm gaming at 1440p on a GTX 970, do you think it would be best to turn it off and just render at my native resolution?
 
Personally, I'd run native at higher quality settings. Rendering at 4K will push VRAM usage over the 3.5GB mark in a lot of games, and the GTX 970 doesn't behave well once you use more than 3.5GB of VRAM. I push for very high frame rates instead on my SLI setup.
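For a rough sense of why 4K rendering eats VRAM (a back-of-envelope sketch, not measured numbers): a single 32-bit colour buffer at 4K is 2.25x the size of one at 1440p, and a game keeps many such render targets plus textures, so that multiplier hits total VRAM hard.

```python
# Back-of-envelope render-target sizes at 32 bits per pixel.
# Real games allocate many buffers (G-buffers, shadow maps, post-processing
# targets), so total VRAM scales roughly with pixel count.

def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native_1440p = buffer_mib(2560, 1440)   # ~14.1 MiB
dsr_4k       = buffer_mib(3840, 2160)   # ~31.6 MiB

print(f"1440p buffer: {native_1440p:.1f} MiB")
print(f"4K buffer:    {dsr_4k:.1f} MiB ({dsr_4k / native_1440p:.2f}x)")
```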
 
That's a great help, cheers lads. I'll have a play tonight.
Quick rundown:

With modern games, always just run at your native res to start. So 1440p. With a single GTX 970, that might already not be enough for 60fps. See what your performance is like. Bumping to 4K is a very big jump in the number of pixels being rendered, and is extremely taxing on performance. You can only do it if you've got a *lot* of performance overhead, which you probably won't have in the vast majority of modern games. If you're tolerant of 30fps, it might open up more options for you, but I'd take 60fps over the IQ benefit any day. Even then, many games probably won't even run at a consistent 30fps at 4K with a single 970.

Stick with 1440p for the most part. DSR is only there to give you options to play with; it's not meant to mean you suddenly play every game at some higher resolution. If you've got a lot of performance overhead (you're running well over 60fps) in a specific game, then try bumping up the resolution. It doesn't need to be 4K, either. There are steps in between that might hit the right IQ/performance balance better. For instance, I run Dark Souls 2 at 1800p or thereabouts; 2160p (4K) is just a bit too much and I start to get more frequent frame drops.

Another thing to consider is that not every game scales its UI well with increased resolution. Some games become downright unplayable at resolutions like 4K because the HUD/UI becomes far too small, so you obviously wouldn't want to use DSR in those cases.

Basically, what resolution to use should be chosen on a per-game basis. In some games you'll want to sacrifice some performance for extra IQ; in others you won't; some games just don't play well with 4K at all. And so on.
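The in-between steps mentioned above can be put in rough numbers (the resolutions below are the ones this thread discusses; pixel-count ratio is only a crude first-order proxy for GPU cost, not benchmark data):

```python
# Pixel-count comparison for the resolutions discussed in the thread.
# The ratio to native 1440p gives a rough first-order guess at the
# extra rendering load each DSR step asks of the GPU.

resolutions = {
    "1440p (native)": (2560, 1440),
    "1800p":          (3200, 1800),
    "2160p (4K)":     (3840, 2160),
}

native_pixels = 2560 * 1440

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>15}: {pixels:>9,} px  ({pixels / native_pixels:.2f}x native)")
```

So 1800p asks for roughly 1.56x the pixels of native, while 4K asks for 2.25x, which is why 1800p can be the sweet spot when 4K drops frames.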
 

Another way to look at it is that rendering at 1440p will require less AA to smooth the jaggies. I tend to only play games like Skyrim and Mass Effect. Mass Effect 3 at 1440p with 4x MSAA is stunning, and I'm sure running it at true 1440p would be even better visually.
 

Thanks for taking the time to do that, Seanspeed, that's a great help. I get it now. Great explanation.
 
My native res is 1440p, so it should look great with DSR disabled. :)
 
If you're playing slightly older games, absolutely crank up the resolution. That said, 1440p downsampled to 1080p (which is what I'm guessing you're doing) will not necessarily have much better anti-aliasing than native 1440p. You'll get an image with far more clarity, but the general aliasing improvements from downsampling are a close match for the aliasing improvements you get from a higher-resolution monitor. So you still might want to use 4x MSAA even at native 1440p in Mass Effect 3.

It kinda depends on the game and how bad the aliasing is (or how sensitive you are to it). Even in games that I run at 4K, I still use at least FXAA/SMAA. And seeing how many next-gen engines use deferred rendering, MSAA has become extremely costly, so I've found that when downsampling, FXAA/SMAA is still either necessary or at least very nice to have.
 