Really getting fed up with the posts stating RTX/DLSS doesn't work this gen.

One of the images looks like it has more sharpening applied. It's why unsharp masking works when I develop images from RAW. The "native" image looks unusually soft.
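If it helps, this is roughly what I mean by an unsharp mask - a minimal sketch using Pillow, with file names and strength values made up purely for illustration:

```python
from PIL import Image, ImageFilter

# Unsharp mask: blur a copy of the image, then add the difference back to
# the original so local contrast (and the impression of sharpness) increases.
img = Image.open("screenshot.png")  # hypothetical capture
sharpened = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
sharpened.save("screenshot_sharpened.png")
```

Apply something like that to one of two otherwise identical captures and it will immediately look "crisper" in a side-by-side.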

Yes, we have all been tricked by using FXAA to make native look worse.

If DLSS is good, let it stand on its own; there is no need to try and trick people into liking it by using FXAA on one image and not the other, and doing stupid stuff like lying about which picture is which.

I certainly don't have any sort of agenda here to make DLSS seem worse than it is. The game looks terrible, and it's running at stupidly low resolutions most people gave up on years ago.

That explains it then. It's not the first or last time AMD or Nvidia have done this. For example, PhysX on vs PhysX off, when the latter had normal physics animations removed from some games to emphasise the difference. Then you had earlier games such as Red Faction, which could do insane physics on old CPUs. Same with TressFX and Hairworks: software-based hair animation techniques existed, but look at how flat the hair looks in games promoting such technology without TressFX/Hairworks switched on.
 
One of the images looks like it has more sharpening applied. It's why unsharp masking works when I develop images from RAW. The "native" image looks unusually soft.

That explains it then.
That is why he never tricked anyone: we all said the image looked blurry, which it would, as he applied FXAA to it. Lol. If anything he proved me right, as those were my only words about the image in question :D
 
That is why he never tricked anyone: we all said the image looked blurry, which it would, as he applied FXAA to it. Lol. If anything he proved me right, as those were my only words about the image in question :D

I really never understood the point of FXAA - I assume, as it requires fewer GPU resources, it was developed for consoles? The softness wouldn't be so noticeable on a TV many feet away?
 
Was surprised about the artifacts then.

Explains why detail was missing on things like the federal logo etc.

The image looked better than I expected. The other images you showed looked "challenged", as if not knowing when to apply things like focus at different levels.

I'm beginning to wonder how we know manufacturers are not quietly running these tricks from a lower res normally. Are there graphics tests to check this?
 
We are on lockdown mate, what else are we going to do? :D
True. I suppose it's entertaining to the outsider who doesn't have an allegiance to either manufacturer.

I wouldn't take it that seriously tbh; it's a hobby and, like any hobby, there will be disagreements.

Any hobby in the world will have some form of disagreement.
I don't know about that; it stepped up a notch with the 'trickery' :p
 
I think you guys need to get Unreal Engine 4 out and make your own games, so you can get the exact visuals you want and we can stop arguing.
 
I will actually take back some of my previous comments. Version 2 IS better than version 1. At some point, though, they are going to start conning us by doing upscaling without telling us, making it look like their products are performing better than they are. Maybe version 3 will get closer to optimal.
 
I will actually take back some of my previous comments. Version 2 IS better than version 1. At some point, though, they are going to start conning us by doing upscaling without telling us, making it look like their products are performing better than they are. Maybe version 3 will get closer to optimal.

Consoles have been doing it for years now - they use upscaling and wouldn't mention it at all if it weren't for Digital Foundry's testing. And the same WILL happen on PC eventually, where it's using AI scaling regardless of the resolution you choose.
 
I will actually take back some of my previous comments. Version 2 IS better than version 1. At some point, though, they are going to start conning us by doing upscaling without telling us, making it look like their products are performing better than they are. Maybe version 3 will get closer to optimal.

To be clear, they're still doing some dodgy **** with sharpening & contrast, so the difference is overblown AND in motion it's different, but as I said before, it's an acceptable trade-off for the performance. I mean, you had the case of Control where a 2060 at 4K wouldn't even get above 8 fps, but now it's 36+ (with full RT)? That speaks for itself.

For the upscaling shenanigans, they're kinda already doing that more and more. If you look at games coming out you see a lot more "temporal injection" and the like, and in many cases you can't even choose AA anymore (hello Doom E); it's selected by default as a TAA of sorts with various frame accumulations (at lower res). Most Ubi games will be like this, and in ways they already are (Breakpoint, Odyssey etc). There's a lot of dodgy **** going on and it's gonna be ubiquitous in the next 2-3 years (remember, there's always a lag for games because they take so long to make). And of course it's never actually made explicit, so you can only find out by paying attention and knowing what to look for, or if they do a GDC talk later on.
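To make the "frame accumulation at lower res" bit concrete, here's a toy sketch of the idea - nothing like a shipping TAA/temporal upscaler (no motion-vector reprojection or sample rejection), just the exponential history blend:

```python
import numpy as np

def accumulate(history, low_res_frame, blend=0.1):
    """Blend a new low-res frame into a full-res running history buffer.

    Real temporal techniques reproject `history` with per-pixel motion
    vectors and reject stale samples; this sketch just upscales naively
    and does an exponential moving average to show the accumulation idea.
    """
    scale_y = history.shape[0] // low_res_frame.shape[0]
    scale_x = history.shape[1] // low_res_frame.shape[1]
    # Nearest-neighbour upscale of the low-res frame to history resolution
    upscaled = np.kron(low_res_frame, np.ones((scale_y, scale_x, 1)))
    # Most of the apparent detail comes from previous frames, not this one
    return (1.0 - blend) * history + blend * upscaled
```

Run that over a few dozen jittered frames and the output looks far more detailed than any single low-res frame, which is exactly why nobody advertises the internal resolution.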

DLSS, as cool as it is now, still doesn't mean jack if they only put it in 6 months after release (maybe) and only in random ****** games they manage to convince the devs for. (sorry YB fans, I appreciate all 3 of you)
 
DLSS, as cool as it is now, still doesn't mean jack if they only put it in 6 months after release (maybe) and only in random ****** games they manage to convince the devs for. (sorry YB fans, I appreciate all 3 of you)
The major advantage of the new method is that it only requires minimal dev input and will be implemented through Game Ready drivers. They have also removed some of the constraints, so it can be used with more source and destination resolutions. DLSS 2.0 looks promising and should finally be ready for prime time with the 3000-series GPUs - if, that is, it really is as easy to add support as Nvidia are claiming.
 
The major advantage of the new method is that it only requires minimal dev input and will be implemented through Game Ready drivers. They have also removed some of the constraints, so it can be used with more source and destination resolutions. DLSS 2.0 looks promising and should finally be ready for prime time with the 3000-series GPUs - if, that is, it really is as easy to add support as Nvidia are claiming.

That's exactly right - the latest games that got DLSS 2.0 had it delivered in the latest drivers. No game updates are required; it's all handled by the driver now.
 
The major advantage of the new method is that it only requires minimal dev input and will be implemented through Game Ready drivers. They have also removed some of the constraints, so it can be used with more source and destination resolutions. DLSS 2.0 looks promising and should finally be ready for prime time with the 3000-series GPUs - if, that is, it really is as easy to add support as Nvidia are claiming.

Not true, check out the talk I posted above. The advantage of 2.0 over 1.9 is that they don't have to train the algorithm per game; it's a generalised one instead. They still have to have the devs put the work in, and the game also has to be able to take advantage of DLSS.

Proof is in the pudding, and right now that's still out of stock.
 
I really never understood the point of FXAA - I assume, as it requires fewer GPU resources, it was developed for consoles? The softness wouldn't be so noticeable on a TV many feet away?
FXAA is a relic of another era, when hardware was much less powerful. It was created midway through the PS3/360 era as a way to tease out some more performance from those aging GPUs, then inevitably made its way to PC too. It was actually created by an Nvidia employee named Timothy Lottes, though it was platform-agnostic. Developers loved it at the time as it had essentially zero performance impact, looked better than MLAA and was far more acceptable when most people were gaming at sub-1080p anyway (especially on consoles), as the image was already fairly blurry. To be clear, even back then it was agreed that it looked worse than MSAA and created additional blur, but the trade-off was largely considered worth it for the massive performance gains. Games like Skyrim and Duke Nukem Forever were amongst the first to implement it, to give it its proper place in time. It's just no longer really necessary on modern hardware, with more advanced and performant anti-aliasing techniques such as SMAA and TAA available, but it lingers around anyway because it's cheap and easy to implement.
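For the curious, the core trick is roughly this - a toy numpy sketch of the idea, not Lottes' actual shader (which also estimates edge direction and does sub-pixel blending on the GPU):

```python
import numpy as np

def fxaa_like(rgb, contrast_threshold=0.05):
    """Crude FXAA-style pass: find high-contrast luma edges and blur only
    those pixels with their neighbours, leaving flat areas untouched."""
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    padded = np.pad(luma, 1, mode="edge")
    # Local contrast: range across the centre pixel and its 4 neighbours
    samples = np.stack([
        padded[:-2, 1:-1], padded[2:, 1:-1],   # up, down
        padded[1:-1, :-2], padded[1:-1, 2:],   # left, right
        luma,
    ])
    edge = (samples.max(axis=0) - samples.min(axis=0)) > contrast_threshold
    # Blend edge pixels with their neighbourhood average (the blur people
    # complain about); everything else passes through unchanged
    box = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    avg = (box[:-2, 1:-1] + box[2:, 1:-1] + box[1:-1, :-2] + box[1:-1, 2:] + rgb) / 5.0
    return np.where(edge[..., None], avg, rgb)
```

It costs almost nothing because it's a single post-process pass over the final image, which is also why it can't recover any real detail.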
 

For the upscaling shenanigans, they're kinda already doing that more and more. If you look at games coming out you see a lot more "temporal injection" and the like, and in many cases you can't even choose AA anymore (hello Doom E); it's selected by default as a TAA of sorts with various frame accumulations (at lower res). Most Ubi games will be like this, and in ways they already are (Breakpoint, Odyssey etc).

DLSS, as cool as it is now

Precisely what I mentioned. It's been happening for some time on consoles and now more and more PC games are doing it too.

Almost every new PC game I've played this year only has this as an option:

Anti-Aliasing: High, Medium, Low, Off - where Off usually looks awful if you are playing at under native 4K.

That's it, no choice of type at all - and usually it's TAA High, Medium, Low.

So now both AMD and Nvidia have introduced sharpening tools, but they are annoying to use - and neither provides any performance benefit unless you go the PlayStation 4 route of dropping your internal resolution.

DLSS, however, is easier to use in that it's in the game menus and it's a simple on/off - no messing around with driver configurations (and Nvidia's control panel sucks).
 
As someone who railed on Nvidia about the uselessness of DLSS 1.0, they've done an amazing job with version 2.0.

This, coupled with Ampere's bump in RT processing, could make RT a mainstay going forward. Really well done by Nvidia here.
 
Not true, check out the talk I posted above. The advantage of 2.0 over 1.9 is that they don't have to train the algorithm per game; it's a generalised one instead. They still have to have the devs put the work in, and the game also has to be able to take advantage of DLSS.

Proof is in the pudding, and right now that's still out of stock.
The talk states that integration is not very invasive and is pretty easy to implement. Sure, some development is required so that DLSS is integrated at the correct point in the render pipeline, but this is minimal compared to the previous version. Nvidia claim this is quite similar to how TAA is already handled by developers. This integration work has already been done for Unreal Engine 4. The heavy AI work is then done by Nvidia and delivered through driver releases. So gamers should not need to wait 6 months for a post-release patch from the devs, as you stated.
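Roughly, the integration point described is where a TAA resolve would normally sit - something like this pseudocode sketch (the engine/upscaler calls here are invented for illustration, not the actual NGX API):

```python
def render_frame(scene, engine, upscaler, display_res, render_res):
    """Sketch of where a DLSS-style temporal upscaler slots into a frame.

    The engine renders at a reduced resolution with a per-frame camera
    jitter, hands the colour, depth and motion-vector buffers to the
    upscaler (where TAA resolve would normally happen), then does
    post-processing and UI at full display resolution.
    """
    jitter = engine.next_jitter_offset()                 # sub-pixel jitter sequence
    gbuffer = engine.render(scene, render_res, jitter)   # low-res colour/depth/motion
    upscaled = upscaler.evaluate(
        color=gbuffer.color,
        depth=gbuffer.depth,
        motion_vectors=gbuffer.motion,
        jitter=jitter,
        output_res=display_res,
    )
    frame = engine.post_process(upscaled, display_res)   # tonemap, bloom, etc.
    return engine.draw_ui(frame, display_res)            # UI stays at native res
```

If a game already has a TAA pass with motion vectors and jitter in place, most of those inputs exist, which is why the claim is that the per-game work is now fairly small.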

There will probably still be the usual political nonsense in which an AMD sponsored title may not have DLSS support at launch.
 
The talk states that integration is not very invasive and is pretty easy to implement.

So then where are all the DLSS 2.0 titles? What are they waiting for? Not even the previous games that do have DLSS got it.

So gamers should not need to wait 6 months for a post-release patch from the devs, as you stated.

Should is a dirty, dirty word. Let's see what actually happens, because for now their efforts show little promise. I trashed DLSS on release, and rightly so, but I'm okay with saying it's a decent option now, to be used when available. I'll change my tune further once they demonstrate that they don't just lie through their teeth about game support. Remember this? I remember.

 