
really getting fed up with the posts stating RTX/DLSS does not work this gen

Discussion in 'Graphics Cards' started by sunshinewelly, Aug 16, 2019.

  1. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,581

    Location: Planet Earth

One of the images looks like it has more sharpening applied. It's why an unsharp mask works when I develop images from RAW. The "native" image looks unusually soft.

That explains it then. It's not the first or the last time AMD or Nvidia have done this. For example, PhysX on vs PhysX off, when the latter had normal physics animations removed from some games to emphasise the difference. Then you have had earlier games such as Red Faction which could do insane physics on old CPUs. Same with TressFX and Hairworks: software-based hair animation techniques existed, but look at how flat the hair looks in games promoting such technology without TressFX/Hairworks switched on.
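For what it's worth, the unsharp-mask trick is simple enough to show in a few lines. This is a toy 1-D sketch (real RAW developers work on 2-D images with Gaussian blurs, not box blurs), just to illustrate why a sharpened image pops against a soft "native" one:

```python
def box_blur(row, radius=1):
    # Simple 1-D box blur with edge clamping.
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(row, amount=1.0, radius=1):
    # sharpened = original + amount * (original - blurred)
    blurred = box_blur(row, radius)
    return [p + amount * (p - b) for p, b in zip(row, blurred)]

# A soft edge ramping from dark to light:
edge = [10, 10, 10, 40, 70, 70, 70]
print(unsharp_mask(edge))  # [10.0, 10.0, 0.0, 40.0, 80.0, 70.0, 70.0]
```

Note the overshoot on either side of the edge (0 and 80, beyond the original 10-70 range): that exaggerated local contrast is exactly what makes a sharpened comparison shot look "crisper" than it has any right to.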
     
    Last edited: Mar 29, 2020
  2. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 14,619

    Location: London

That is why he never tricked anyone, as we all said the image looked blurry, which it would, as he had applied FXAA to it. Lol. If anything he proved me right, as those were my only words about the image in question :D
     
  3. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,581

    Location: Planet Earth

I really never understood the point of FXAA. I assume it was developed for consoles, since it requires fewer GPU resources? The softness wouldn't be so noticeable on a TV many feet away.
     
  4. Scougar

    Capodecina

    Joined: Jan 30, 2007

    Posts: 12,449

    Location: PA, USA

    Was surprised about the artifacts then.

    Explains why detail was missing on things like the federal logo etc.

Image looked better than I expected. The other images you showed looked "challenged", as if not knowing when to apply things like focus at different depths.

Beginning to wonder how we know manufacturers are not quietly running these tricks from lower resolutions normally. Are there graphics tests to check this?
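There's no standard benchmark for this; the usual approach (what Digital Foundry does) is counting pixels across a sharp edge in a screenshot. As a crude toy illustration of the underlying idea, assuming simple nearest-neighbour scaling (real temporal upscalers are much harder to catch), upscaled output carries less per-sample detail because samples get duplicated:

```python
def adjacent_diffs(row):
    # Differences between neighbouring pixel values in a scanline.
    return [abs(a - b) for a, b in zip(row, row[1:])]

def looks_upscaled(row, threshold=0.4):
    # Nearest-neighbour 2x upscaling duplicates samples, so a large
    # share of adjacent differences collapse to exactly zero.
    diffs = adjacent_diffs(row)
    zero_frac = sum(1 for d in diffs if d == 0) / len(diffs)
    return zero_frac >= threshold

native = [3, 7, 2, 9, 5, 8, 1, 6]
upscaled = [3, 3, 7, 7, 2, 2, 9, 9]  # 2x nearest-neighbour of [3, 7, 2, 9]
print(looks_upscaled(native))    # False
print(looks_upscaled(upscaled))  # True
```

Obviously a detector this naive falls apart the moment any filtering or AI reconstruction is involved, which is rather the point of the complaint: good upscaling is designed to be hard to spot.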
     
  5. TonTom

    Wise Guy

    Joined: Apr 1, 2018

    Posts: 1,029

True. I suppose it’s entertaining to the outsider who doesn’t have an allegiance to either manufacturer.

I don’t know about that; it has stepped up a notch with the ‘trickery’ :p
     
    Last edited by a moderator: Mar 30, 2020
  6. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 3,999



    Definitely watch the talk, it's insane how good DLSS 2.0 can be!
     
    Last edited: Apr 3, 2020
  7. Grim5

    Mobster

    Joined: Feb 6, 2019

    Posts: 3,380

I think you guys need to get out Unreal Engine 4 and make your own games so you can get the exact visuals you want and we can stop arguing.
     
  8. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 14,619

    Location: London

    Cool. Will check it out. Hopefully it lives up to the hype this time as it will come in handy when playing Cyberpunk 2077 :D
     
  9. Scougar

    Capodecina

    Joined: Jan 30, 2007

    Posts: 12,449

    Location: PA, USA

I will actually take back some of my previous comments. Version 2 IS better than version 1. At some point though, they are going to start conning us by doing upscaling without telling us, making it look like their products are performing better than they are. Maybe version 3 will get closer to optimal.
     
  10. Grim5

    Mobster

    Joined: Feb 6, 2019

    Posts: 3,380

Consoles have been doing it for years now; they use upscaling and wouldn't mention it at all if it weren't for Digital Foundry's testing. And the same WILL happen on PC eventually, with AI scaling used regardless of the resolution you choose.
     
  11. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 3,999

To be clear, they're still doing some dodgy **** with sharpening & contrast, so the difference is overblown AND in motion it's different, but as I said before, it's an acceptable trade-off for the performance. I mean, you had the case of Control where a 2060 at 4K wouldn't even get above 8 fps, but now it's 36+ (with full RT)? That speaks for itself.

As for the upscaling shenanigans, they're kinda already doing that more and more. If you look at games coming out you see a lot more "temporal injection" and the like, and in many cases you can't even choose AA any more (hello, Doom Eternal); it's set by default to a TAA of sorts with various frame accumulations (at lower res) and the like (most Ubi games will be like this, and in ways they already are; Breakpoint, Odyssey etc.). There's a lot of dodgy **** going on and it's gonna be ubiquitous in the next 2-3 years (remember, there's always a lag for games because they take so long to make). And of course it's never actually made explicit, so you can only find out by paying attention & knowing what to look for, or if they do a GDC talk later on.

DLSS, as cool as it is now, still doesn't mean jack if they only put it in six months after release (maybe), and only in random ****** games they manage to convince the devs of. (Sorry YB fans, I appreciate all 3 of you.)
     
  12. IT Troll

    Wise Guy

    Joined: Jun 15, 2005

    Posts: 2,451

    Location: Edinburgh

The major advantage of the new method is that it only requires minimal dev input and will be implemented through Game Ready drivers. They have also removed some of the constraints, so it can be used with more source and destination resolutions. DLSS 2.0 looks promising and should finally be ready for prime time with the 3000-series GPUs. That is, if it really is as easy to add support as Nvidia are claiming.
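For reference, the "more source and destination resolutions" point boils down to per-axis scale factors. Nvidia's published DLSS 2.0 modes work out to roughly Quality ~0.667x, Balanced ~0.58x and Performance 0.5x per axis; treat the exact numbers here as approximate:

```python
# Approximate per-axis scale factors for DLSS 2.0's quality modes.
SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(out_w, out_h, mode):
    # Internal render resolution for a given output resolution and mode.
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

# Performance mode at 4K renders internally at 1080p:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
# Quality mode at 4K renders internally at 1440p:
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```

That's why the Performance-mode numbers look so dramatic: the GPU is shading a quarter of the output pixels and reconstructing the rest.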
     
    Last edited: Apr 4, 2020
  13. Grim5

    Mobster

    Joined: Feb 6, 2019

    Posts: 3,380

That's exactly right. The latest games that got DLSS 2.0 had it enabled via the latest drivers; no game updates were required, it's all handled by the driver now.
     
  14. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 3,999

Not true, check out the talk I posted above. The advantage of 2.0 over 1.9 is that they don't have to train the algorithm per game; instead it's a generalised one. They still have to have the devs put the work in, and the game also has to be able to take advantage of DLSS.

    Proof is in the pudding, and right now that's still out of stock.
     
  15. Aretak

    Wise Guy

    Joined: May 26, 2014

    Posts: 1,932

FXAA is a relic of another era, when hardware was much less powerful. It was created midway through the PS3/360 era as a way to tease out some more performance from those ageing GPUs, then inevitably made its way to PC too. It was actually created by an Nvidia employee named Timothy Lottes, though it was platform-agnostic. Developers loved it at the time as it had essentially zero performance impact, looked better than MLAA, and was far more acceptable when most were gaming at sub-1080p anyway (especially on consoles), as the image was already fairly blurry. To be clear, even back then it was agreed that it looked worse than MSAA and created additional blur, but the trade-off was largely considered worth it for the massive performance gains. Games like Skyrim and Duke Nukem Forever were amongst the first to implement it, to give it its proper place in time. It's just no longer really necessary on modern hardware, with more advanced and performant anti-aliasing techniques such as SMAA and TAA available, but it lingers around anyway because it's cheap and easy to implement.
     
  16. Grim5

    Mobster

    Joined: Feb 6, 2019

    Posts: 3,380



Precisely what I mentioned. It's been happening for some time on consoles and now more and more PC games are doing it too.

    Almost every new PC game I've played this year only has this as an option:

Anti-Aliasing: High, Medium, Low, Off. And Off usually looks awful if you are playing at under native 4K.

    That's it, no choice on type at all - and usually it's TAA High, Medium, Low.

So now both AMD and Nvidia have introduced sharpening tools, but they are annoying to use, and neither provides any performance benefit unless you go the PlayStation 4 route of dropping your internal resolution.

DLSS however is easier to use in that it's in the game menus and it's a simple on/off; no messing around with driver configurations (and Nvidia's control panel sucks).
     
    Last edited: Apr 4, 2020
  17. Grim5

    Mobster

    Joined: Feb 6, 2019

    Posts: 3,380

[​IMG]
     
  18. Robert896r1

    Hitman

    Joined: Sep 28, 2018

    Posts: 764

As someone who railed on Nvidia about the uselessness of DLSS 1.0, I have to say they've done an amazing job with version 2.0.

This, coupled with Ampere's bump in RT processing, could make RT a mainstay going forward. Really well done by Nvidia here.
     
  19. IT Troll

    Wise Guy

    Joined: Jun 15, 2005

    Posts: 2,451

    Location: Edinburgh

The talk states that integration is not very invasive and is pretty easy to implement. Sure, some development is required so that DLSS is integrated at the correct point in the render pipeline, but this is minimal compared to the previous version. Nvidia claim it is quite similar to how TAA is already handled by developers. This integration work has already been done for Unreal Engine 4. The heavy AI work is then done by Nvidia and delivered through driver releases. So gamers should not need to wait six months for a post-release patch from the devs, as you stated.

    There will probably still be the usual political nonsense in which an AMD sponsored title may not have DLSS support at launch.
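To make the "correct point in the render pipeline" claim concrete, here's a hedged sketch of the stage ordering the talk describes: a temporal upscaler slots in where TAA would, after lighting but before post-processing and the UI. The stage names below are illustrative, not a real engine API:

```python
def render_frame(internal_res, output_res):
    # Illustrative stage ordering only; real engines differ in detail.
    return [
        f"rasterize + shade at {internal_res}",  # the heavy work, at internal res
        f"temporal upscale to {output_res}",     # DLSS/TAA slot; needs motion vectors
        f"post-process at {output_res}",         # bloom, tone-mapping, sharpening
        f"composite UI at {output_res}",         # HUD stays sharp at native res
    ]

for step in render_frame("1920x1080", "3840x2160"):
    print(step)
```

The ordering is the whole trick: post-processing and the HUD run at full output resolution, which is a big part of why a well-integrated upscale is hard to spot in screenshots.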
     
    Last edited: Apr 6, 2020
  20. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 3,999

So then where are all the DLSS 2.0 titles? What are they waiting for? Not even the previous games that already have DLSS have got it.

"Should" is a dirty, dirty word. Let's see what actually happens, because for now their efforts show little promise. I trashed DLSS on release, and rightly so, but I'm okay with saying it's a decent option now, to be used when available. I'll change my tune further once they demonstrate that they don't just lie through their teeth about game support. Remember this? I remember.

    [​IMG]