
DLSS 2.0: why aren’t more people raving about it?


You claimed open source as the excuse for not going with PhysX, which isn't true. It had nothing to do with being open source. They went with the Intel-owned Havok in 2008 after rejecting Nvidia.

If their focus was really open source, why use either of their biggest rivals' closed solutions?

The article you linked to was from a year after that.

Bullet existed before 2008. Nvidia had demos using Bullet back in 2007.
 
Once FSR/DirectML is widespread and RT is viable on AMD cards, 90% of the users in this thread will do a full 180.

:cry:
The thread is about why people are not raving about DLSS 2.0.
Are you saying that when FSR/DirectML becomes popular, or Nvidia releases DLSS 3.0, we will rave about DLSS 2.0?

Or are you saying those raving about DLSS 2.0 now will be trashing it then?
 
Will some games support both DLSS 2.0 and AMD's FSR, and if so, will it be possible to select either on an RTX 3000 series graphics card?

Or will FSR be locked to AMD's 5000 series / future graphics cards?
 
PhysX was very good, until Nvidia put it behind the paywall of owning an Nvidia card (I still have an old Ageia PhysX PPU card here). Nothing uses PhysX in 2021; the last 'big' title to use it was Metro Exodus, and before that not a lot since Warhammer in 2016.

Nothing really uses the original hardware PhysX any more, but loads of games do still use the PhysX software runtime, with some effects accelerated via GPU compute and/or as part of FleX, etc. Some games also have individual PhysX effects, such as hair/fur, implemented within the engine itself rather than via an external physics library.

Cyberpunk 2077, for instance, uses PhysX extensively, including features accelerated on the GPU.

Sadly, progress in this area has stalled significantly, with only fairly basic use of physics in most games even now.
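
For anyone curious what "the PhysX software runtime" looks like from the code side, here is a minimal sketch against the open-source PhysX 4 SDK, running entirely on the default CPU dispatcher (no GPU involved). Treat it as illustrative rather than production setup code:

```cpp
#include <PxPhysicsAPI.h>
#include <cstdio>

using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Foundation and physics objects are the SDK's entry points.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // A scene stepped by the default CPU dispatcher -- this is the
    // software runtime path that most modern games still ship with.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Drop a dynamic box onto a static ground plane.
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    scene->addActor(*PxCreatePlane(*physics, PxPlane(0, 1, 0, 0), *material));
    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0, 10, 0)),
                                          PxBoxGeometry(1, 1, 1), *material, 1.0f);
    scene->addActor(*box);

    // Two seconds of 60 Hz steps, all in software.
    for (int i = 0; i < 120; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }
    printf("box rests at y = %f\n", box->getGlobalPose().p.y);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```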
 
Will some games support both DLSS 2.0 and AMD's FSR, and if so, will it be possible to select either on an RTX 3000 series graphics card?

Or will FSR be locked to AMD's 5000 series / future graphics cards?

Your guess is as good as anybody else's. They will probably coexist for a while with increasing support from games, sponsored games will probably get one or the other, and hopefully the open standard will eventually win out.

But before we get there, AMD and Microsoft need to deliver something comparable in features and quality to DLSS 2.0. I'm doubtful they will get it right first time, much like DLSS 1.0.
 
How does that make it a joke and a con? Nvidia developed it for their GPUs, so it's a con that other GPUs can't use it?

That's like saying Amazon made a series for Prime and it's a con that you can't watch it on Netflix.

IIRC Nvidia were happy to license it to AMD, but AMD refused.

I know, it's like AMD coming up with SAM/ReBAR and then putting it behind a paywall / licensing it to Nvidia to use... oh wait.
 
I can guarantee it's not as good as DLSS 2.0. LG OLED upscaling is good, but not that good. I have an Nvidia Shield TV tube attached to mine, and that does "AI" upscaling from 720p, 1080p and 1440p to 4K, and it is better than just letting the TV do it.

Yep, LG isn't in the same league as Nvidia AI upscaling or Samsung AI upscaling. Both of those are substantially superior in terms of results.
 
Nothing really uses the original hardware PhysX any more, but loads of games do still use the PhysX software runtime, with some effects accelerated via GPU compute and/or as part of FleX, etc. Some games also have individual PhysX effects, such as hair/fur, implemented within the engine itself rather than via an external physics library.

Cyberpunk 2077, for instance, uses PhysX extensively, including features accelerated on the GPU.

Sadly, progress in this area has stalled significantly, with only fairly basic use of physics in most games even now.

I know nothing uses the original PPU card any more. Its heyday was the Batman games; then Nvidia bought the company and that was that for the PPU cards themselves, with no more drivers or PhysX updates. What's of interest, though, is that GPU PhysX is not open source, but the CPU runtimes are.
 
AMD don't have the Tensor cores for AI though, so FSR almost certainly isn't going to be as good/effective.
Some people claim that DLSS does not utilize the Tensor cores in any meaningful way and that they are just there to create artificial segmentation, to phase older GPUs out of upscaling, but I'm not sure where they get that info or whether they're right or wrong.

But FSR can be really competitive; both Microsoft and AMD are working on it. I heard that RDNA2 can do certain AI/neural-network calculations very fast.
 
Some people claim that DLSS does not utilize the Tensor cores in any meaningful way and that they are just there to create artificial segmentation, to phase older GPUs out of upscaling, but I'm not sure where they get that info or whether they're right or wrong.

But FSR can be really competitive; both Microsoft and AMD are working on it. I heard that RDNA2 can do certain AI/neural-network calculations very fast.

NVIDIA DLSS: Your Questions, Answered
 
I don't think Microsoft are working with AMD on FSR. It's more likely DirectML, along with Nvidia.
But it would be in Microsoft's best interest to make FSR as great as possible.

They have an unbelievably underpowered console in the Series S, which already targets 720p-810p in Valhalla for 60 frames. A good implementation of FSR would do wonders for that machine.

Same for the Series X: it tends to do native 4K targeting 30 frames in new games, and 1200p-1440p for 60 frames.

But at the same time, I don't think these devs and manufacturers care about image quality at all, seeing how they slap the holy grail of "temporal anti-aliasing" onto every game there is and call it a day, nerfing resolution clarity down to pre-PS3-era levels, only negated by 4K rendering, where you actually get a resemblance of the 1080p clarity of 2012-2015 pre-TAA games. But that's an entirely different topic.

Sorry, I'm really frustrated by modern TAA in general. Whenever and wherever possible, I vent about it. Can't help it. I just can't.
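
For what it's worth, the blur being complained about falls straight out of how a typical TAA resolve works: each frame is blended into an accumulated history, so most of what you see on screen is old frames. The toy single-pixel model below is a conceptual sketch, not any engine's actual resolve pass (real implementations add neighbourhood clamping, reprojection via motion vectors, etc.):

```cpp
#include <cstdio>

// Toy one-pixel model of a TAA resolve: blend the new sample into an
// accumulated history. 'blend' is the weight of the NEW frame, so the
// common ~0.1 value means ~90% of the result is history -- stable and
// shimmer-free, but fine detail gets smeared.
float taa_resolve(float history, float current, float blend = 0.1f)
{
    return history + blend * (current - history);
}

int main()
{
    // A pixel flipping between 0 and 1 each frame (think of a one-pixel
    // edge crawling under camera jitter). TAA converges it to a stable
    // grey: the aliasing is gone, but so is the sharp detail.
    float history = 0.0f;
    for (int frame = 0; frame < 20; ++frame) {
        float sample = (frame % 2 == 0) ? 1.0f : 0.0f;
        history = taa_resolve(history, sample);
        printf("frame %2d -> %.3f\n", frame, history);
    }
    return 0;
}
```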
 
But it would be in Microsoft's best interest to make FSR as great as possible.

They have an unbelievably underpowered console in the Series S, which already targets 720p-810p in Valhalla for 60 frames. A good implementation of FSR would do wonders for that machine.

Same for the Series X: it tends to do native 4K targeting 30 frames in new games, and 1200p-1440p for 60 frames.

But at the same time, I don't think these devs and manufacturers care about image quality at all, seeing how they slap the holy grail of "temporal anti-aliasing" onto every game there is and call it a day, nerfing resolution clarity down to pre-PS3-era levels, only negated by 4K rendering, where you actually get a resemblance of the 1080p clarity of 2012-2015 pre-TAA games. But that's an entirely different topic.

Sorry, I'm really frustrated by modern TAA in general. Whenever and wherever possible, I vent about it. Can't help it. I just can't.

I hate TAA games too; that's why I'd love it if every game had DLSS.
 
TAA looks pretty great at 4K or above when implemented properly. I noticed in Watch Dogs Legion that TAA (DLSS off) can cause aliasing around lighter pixels (like lamps etc. at a distance).

Random question: I tried 4K upscaled to 7680 x 4320 in Watch Dogs Legion using DSR ('smoothness' set to 100%) and of course it looked very detailed, with no discernible aliasing on my 4K monitor. Then I set DLSS to 'performance' mode, which apparently results in an internal resolution that is 50% of the height and width of the selected display resolution. So, for a 7680 x 4320 display resolution, the internal resolution would be equal to 4K (3840 x 2160).

So, does this mean that 7680 x 4320 + DLSS performance mode is just all-round higher quality than 4K display resolution (DLSS off)?

Also, presumably, 7680 x 4320 + DLSS Quality is higher quality and sharper than 4K, due to the higher internal resolution.

I suppose we would need GPUs about 2x as powerful as the RTX 3070 (twice the pixel rate?), and with at least twice the VRAM, to handle 7680 x 4320 + DLSS performance mode, as I could only get 30-40 FPS with textures on the lowest setting and RT off. Not surprising, as that's ~33.1 million pixels!

Note: you may need to set texture quality low / very low to even attempt this, as the amount of VRAM on a lot of GPUs won't be sufficient otherwise.
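
To put numbers on the internal resolutions involved: the per-axis render scales for the DLSS 2.x modes are publicly documented (roughly 66.7% Quality, 58% Balanced, 50% Performance, 33.3% Ultra Performance), so a quick sketch like the one below can tabulate them for the 8K target discussed above. The scale figures are the grounded part; the code itself is just arithmetic:

```cpp
#include <cstdio>

// Approximate per-axis render scale for each DLSS 2.x mode.
struct Mode { const char* name; double scale; };

int main()
{
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},  // ~66.7%
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},  // ~33.3%
    };

    const int outW = 7680, outH = 4320;  // the 8K DSR target from the post
    printf("output %dx%d = %.1f Mpixels\n", outW, outH, outW * (double)outH / 1e6);

    for (const Mode& m : modes) {
        int w = (int)(outW * m.scale + 0.5);
        int h = (int)(outH * m.scale + 0.5);
        printf("%-17s renders %dx%d (%.1f Mpixels)\n",
               m.name, w, h, w * (double)h / 1e6);
    }
    // Performance mode at 8K renders 3840x2160: the GPU shades a
    // native-4K-sized frame, which is why comparing it against plain
    // 4K output (DLSS off) is a fair like-for-like question.
    return 0;
}
```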
 
It's neither logical nor sustainable, by the looks of GPU performance increases.

The increase in performance does not keep pace with the demands games add over the years, which means buying the most expensive GPU there is every two years (1080 Ti, 2080 Ti and now 3080).

I don't find 1440p crisp enough. It still needs a 1.5x resolution scale for my tastes.

Tbh, if they don't fix the blur issue at 1440p with TAA, the RX 6800 XT seems more plausible due to its pure rasterisation power and lack of ray tracing focus. Ray tracing kills the concept of native 4K, and the Series X and PS5 have already killed the concept of native 4K for 60 FPS modes. I really look forward to seeing how things turn out, but I get discouraged every time I hear Digital Foundry praising TAA.
 
The main reason to run at 7680 x 4320 + DLSS performance mode is that it should provide even less aliasing than 4K output resolution, with sharper details too.

The point of 7680 x 4320 + DLSS Ultra Performance vs 4K output resolution is much more debatable though.

In the meantime, I can run some non-DLSS games (TW: Warhammer II) at 5431 x 3055 using DSR, which is just over double the number of pixels that 4K consists of, and it looks very good.
 
Some people claim that DLSS does not utilize the Tensor cores in any meaningful way and that they are just there to create artificial segmentation, to phase older GPUs out of upscaling, but I'm not sure where they get that info or whether they're right or wrong.

But FSR can be really competitive; both Microsoft and AMD are working on it. I heard that RDNA2 can do certain AI/neural-network calculations very fast.

DLSS does use the Tensor cores, but the really big heavy lifting has already been done offline by Nvidia's supercomputer.
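
To make that split concrete: the sketch below is purely illustrative (EvaluateDlss() is a made-up stub, not Nvidia's NGX API, which is what games actually integrate against). The grounded part is the division of labour it demonstrates: the network is trained once, offline, on Nvidia's cluster, and the shipped game only pays for a per-frame inference pass on the Tensor cores, fed with the low-res frame plus data such as motion vectors and depth.

```cpp
#include <cstdio>

// What a game hands over each frame (simplified; real integrations also
// pass motion vectors, depth and the sub-pixel camera jitter).
struct FrameInputs {
    int lowResW, lowResH;  // e.g. 1920x1080 for 4K Performance mode
};

// Hypothetical stand-in for the real NGX evaluate call. Note there is no
// training here: the network weights were produced ahead of time and ship
// with the driver, so the runtime cost is a single inference pass.
void EvaluateDlss(const FrameInputs& in, int outW, int outH)
{
    printf("inference: %dx%d -> %dx%d (weights pre-trained offline)\n",
           in.lowResW, in.lowResH, outW, outH);
}

int main()
{
    FrameInputs in{1920, 1080};    // internal render resolution
    EvaluateDlss(in, 3840, 2160);  // per-frame upscale to 4K
    return 0;
}
```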
 