10GB VRAM enough for the 3080? Discuss..

Sadly not, mate. It looks pretty awful in the brand new Watch Dogs Legion:

https://www.reddit.com/r/nvidia/comments/jn06hi/watch_dogs_legions_implementation_of_nvidias_dlss/

[attached comparison screenshot]

Those two images are basically the same.

Upscaling will be worse, even with sharpening.
 
Those two images are basically the same.

Upscaling will be worse, even with sharpening.

DLSS is supposed to look sharper at the higher resolution... It looks worse. Look at the hair on the neckline; big mistakes were made. The boat textures also look funky.

Amazing how much loyalty people show towards NVIDIA, even when they mess up a DLSS implementation...
 
If you want quality, e.g. 4K, then you don't ever consider AI upscaling or sharpening; it defeats the entire purpose of going 4K. If you play at 1440p but want to try 4K, then try DSR.
To me, DLSS is pointless.
 
DLSS is supposed to look sharper at the higher resolution... It looks worse. Look at the hair on the neckline; big mistakes were made. The boat textures also look funky.

Amazing how much loyalty people show towards NVIDIA, even when they mess up a DLSS implementation...

That's not Nvidia though, but the devs of the game.
 
If you want quality, e.g. 4K, then you don't ever consider AI upscaling or sharpening; it defeats the entire purpose of going 4K. If you play at 1440p but want to try 4K, then try DSR.
To me, DLSS is pointless.

Good luck playing 4K with all the ray tracing turned on. In Control with RT and no DLSS you need a 3080 or 3090 for 1440p @ 60fps. https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3090-review?page=5 Crysis Remastered is the same: 40fps at 4K with RT on a 3090. https://www.tomshardware.com/uk/news/nvidia-geforce-rtx-3090-review
 
If you want quality, e.g. 4K, then you don't ever consider AI upscaling or sharpening; it defeats the entire purpose of going 4K. If you play at 1440p but want to try 4K, then try DSR.
To me, DLSS is pointless.
That makes sense at higher resolutions for sure.
 
Good luck playing 4K with all the ray tracing turned on. In Control with RT and no DLSS you need a 3080 or 3090 for 1440p @ 60fps. https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3090-review?page=5 Crysis Remastered is the same: 40fps at 4K with RT on a 3090. https://www.tomshardware.com/uk/news/nvidia-geforce-rtx-3090-review
I won't need luck, as I won't be using it.

I don't want to use some dumbed-down resolution. I will be using native 4K and forgo RT until it works well.
4K DLSS with RT will most likely look worse than native 4K with RT off anyway.

I have every native-resolution screen at hand and would rather game at 4K native than 1440p with RT; 4K generally looks far sharper and more detailed even without any RT.
 
DLSS is supposed to look sharper at the higher resolution... It looks worse. Look at the hair on the neckline; big mistakes were made. The boat textures also look funky.

Amazing how much loyalty people show towards NVIDIA, even when they mess up a DLSS implementation...

I'm not going to defend how that looks. It looks gash. However, don't forget that it could have been implemented ages ago, when DLSS was still awful. All this does is expose the issue with it, which has always been the biggest boil on PC gaming's arse. It. Needs. Coding. For. Extra coding, at a cost to the developer. Here we can clearly see the cheap, half-arsed attempt.

I love DLSS when it works well. I do; I think it is amazing. However, it's just *another* one of Nvidia's oddball techs that needs to be developed for properly, something which has never happened. Right now? Oh yeah, they're on it. Just like they were on SLI and PhysX and 3D Vision too. Look how that ended.

For years and years AMD have been desperately trying to use their console leverage to tie up gaming. What I mean is: tech that goes into the consoles will give AMD a lift on PC as well, because all of the features being coded for on those consoles will be a given on a Radeon. So as good as DLSS may have gotten, like baked-in G-Sync it will no doubt die off, because coders will be using DirectML on the consoles, meaning they don't have to bother with Nvidia at all. Just do it once. And pay once.

It's what happens over and over again. For now? Hey, I love DLSS and I will take it. However, I don't expect it to last. It will be superseded by whatever the DX12 Ultimate "thing" turns out to be, just like all of their other proprietary stuff.

Oh, and that whole ecosystem AMD were betting on by selling cheap APUs to Microsoft and Sony of course fell back on their hardware, which was total crap at the time. Now, though? Yeah, I can see that idea working quite well for them.
 
I won't need luck, as I won't be using it.

I don't want to use some dumbed-down resolution. I will be using native 4K and forgo RT until it works well.
4K DLSS with RT will most likely look worse than native 4K with RT off anyway.

I have every native-resolution screen at hand and would rather game at 4K native than 1440p with RT; 4K generally looks far sharper and more detailed even without any RT.

The issue with RT and resolution is that the processing requirements skyrocket. No matter what you do, no hardware exists today that can provide real-time performance with a high enough SPP (samples per pixel) to create a decent image. You have to denoise from a low SPP, which is not perfect, and that leads to a less-than-perfect render. Then at the likes of 4K and 8K you can no longer render games at high frame rates, so you get stuck at low resolutions.
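
To put a rough number on the SPP problem: Monte Carlo noise only falls off with the square root of the sample count, so halving the noise costs four times the rays. A toy NumPy sketch with made-up radiance values (not any real renderer):

```python
import numpy as np

# Toy Monte Carlo pixel: estimate a pixel's brightness by averaging noisy
# ray samples. The error falls as 1/sqrt(SPP), so every halving of noise
# costs 4x the rays, which is why real-time RT is stuck at low SPP and
# has to lean on a denoiser instead.
rng = np.random.default_rng(0)
true_radiance = 0.5  # hypothetical ground-truth pixel value

for spp in [1, 4, 16, 64, 256, 1024]:
    # Simulate many pixels, each averaging `spp` noisy exponential samples
    samples = rng.exponential(true_radiance, size=(10_000, spp))
    noise = samples.mean(axis=1).std()
    print(f"{spp:5d} spp -> noise {noise:.4f}")

# Prints noise ~0.50 at 1 spp, ~0.25 at 4 spp, ~0.125 at 16 spp, ...
```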

The only way to fix this is to upsample, but that blurs the image and creates visual artifacts with movement: blurring and distortion, or a ringing artifact that appears as a "ghost" or extra outline around objects. This is AMD's method. Nvidia created DLSS 2.1, which is better than normal upscaling but not immune. There are lots of adjustments for DLSS, which is why Death Stranding looks great. Control has issues because the image DLSS works on has a lower ray count, and this can't be hidden. Even so it looks good, very close to 4K. No upscaling will ever be perfect; you can't create information that isn't there. DLSS can fill in some areas, and most of the time it looks nearly as good as native, in some ways better, but it can't be 100% perfect. Then developers have to implement it correctly to hide the downsides. That goes for AMD and Nvidia.
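
That ringing "ghost outline" is easy to reproduce: upscale a hard edge, run a sharpening pass over it, and the signal overshoots on both sides of the edge. A minimal 1D NumPy sketch using a nearest-neighbour upscale and a textbook unsharp mask (not AMD's or Nvidia's actual filters):

```python
import numpy as np

# A hard edge at low resolution, upscaled 2x, then sharpened with an
# unsharp mask. The result dips below 0 on the dark side of the edge and
# overshoots above 1 on the bright side: that overshoot is the halo /
# ringing outline you see around objects.
low_res = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # hard edge
upscaled = np.repeat(low_res, 2)                     # nearest-neighbour 2x
blurred = np.convolve(upscaled, [0.25, 0.5, 0.25], mode="same")

amount = 1.5                                         # sharpening strength
sharpened = upscaled + amount * (upscaled - blurred)
print(sharpened.round(2))  # note the -0.38 and 1.38 either side of the edge
```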

Most people who attack DLSS hold it to an unreasonable standard of perfection, which is impossible.

If you go AMD, and the rumours are true about their RT performance being lower, then expect 1080p to be your resolution for games like Control. Godfall uses ray tracing only to improve the game's shadows, so 1440p; for that, 16GB is a lot of VRAM. You should be up in arms about Godfall and image quality. What's the point of good textures if you render them at 1080p/1440p and use upscaling?

Where is the native resolution quality there? FidelityFX Contrast Adaptive Sharpening is an image quality wrecker, and Variable Rate Shading is even worse.

Sure, forgo RT, but the perfect native render is never going to happen. It's always going to be a balance between quality and performance.
 
That's not Nvidia though, but the devs of the game.
Not true. Nvidia always has their own engineers work with the game devs to implement GameWorks features. It has happened in every RTX-marketed title so far; go look up all the talks they did with them at GDC.

This game just proves that DLSS doesn't always look good. Not hitting native quality is normal; it's only an issue because the marketing, and certain YouTube channels sponsored by Nvidia, tried to make it seem like a better feature than it was.
 
Not true. Nvidia always has their own engineers work with the game devs to implement GameWorks features. It has happened in every RTX-marketed title so far; go look up all the talks they did with them at GDC.

This game just proves that DLSS doesn't always look good. Not hitting native quality is normal; it's only an issue because the marketing, and certain YouTube channels sponsored by Nvidia, tried to make it seem like a better feature than it was.

DLSS looks the business in Control and Death Stranding. It also looks good in Watch Dogs Legion if you look at the videos.

Have you updated Control to use DLSS 2.0 yet? It’s AMAZING.
https://www.reddit.com/r/controlgam...e_you_updated_control_to_use_dlss_20_yet_its/

DLSS 2.0 is Amazing
https://www.reddit.com/r/pcgaming/comments/fpl4fn/dlss_20_is_amazing/

Nvidia DLSS in 2020: Stunning Results
https://www.techspot.com/article/1992-nvidia-dlss-2020/

DLSS 2.0 makes Death Stranding look and run way better
https://www.pcgamesn.com/death-stranding/dlss-2

NVIDIA’s DLSS 2.0 tech could make your £300 graphics card perform like a £700 one
https://www.nme.com/features/nvidia...-graphics-card-perform-like-a-700-one-2708925

https://youtu.be/YWIKzRhYZm4

https://www.eurogamer.net/articles/digitalfoundry-2020-control-dlss-2-dot-zero-analysis

Just as we saw in Wolfenstein: Youngblood, the new DLSS is also capable of measuring up nicely to native resolution rendering, even if the core image is actually built up from just 25 per cent of the overall pixel count.

On the rock wall ahead of Jesse in the very first playable scene in Control, single pixel detail on reflective elements of the rock wall shine with DLSS when they don't with native rendering. Remember that DLSS is a replacement for the temporal anti-aliasing found in many games - and TAA does tend to add some level of blur that DLSS does not.

The second effect is edge treatment, where DLSS can exhibit some minor flaws, specifically on high contrast edges. Also notable is that edges can also reveal a small degree of over-sharpening, creating a halo effect that again is most noticeable on high contrast areas. The degree of sharpening from the neural network is apparently tweakable in real-time, so I would hope to see this added as another slider available to users.

Other than that small detail, DLSS 2.0 at any normal screen distance on any reasonable resolution looks almost as good as the real thing.
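
Worth spelling out the "just 25 per cent of the overall pixel count" bit: a 3840x2160 output built from a 1920x1080 internal render is exactly a quarter of the pixels, and DLSS, like TAA, makes up the rest by accumulating jittered samples across frames. A toy Python sketch of the accumulation idea only (illustrative numbers, nothing like the real network):

```python
# 25 per cent of the pixel count: a 1080p internal render driving 4K output
out_px = 3840 * 2160   # 8,294,400 output pixels
in_px = 1920 * 1080    # 2,073,600 rendered pixels
print(in_px / out_px)  # 0.25

# Temporal accumulation, the shared core of TAA-style reconstruction:
# the camera is jittered by a sub-pixel offset each frame, and a history
# buffer blends each new sample in. Over several frames the jittered
# samples cover sub-pixel detail no single low-res frame ever captured.
alpha = 0.1                           # blend weight for the newest frame
samples = [0.8, 0.6, 0.7, 0.65, 0.7]  # jittered samples of one pixel
history = samples[0]
for s in samples[1:]:
    history = alpha * s + (1 - alpha) * history
print(round(history, 3))  # settles toward the pixel's stable value
```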
 