Nvidia DLSS 5 months on - a win or fail?

Way worse than FXAA, and Nvidia felt it was OK to release this... Goes to show how little they think of the PC gaming community.

Happy to see them getting all this negative press. They got D.P. working overtime on here :p:D


HaHaHa :D So if I got this right: you turn ray tracing on at 4K, it automatically enables DLSS to downscale the image quality, but you're supposed to be so blown away by the reflections and lighting from ray tracing, and by hitting 60FPS on your 100Hz monitor, that they thought no one would notice the downscaled, blurred image quality :D

"But it just Works" lol
 
Ah, look, you are posting more sources that back up exactly what I am saying. Thank you for proving my argument for me!
So game devs send Nvidia the compiled executable, Nvidia runs it, and it's all done before release. Exactly as I said it operated.
It is almost as if I am entirely accurate and you were wrong.

The gimpworks features are almost exclusively done with access to the source code. Nvidia ALREADY has access to the source code for various other reasons, so why you think the special build would be something else is unclear to me.

https://www.extremetech.com/gaming/...elopers-weigh-in-on-the-gameworks-controversy
 
You're conflating GameWorks and DLSS there, although it's humorous that you use an article which said Nvidia would rather do things in the drivers than have access to the source code to prove your point :p

The question is whether Nvidia NEEDS access to the source for DLSS to work, not whether they already have it because of some GameWorks feature. I've not seen anything suggesting they need it yet, but regardless, this is just getting messy now. That last article said AMD prefer access to source more than Nvidia does, so I'm not sure what point you are trying to prove anymore.

Does Nvidia need the source? Sure, sometimes. Does it matter? **** No, it couldn't be any less important.
 
I'm not conflating anything; the point is that access to source code isn't out of the ordinary and happens regularly, especially in games with gimpworks, which all RTX titles have as well.

As for AMD etc., I never mentioned any of that, so I don't know who you're arguing with.

The point was simple - devs give source code to Nvidia (for DLSS, and not only for that). D.P. thinks that's "misinformation". It isn't.
 
No, you clearly are. Nasher made the statement that devs HAVE to send the source to Nvidia; that's what D.P. replied to. Stating that Nvidia can get source code for GameWorks does not mean it's required for DLSS. That's a textbook example of conflation.

Nothing I've seen posted so far confirms that access to the source code is required for DLSS to work. Though I still don't know why it's important.
 
Yeah, stick a fork in it, it's dead.

RTX, on the other hand, isn't entirely garbage; it just hasn't got the hardware to back it up (yet).
 
Of course it's worse than FXAA.

4K FXAA is still 4K rendering for the underlying pixels.
4K DLSS is rendered at a lower resolution, so because it's not 4K in the first place it's going to be a worse image, if sharpness is the criterion.
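
To put rough numbers on that (the internal DLSS resolution here is an assumption; at the time, 4K DLSS was commonly reported as rendering at roughly 1440p internally), a quick pixel-count comparison:

```python
# Rough pixel-count comparison. The DLSS internal resolution is an
# assumption (commonly reported as ~1440p for a 4K output target),
# not a confirmed figure for any specific game.
native_4k = 3840 * 2160          # what 4K + FXAA actually renders
dlss_internal = 2560 * 1440      # assumed internal render for "4K" DLSS

print(f"native 4K pixels:  {native_4k:,}")                   # 8,294,400
print(f"DLSS internal:     {dlss_internal:,}")                # 3,686,400
print(f"fraction rendered: {dlss_internal / native_4k:.0%}")  # ~44%
```

Under that assumption, less than half of the output pixels are actually rendered and the rest are reconstructed, which is where the softness comes from.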
 
Doesn't look too good to be honest, and DLSS had such promise. I'm sure it will come good in the end, with AMD and Microsoft eventually getting on board with machine learning as well, but for this first iteration from NVIDIA, things need to improve.
Things may still come good, but I'm not getting my hopes up. Another thing: lots of people here seem to be blaming NVIDIA for Metro, but is it really totally their fault, or should the developer take a healthy share of the burden as well? After all, they released the game in this state.
 
I feel massively let down by NVIDIA. I was a little bit salty about the RTX implementations, but forcing RTX to use DLSS means your performance is lower (for a minor RTX gain) while the textures end up absolutely garbage.

I get about 90-100fps in BF V with RTX on (low and low items) and DLSS on at QHD Ultra. With RTX off and no DLSS it's 120+ and looks vastly superior.

NVIDIA totally and utterly lied about and mis-sold the RTX series / DLSS. You can upscale with the game's own render scaling and get the same performance with vastly superior texture resolution.

Honestly, it looks soapy as hell, a bit like a 4K TV with all the garbage processing enabled by default. It was actually very trippy to play BFV with DLSS on; it looked like I was playing on low/medium.

https://www.youtube.com/watch?v=Mrixi27G9yM

"Super high quality sample" "this is 4k DLSS, that's just perfect", "could then in real time, enhance images" "this is infiltrater running on 1 gpu at 78fps at 4k at quality that's never been before"

Shame this isn't actually what consumers get.


So TL;DR: you need to enable RTX to get DLSS (which is a big performance drop), and you get far worse textures and image quality. If you want RTX features, it's better to upscale from a lower resolution with RTX on and DLSS off for better performance and quality; for those after frame rates, you wouldn't use either. They openly admit that DLSS isn't available at super high framerates because it would actually lower performance!!! Facepalm!

If you own an RTX 2080 Ti with a QHD panel, you can't even use DLSS anyway.


Yeah, Nvidia never stated that to get DLSS you needed to have ray tracing enabled as well; that somehow was never mentioned during Jensen's two-hour "it just works" stage circlejerk.
 
Vega 7 is looking good, at least from these Linux results.

The test was done on a system with:

  • AMD Vega FE*2

  • AMD Radeon VII

  • Ubuntu 18.04 with kernel 4.18

  • ROCm 2.1

  • TensorFlow 1.12
https://www.reddit.com/r/Amd/comments/asdyon/radeon_vii_tensorflow_deep_learning_results_huge/
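
For anyone wanting to reproduce a run like this on ROCm, a quick sanity check that the TensorFlow build actually sees the Radeon GPUs might look like the sketch below (a minimal illustration only; device names and ordering will vary by system):

```python
# Minimal sketch: list the devices TensorFlow can see before benchmarking.
# On a working ROCm install the Radeon cards show up as GPU devices.
from tensorflow.python.client import device_lib

for dev in device_lib.list_local_devices():
    # Each entry has a name like '/device:GPU:0' plus a description string.
    print(dev.name, dev.physical_device_desc)
```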
 
The OP posts this later in the thread.
Talking to the creator of the script, he kindly provided me with some numbers from a V100; this is a quote:

Synthetic results for ResNet50 v1.0, which is resnet50 in the benchmark code.

V100 FP16 892.76 (batch-size 256) 828.69 (batch-size 128)

With XLA turned on batch-size 256: 1452.96

I tried to run FP16, resnet50 with batch-size 256,

R7: 389 images/sec (batch-size 256), 382 images/sec (batch-size 128)

Vega FE: 178 images/sec (batch-size 256), 191 images/sec (batch-size 128)
Interested to see what the more consumer-oriented RTX cards score, as they also have (slightly nobbled) tensor units.
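
For context on how numbers like these are produced: the thread appears to use a tf_cnn_benchmarks-style ResNet50 run. The sketch below is not that script, just a minimal synthetic-data throughput loop; the batch size, step count, and crude FP16 handling are all assumptions for illustration:

```python
# Rough images/sec sketch, NOT the benchmark script quoted above.
# Synthetic data, Keras ResNet50; forcing float16 via set_floatx is a
# simplification and can be numerically unstable for real training.
import time
import numpy as np
import tensorflow as tf

BATCH, STEPS = 128, 20

tf.keras.backend.set_floatx('float16')
model = tf.keras.applications.ResNet50(weights=None, classes=1000)
model.compile(optimizer='sgd', loss='sparse_categorical_crossentropy')

images = np.random.rand(BATCH, 224, 224, 3).astype('float16')
labels = np.random.randint(0, 1000, size=(BATCH,))

model.train_on_batch(images, labels)       # warm-up: graph build + transfers

start = time.time()
for _ in range(STEPS):
    model.train_on_batch(images, labels)
elapsed = time.time() - start
print(f'{BATCH * STEPS / elapsed:.1f} images/sec')
```

The real benchmark code also handles XLA and multi-GPU scaling, which is where the 1452.96 XLA figure for the V100 comes from; this sketch only shows how the images/sec figure itself is computed.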
 
Another thing: lots of people here seem to be blaming NVIDIA for Metro, but is it really totally their fault, or should the developer take a healthy share of the burden as well? After all, they released the game in this state.

Who does Nvidia actually pay to include RTX in Metro: is it 4A, Deep Silver, or both? Surely the game devs can't get the blame, as the decision to include RTX would have come from above.
 
Do Nvidia pay to have RTX included?

Is this a known fact, or is it just unsubstantiated rumour?

I've had a quick look but have so far found nothing to corroborate it either way.

Personally I don't think Nvidia, or AMD for that matter, have that much say in the titles they put their name to; it is down to the developing studio, or maybe the parent company in the case of companies like EA.
 
AMD-EA/Rebellion/Eidos/Ubisoft/Bethesda, and some from Nvidia-Epic/Ubisoft/Eidos/CDP/EA/Bethesda/Deep Silver: do you think AMD/Nvidia DON'T pay these publishers to include their tech?

You honestly think EA said to AMD, 'Sure, DICE can help build Mantle for free, even though that's going to take up resources and threaten the game's release target; sure, why not. In fact, EA are so happy about it all, we'll give you 200K free game codes to give away with your next big GPU, if that's all right with you...'?

If you don't agree, then fine, it's still an unsubstantiated rumour to you, but for me, publishers 100% get paid in these partnerships when free codes are sent out with new GPUs; it's all part of the game.
 
Paying for game codes, or DICE helping AMD with Mantle, are definitely things that I agree would require some form of payment. But that in itself is a far cry from saying that Nvidia pay developers to include RTX.
Maybe they do, maybe they don't, I don't know.

I know one thing for certain: if Humbug were to make a full game out of the levels he was messing about with while toying with different physics systems, and he decided to do it with ray tracing, there is no way in hell he could say to Nvidia, "hey guys, pay me to use your RTX stuff."

Anybody can write a game using Microsoft's DXR, which works with Nvidia RTX, so I'm not at all convinced that Nvidia pay to include RTX. Maybe they did with launch titles, who knows.
 
The RTX userbase is probably <1%, so which publishers are going to absorb the coding workload into their title for free?

Does anyone else in here think RTX support isn't Nvidia-bankrolled? The same goes for AMD titles too.
 
FFXV managed to look better with DLSS. I think Metro and BFV needed more time in the supercomputer oven, tbh. I feel DLSS in those two games was rushed to market.

With FFXV we know it works, so maybe DLSS will get better with time and patches?
 