
really getting fed up with the posts stating RTX/DLSS does not work this gen

PS3 doesn't do any image reconstruction, poor attempt at trolling.

On a more constructive note - the brilliant mind of Mark Cerny is clearly hard at work, as Sony files a patent for a machine learning image reconstruction technique.

http://www.freepatentsonline.com/WO2020148810A1.html

"An information processing device for acquiring a plurality of reference images obtained by imaging an object that is to be reproduced, acquiring a plurality of converted images obtained by enlarging or shrinking each of the plurality of reference images, executing machine learning using a plurality of images to be learned, as teaching data, that include the plurality of converted images, and generating pre-learned data that is used for generating a reproduction image that represents the appearance of the object."

I do hope the PS5 eventually gets something more comparable to the greatness of DLSS 2.0
It's here: https://www.youtube.com/watch?v=YayktAvg2oo
 
From a third party developer I heard that DLSS 2.0 is currently in beta release period for developers and is not just something you can integrate if you want to. You need NV approval and, if I am not mistaken, they also need to assist the integration. How and when it will leave this beta period, I am not sure. I would imagine as soon as they think the quality and integration documentation is universally good.
- Alex from DF

Disappointing but expected, it remains mostly as a marketing tool.
 
Yea, judging by some posters here it really is working very well indeed.

To me it will get very interesting when integration becomes easy enough that most games release with it. That said, it is meant to be in all the key titles I am interested in that are coming out this year and next, so that is enough for me.
 

That's why DLSS 3.0 looks so interesting. The rumours are that it'll just hook any game that uses TAA - so... every game. Hopefully that will make it useful to the mass market.
 

From the DF videos it looks like DLSS up to 2.0 still depends heavily on having information such as motion vectors to work nicely, which is something that needs to be looked out for in any comparison.
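For what it's worth, that dependence is about the inputs: a DLSS 2.x-style temporal upscaler reprojects the previous frame's output along per-pixel motion vectors before blending in the new frame, so bad or missing vectors wreck the result. Below is a toy sketch of that reprojection-and-blend step; the names, the fixed blend weight, and the nearest-pixel fetch are all made up for illustration (the real thing uses a network to decide how to combine history per pixel), not NVIDIA's actual API.

Code:
# Toy illustration of why motion vectors matter for temporal upscaling:
# last frame's output is fetched from where each pixel was previously
# (reprojection) and blended with the current frame. Illustrative only.
import numpy as np

def reproject(history, motion):
    """Fetch last frame's colour from each pixel's previous location."""
    h, w = history.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    return history[src_y, src_x]

def accumulate(current, history, motion, blend=0.1):
    """Keep mostly reprojected history and mix in a little of the new frame.
    A learned upscaler would choose this weight per pixel instead."""
    return blend * current + (1.0 - blend) * reproject(history, motion)

# Toy usage: random frames, zero motion vectors.
frame   = np.random.rand(72, 128, 3)   # current low-res colour buffer
history = np.random.rand(72, 128, 3)   # previous reconstructed frame
motion  = np.zeros((72, 128, 2))       # per-pixel (dx, dy) in pixels
out = accumulate(frame, history, motion)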

Personally I'd still rather just use the hardware to do advanced anti-aliasing without the performance penalty that it usually incurs - the only place I'd accept DLSS is if it enables a proper implementation of ray tracing at good framerates.
 
That's why DLSS 3.0 looks so interesting. The rumours are that it'll just hook any game that uses TAA - so... every game. Hopefully that will make it useful to the mass market.
Let’s hope so. The question is, will it be as good as DLSS 2.0? I wonder how they will make it work. Interesting few months ahead :)
 
From a third party developer I heard that DLSS 2.0 is currently in beta release period for developers and is not just something you can integrate if you want to. You need NV approval and, if I am not mistaken, they also need to assist the integration. How and when it will leave this beta period, I am not sure. I would imagine as soon as they think the quality and integration documentation is universally good.
- Alex from DF
I doubt that 3.0 will be any different. Who started the rumor that it was automatic??
 
DLSS 3.0 will reportedly "work in any game with TAA" but it will require a Game Ready driver to do so, meaning developers will have to do some "specific programming per game to get it to work, but it should be easier than before".
...
"results vary wildly between games, but the overall quality is higher than before". An interesting note is that there is "some evidence NVIDIA will be turning DLSS 3.0 on by default, possibly overriding settings in some games and pushing benchmarking sites to use it in comparisons with RDNA 2".

https://www.tweaktown.com/news/7388...uld-work-on-any-game-that-uses-taa/index.html

Well, in the article they clearly imply that the same steps are needed to incorporate DLSS 3.0 as Alex from DF stated were required for DLSS 2.0.

Also, it looks like we are going back to reducing IQ for more FPS to show a larger FPS margin against RDNA 2. You read it straight from his mouth. So let's not become goldfish and forget these tactics, shall we.

And every game released in the Ampere benchmark review will use TAA. Every last one of them. Even if they have to add it to a game that never had it. And whatever else Nvidia will do to "override IQ settings" in those games to make sure their FPS numbers are up. The cat is out of the bag, but I doubt some of them will tell you, or label the graphs to show that it's on.

Dodgy as all get out!
 
Lol. You get people hellbent on selling DLSS, then you get ones like eastcoast hellbent on dismissing it and everything that is nvidia :p

End of the day I will judge it when it comes out. As I said way back last year, it does not matter what they do to achieve said image quality, all that matters is the end result.
 
https://www.tweaktown.com/news/7388...uld-work-on-any-game-that-uses-taa/index.html

Well, in the article they clearly imply that the same steps are needed to incorporate DLSS 3.0 as Alex from DF stated were required for DLSS 2.0.

Also, it looks like we are going back to reducing IQ for more FPS to show a larger FPS margin against RDNA 2. You read it straight from his mouth. So let's not become goldfish and forget these tactics, shall we.

And every game released in the Ampere benchmark review will use TAA. Every last one of them. Even if they have to add it to a game that never had it. And whatever else Nvidia will do to "override IQ settings" in those games to make sure their FPS numbers are up. The cat is out of the bag, but I doubt some of them will tell you, or label the graphs to show that it's on.

Dodgy as all get out!

I've said this from the very beginning when talking about Nvidia and DLSS. They will push this, incorrect IMHO, narrative of "better than native" to justify forcing it on in order to win benchmarks, either when they would have otherwise lost or to make it look like a landslide. FFing disgusting really, and if this is what really happens I will certainly not support it and instead give someone else my money when it's time to upgrade. I will be keeping an eye on the tech press to see how many pick this up, but I wouldn't be surprised if many of the smaller channels fall for this stunt.
 
What a blanket statement. Try shaking that magic 8 ball of yours again, I think it is defective.
Cracks me up when people say that. They just cannot possibly conceive of anything else.

I mean I am 99% sure I will be going Nvidia for my next card, yet I will stay objective and not lap up everything and go preaching the good word of Jensen :D
 
Lol. You get people hellbent on selling DLSS, then you get ones like eastcoast hellbent on dismissing it and everything that is nvidia :p

End of the day I will judge it when it comes out. As I said way back last year, it does not matter what they do to achieve said image quality, all that matters is the end result.
You cannot refute the article so you accuse me of being bent on dismissing Nvidia. Ok, you got me, I'm the author of the article. :rolleyes:

But having read some of your post history, it wouldn't matter what Nvidia did, you would still buy them. So it's not a surprise to me that you reaffirm that in your post. :p

I've said this from the very beginning when talking about Nvidia and DLSS. They will push this, incorrect IMHO, narrative of "better than native" to justify forcing it on in order to win benchmarks, either when they would have otherwise lost or to make it look like a landslide. FFing disgusting really, and if this is what really happens I will certainly not support it and instead give someone else my money when it's time to upgrade. I will be keeping an eye on the tech press to see how many pick this up, but I wouldn't be surprised if many of the smaller channels fall for this stunt.

Well, it's nothing new coming from them. I'm just surprised that he blew the whistle on them so early. I'm a bit surprised that this didn't get more traction.
 
I will buy it because I have an OLED TV that is G-Sync and my 4K monitor that is also G-Sync. I will also buy it because all the games I look forward to most will have RT and DLSS.

I have had more ATI/AMD GPUs than Nvidia, and also many more AMD CPUs than Intel. So no one can ever accuse me of being loyal to any one company or being against AMD etc.

I buy what I fancy. Have zero loyalty to either company, this usually means I can get attacked by both hardcore AMD and Nvidia fans. Does not bother me though, end of the day this is just a hobby and I enjoy coming here to discuss things with fellow enthusiasts. Sometimes it can get a little heated, but I am sure no one here takes it seriously, I know I don’t :D
 
that's sssssmokin :D

But one thing is certain. As a hobbyist with no loyalty, one shouldn't become upset enough to make accusations towards another hobbyist over concerns about certain practices revealed in an article. As you say, we come here for discussion with fellow enthusiasts... :D
 