
FidelityFX Super Resolution in 2021

Yeah, as do other games. DLSS, at least in its second implementation, can be very good; no one is denying that, so I don't see the point you're making?

Well ...

The future is here: worse image quality and performance. Two for the price of one! :p

Was posted after ...

I find RT to be a bit like VR: you don't get to experience it from screenshots. Even video doesn't give the full experience, because you're not the one in control.

Here at 9:10 you can clearly see why the area is lit the way it is, due to other light sources, which just isn't apparent from a screenshot.

I do have RT you know :)

Also, you do know that Digital Foundry are known to have taken money from Nvidia to write misleading crap?

/sigh

PS: What is going on with the levels of paranoia? Digital Foundry, WCCFtech, people with some working knowledge of the industry are all just fanboys and shills who only love Nvidia? Even AMD's poster child commented on how much DF liked RE Village. Then, after they posted a very restrained and justified PC review that was not so favourable, the same person claimed that DF were just following the money. Wouldn't it be simpler just to point out what advantage AMD brings to the table in comparison, rather than all this BS?
 
This is a very good point. https://youtu.be/X0LyjJpGPhI?t=1559

Right now Nvidia are pushing hard for DLSS to be part of mainstream reviews. Of course they do, because with it they can point at longer bars in charts and say "look, bigger bar, we better".

Well, now AMD can do that too, so in practice they have just nullified that and we can get back to native rasterised performance, with all the "how much longer can we make the bars by using a resolution switch" stuff hived off into a separate section where BOTH fight it out, resolution-switching their way to the longest bars.
-------------

@Wrinkly The instant one of these journalist outlets takes a vendor's money for the editorials they pump out to their audience, they are an employee of that vendor, and I will treat them as such; in this case, as a marketing outlet for Nvidia.
 

As I said originally, my reason for upgrading from a 1080Ti was raytracing. The rasterisation performance of the 1080Ti was fine for me at 1440p/60. Forbes appear to have understood where the current gen is positioned in their 3080Ti review - https://www.forbes.com/sites/antony...6900-xt-which-should-you-buy/?sh=7d19ce782219


Do you have a link to what happened?
 

Difference is, Digital Foundry do extremely in-depth videos with "evidence" to back up their claims, unlike 99% of people on this forum.

Also, wasn't it Hardware Unboxed and/or Gamers Nexus that got "warnings" from Nvidia for saying something out of line or whatever? They also provide very good in-depth comparisons showing the reasoning for why a lot of Nvidia users will like and prefer DLSS over native res + TAA, but they must be "shills" too, right :rolleyes:
 

DF themselves admitted it; they had to. It was an exclusive RTX 3080 review about a week before the NDA was lifted for everyone else, and a number of the mainstream outlets then tore that review to shreds for all the misleading crap that was in it.

I could prove it to you, but the evidence is a bit of a data dump and I cannot be bothered to put the time and energy into data-dumping this thread. If you didn't already know about it and you don't believe me, I don't care.


Any skilled person can prove a lot of half-truths and lead you into believing alternative truths; it's called marketing.

Why not save me a lot of data dumping and pull up the DF RTX 3080 review and show me all your truths in that?
 

HUB were told off by Nvidia for constantly saying DLSS sucked. It was only once DLSS 2.0 was introduced that HUB said it was now worth it. The problem I have with DLSS is not from a technical level but from an adoption level, and that it is proprietary and limited to RTX GPUs only.

So if FSR is DLSS 1.5 quality or better but gains far higher adoption, I will be happy.
 

I believe you, I've just never seen the article. I've noticed people mention this before.
 
I could give you a link but it is easily found on Google. Essentially they were implying the RTX 3080 was ~80% faster than the 2080 at 4K, and many then used that to conclude it was ~50% faster than a 2080Ti. Yet when scrutinised, it was under very limited scenarios where the RTX 2080 was running into 8GB VRAM limits. Actual launch day reviews showed it being ~50% faster than a 2080 and ~25% faster than a 2080Ti.
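
(To spell out the arithmetic, assuming the usual ballpark of a 2080Ti being roughly 20% faster than a 2080 at 4K: 1.80 / 1.20 ≈ 1.50, which is where the "~50% faster than a 2080Ti" extrapolation came from, while the launch-day figure of ~50% over the 2080 works out to 1.50 / 1.20 ≈ 1.25, i.e. only ~25% over the 2080Ti.)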

EDIT: Another thing I remember being odd is that they did not actually give FPS vs FPS numbers, only percentage increases, so a lot of people couldn't go and compare against the FPS they got on their own 2080, for example.

Here is the disclaimer DF had to include.

Full disclosure: I can bring you the results of key tests today, but there are caveats in place. Nvidia has selected the games covered, for starters, and specified 4K resolution to remove the CPU completely from the test results and in all cases, settings were maxed as much as they could be. The games in question are Doom Eternal, Control, Shadow of the Tomb Raider, Battlefield 5, Borderlands 3 and Quake 2 RTX. Secondly, frame-time and frame-rate metrics are reserved for the reviews cycle, meaning our tests were limited to comparisons with RTX 2080 (its last-gen equivalent in both naming and price) and differences had to be expressed in percentage terms
 

Ouch.
 

We're not talking about their "3080" or any GPU/hardware reviews, again moving the goalposts... We're talking about their in-depth articles on things like DLSS and ray tracing, where they use "comparisons". Also, they get footage/info from game developers, e.g. the Metro developers sent them their workflow for ray tracing vs rasterised, something not a single person on this forum has access to. If those comparisons were BS then feel free to post something that goes against their findings; it doesn't have to be from you, but surely if their content is so far from the truth (on the topic at hand, i.e. DLSS), someone out there must have disproved their claims, right???

What about Gamers Nexus and Hardware Unboxed, who have given DLSS 2 the credit it deserves (wasn't it Gamers Nexus who said something about DLSS 2 in Cyberpunk being "better than native"?). Suppose they must be shills too???


That was it, and rightly so; DLSS 1 was a complete waste of everyone's time.
 

That's a lot of "what-absolutism" I said DF are a marketing arm for Nvidia, i didn't say everything they have ever posted is a lie. You're not going to get me on the defensive by spewing a whole lot of out of context undefined crap for me to fall into justifying myself round in circles. Not happening.
 
The whole RT thing: I turned it on, got playable frames, but meh, I prefer lots of frames with less visual noise.

But I only play MP games.
 
You don't get it. Mario Kart isn't on PC.

It's completely relevant. Nvidia tried to lock you into their hardware with G-Sync and it didn't work, because Nvidia are nowhere near as big as AMD in the world of gaming.

Wait... What? Switch uses DLSS. So, fake 4K? Urgh.

I find it amusing that FSR to me means First Strike Rounds (D-shaped paintballs with high accuracy and better distance), so I will have to continually remind myself which setting I am in when I read the acronym.
 
I don't get this. With the 3080, for example, they were very clear about the circumstances of the review. They've also done a boatload of stuff with Microsoft and covered the console architecture ahead of launch, which is AMD. Add to that, up until recently there hadn't been any direct match from AMD regarding ray tracing and DLSS.
 

Urghh, milsim roots :P
 