Fidelity Super Resolution in 2021

You'd have to be a complete numbskull to believe FSR is the same as TrixxBoost. If it were, AMD would have implemented it in the driver without having to work with devs. Try harder mate.

Already explained. AMD didn't want to put it into the driver even though they easily could; if it's in their driver, then Nvidia and Intel won't use it. By putting it into the game, you force Nvidia and Intel GPUs to use it.
 
Hot take: AMD is only trying to push FSR onto Nvidia so it can compete in a level playing field

AMD has no answer for DLSS. If they can push developers to use FSR, they can get more favourable comparisons where Nvidia doesn't have the clear advantage in image quality. FSR = trying to lower your competitor's superior image quality so you only have to compete on framerate.

let's see how many jimmies this rustles
 
Hot take: AMD is only trying to push FSR onto Nvidia so it can compete in a level playing field

AMD has no answer for DLSS. If they can push developers to use FSR, they can get more favourable comparisons where Nvidia doesn't have the clear advantage in image quality. FSR = trying to lower your competitor's superior image quality so you only have to compete on framerate.

let's see how many jimmies this rustles
Your statement is most likely correct but your reasoning is questionable.
 
To be fair, HDR600 was considered to be quite decent back in 2016 :D
PC monitors are only 5 years behind TV tech..

And you could say the same thing about TV only just catching up to monitors in refresh rates and response times.

PC gaming monitors over the last couple of years have made some very good advancements, with IPS panels showing amazing response times and refresh rates to compete with TN panels.

Show me a TV I could buy 5 years ago with the specs of the LG 27GP950.

I'll wait.....

Edit: 6am

As I expected Grim5 posts rubbish and runs away when proven wrong.
 
it's the same thing ya - Trixx is a solution that only works on Sapphire, that's the problem - FSR packages the feature into an API that's compiled into the game for maximum compatibility. Same result, different implementation

You obviously haven't used Trixx Boost matey.
I have used it on my old Sapphire 5700XT and it is nothing like FSR or DLSS. I initially thought it was some upscaling tech like DLSS, but it certainly isn't. It simply creates a custom resolution slightly lower than your native monitor resolution and then relies on standard GPU scaling to fill the screen, so you get a small boost in performance due to the lower resolution. Taking a screenshot of the game results in an image with the attributes of the custom resolution, not the upscaled monitor resolution. It does not attempt to process the image at all like FSR or DLSS, etc.
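To make the distinction concrete, here's a minimal sketch of what a custom-resolution boost of that kind amounts to: render at a slightly reduced resolution and let ordinary GPU scaling stretch it to native. No image processing is involved. The 85% scale factor and the multiple-of-8 snapping are illustrative assumptions, not Sapphire's actual values.

```python
def boosted_resolution(native_w: int, native_h: int, scale: float = 0.85) -> tuple:
    """Hypothetical custom render resolution, snapped down to multiples of 8
    (many display modes expect even alignment). Illustration only."""
    w = int(native_w * scale) // 8 * 8
    h = int(native_h * scale) // 8 * 8
    return w, h

def pixel_savings(native_w: int, native_h: int, scale: float = 0.85) -> float:
    """Fraction of pixels no longer rendered each frame at the boosted res."""
    w, h = boosted_resolution(native_w, native_h, scale)
    return 1 - (w * h) / (native_w * native_h)

# e.g. a 4K display: render ~28% fewer pixels, then let GPU scaling stretch it
print(boosted_resolution(3840, 2160), pixel_savings(3840, 2160))
```

The entire "boost" is the reduced pixel count; the screenshot observation above follows directly, since the framebuffer really is the smaller resolution.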

The fake BS you are propagating is not going to fool anyone. Jog on if you got nothing of worth to contribute.
 
What I meant by it is - the comparison for assessing its worth should be between alternative upscaling solutions & FSR.



It's not just about UE5 though, and the point is if you can't even match their implementation then you cut out a large part of the market in terms of how worthwhile it is. F.ex. Nvidia decided they'll just be the best so they put out DLSS as a quality alternative (but which requires hw buy-in). So then why spend resources there at all? It's not like AMD is drowning in software devs, they got plenty of things to work on, such as AMD's abysmal support in most professional workloads particularly anything A.I. related. That's why I'm saying this is just a marketing stunt, no different than what DLSS 1.0 was, except there NV used that influence to buy time and then push out something worthwhile - DLSS 2.0. In Nvidia's case they could do that because that's the nature of A.I. In AMD's case with what they've chosen to do they're stuck and nothing short of a complete rework (and abandoning a lot of other users, maybe including consoles) can compete. Except by doing it this way they spent a lot of time badly and are further behind than they were initially.

And besides, they could've done it much better by putting forth a general TSR-like equivalent (and thus accessible to consoles too) which at least would've been helpful because then that's a variant every dev has access to (not just Unreal) and that will save them some time and still is a useful tech. Instead by choosing to restrict themselves to spatial info only AMD chose literally the worst possible solution qualitatively, with it being barely better than not doing anything and just telling devs to add CAS (ala FX CAS upsample). This is basically FXAA 1.5 + sharpening.
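For reference, a purely spatial "upscale + sharpen" pipeline of the kind being described (resample, then apply a contrast-adaptive-style sharpen) can be sketched in a few lines. This is a generic illustration with made-up parameters, not AMD's actual FSR or CAS code.

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Plain bilinear resample of a single-channel image (no extra info used)."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Unsharp mask with a 3x3 box blur; a crude stand-in for CAS-style sharpening."""
    p = np.pad(img, 1, mode="edge")
    blur = sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)
```

The key point of the criticism: nothing in this pipeline can add detail that isn't already in the low-resolution input, which is exactly the limitation temporal or ML-based reconstruction is meant to address.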



It's not the same at all. With Freesync vs Gsync, FS could be equivalent qualitatively to GS, the difference being that in general monitor vendors chose to skimp on QA so it wouldn't always end up like that. Here, on the other hand, FSR will never be equivalent either qualitatively or performance-wise to DLSS. So when you weigh up that feature it will end up skewing disproportionately in favour of NV GPUs, so AMD will have to be that much faster without it (LOL gl) if buying one is to make sense, sans perpetual shortages. Never mind how far behind they are in RT performance; if we add all that up, RDNA 3 vs Hopper/Lovelace will be even more of a slaughter than RDNA 2 vs Ampere has been, except here they've been lucky with shortages (and that it's still a transition period between the previous gen and what's next-gen).

Tbh I don't know why I even keep paying attention to this, I need to get off my bum and arrange a sale for my 6800. Just been hesitant to do in-person anything what with the bug and all. It's clear to me now that AMD is going to go into another coma period where they try to live off of solely being in the console space while re-calibrating for a future when they (hope to) catch up to NV, but right now they're making all the wrong moves on the GPU front and having an AMD GPU will be a major mistake for the next 2 gens (at least).

Time to keep an eye out for an LHR 3060 Ti.

A lot of people complained AMD had no DLSS, now that they have people complain they have it.

Can you see why people get so cynical?
 
Your statement is most likely correct but your reasoning is questionable.

This, open standards are preferable to proprietary lock-in every time.

I don't get why people white knight everything Nvidia do in trying to lock you into their hardware and try their hardest to crap on everything AMD do to make it available to everyone. It's like a deep-seated hate of AMD for constantly stealing their exclusive Nvidia features and giving them away.
 
This, open standards are preferable to proprietary lock-in every time.

I don't get why people white knight everything Nvidia do in trying to lock you into their hardware and try their hardest to crap on everything AMD do to make it available to everyone. It's like a deep-seated hate of AMD for constantly stealing their exclusive Nvidia features and giving them away.

I agree free-to-use open standards are preferable. I also believe AMD is a better company overall. The problem is that the proprietary crap that Nvidia pulls is simply better.
  • I was happy to keep the FPS provided by my 1080Ti and so when I upgraded it was simply for RT. Ampere's RT performance is up to 100% more than RDNA2.
  • I've been looking for a decent Freesync ultra wide monitor for my 3080, yet everything I look at mentions flicker at low frame rates.
  • I am very happy to see AMD launch FSR. It looks great for what it is, but again DLSS provides better image quality and for good reason.
And the final kick in the teeth? The price for the 6800XT is the same as the 3080.

So am I a fanboy white knighting Nvidia, while crapping on AMD, or just a consumer buying the best products at the time?

Our favourite website has a new article - No, AMD’s FSR (FidelityFX Super Resolution) Is Not A DLSS Alternative, And Here Is Why You Should Care - https://wccftech.com/no-amds-fsr-fi...-alternative-and-here-is-why-you-should-care/.
 
AMD won't optimise FSR for Nvidia cards like the 1000 series; Nvidia has to do it, they said. So if Nvidia doesn't join in to help, they probably won't get good results.

From WCCFtech -
Update 6/2/2021
AMD's Scott Herkelman has stated that they have no intention of optimizing FSR for NVIDIA GPUs and that NVIDIA should do that work. While it would have been a completely reasonable expectation in normal circumstances, the fact that AMD expounded on NVIDIA support, absorbed a ton of good press on this and is now basically back tracking makes it seem like a bait and switch situation. This also implies that FSR for NVIDIA users will be optimized only for Godfall unless NVIDIA wants to adopt the technology (which, in my opinion, they absolutely should for non-RTX cards).

AMD need to be careful with that. Nvidia are likely to put an FX at the end and add some proprietary code to improve it.
 
A lot of people complained AMD had no DLSS, now that they have people complain they have it.

Can you see why people get so cynical?

They might have something comparable to DLSS 1.0, but it is doubtful they have something that can compete with DLSS 2.0. We have to wait and see.


But who exactly is complaining that AMD are pushing FSR?


You have to understand, people are cynical. When Nvidia's DLSS 1.0 came out it was heavily mocked, with all kinds of claims about how image upscaling can never be as good as native and you can't create information out of thin air, plus jokes about CSI-style magnification - now those very same people are happily claiming that AMD can do this magic and that image upscaling is fantastic. All the while ignoring that AMD's current stated solution does not have access to the additional temporal information that provably allows DLSS 2.0 to match or exceed native resolution (with some caveats).

It would help if the people claiming AMD's FSR is going to be so fantastic first apologize and retract their statements about the impossibility of ML based image reconstruction.
 
From WCCFtech -


AMD need to be careful with that. Nvidia are likely to put an FX at the end and add some proprietary code to improve it.


Since DLSS 2 is plug n play for most game developers now, any top game title that is looking to support FSR will likely add DLSS, especially with nvidia's developer relations.
 
Since DLSS 2 is plug n play for most game developers now, any top game title that is looking to support FSR will likely add DLSS, especially with nvidia's developer relations.

I was thinking of the older GTX cards. Perhaps Nvidia could use some of the DLSS data to enhance FSR within the GTX range.
 
I agree free-to-use open standards are preferable. I also believe AMD is a better company overall. The problem is that the proprietary crap that Nvidia pulls is simply better.
  • I was happy to keep the FPS provided by my 1080Ti and so when I upgraded it was simply for RT. Ampere's RT performance is up to 100% more than RDNA2.
  • I've been looking for a decent Freesync ultra wide monitor for my 3080, yet everything I look at mentions flicker at low frame rates.
  • I am very happy to see AMD launch FSR. It looks great for what it is, but again DLSS provides better image quality and for good reason.
And the final kick in the teeth? The price for the 6800XT is the same as the 3080.

So am I a fanboy white knighting Nvidia, while crapping on AMD, or just a consumer buying the best products at the time?

Our favourite website has a new article - No, AMD’s FSR (FidelityFX Super Resolution) Is Not A DLSS Alternative, And Here Is Why You Should Care - https://wccftech.com/no-amds-fsr-fi...-alternative-and-here-is-why-you-should-care/.


Just to point out, Nvidia's RTX is not proprietary. Everything RTX-related from a gaming perspective uses DX12 or Vulkan APIs. Nvidia worked directly with Khronos to define the open-standard ray tracing API calls. AMD cards can freely run all the RTX games; AMD just have to provide the driver support. The problem is, the performance isn't there.
 
I was thinking of the older GTX cards. Perhaps Nvidia could use some of the DLSS data to enhance FSR within the GTX range.


The thing is, if FSR sticks to spatial only data then technologies like TSR in UE5 will be far superior.

You also get diminishing returns on older hardware. All of these techniques work best on graphically complex modern games with ray tracing and all the bells and whistles, which are hard to pump out at 4K. On older hardware the overhead of FSR may become more significant, and other bottlenecks in the game that don't scale with resolution (anything geometry related) start to dominate.
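The diminishing-returns argument is easy to see with a back-of-envelope frame-time model: split the frame into a part that scales with pixel count and a fixed part (geometry, CPU submission), then add the upscaler's own cost. All numbers below are invented for illustration.

```python
def frame_time_ms(pixel_ms: float, fixed_ms: float,
                  render_scale: float, upscale_ms: float = 0.0) -> float:
    """Toy model: pixel work scales with the square of the render scale,
    fixed work does not, and the upscaler adds a flat overhead."""
    return pixel_ms * render_scale ** 2 + fixed_ms + upscale_ms

# Pixel-bound game: rendering at 77% scale pays off handsomely.
heavy = frame_time_ms(pixel_ms=14.0, fixed_ms=2.0, render_scale=1.0)
heavy_fsr = frame_time_ms(14.0, 2.0, render_scale=0.77, upscale_ms=0.6)

# Geometry/CPU-bound game: the fixed cost swamps the pixel savings.
light = frame_time_ms(pixel_ms=4.0, fixed_ms=12.0, render_scale=1.0)
light_fsr = frame_time_ms(4.0, 12.0, render_scale=0.77, upscale_ms=0.6)

print(heavy / heavy_fsr)  # large speedup when pixel work dominates
print(light / light_fsr)  # barely any gain when fixed costs dominate
```

With these example numbers, the pixel-bound case speeds up by roughly 1.5x while the geometry-bound case gains under 10%, which is the point being made about older hardware and older games.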
 
https://wccftech.com/no-amds-fsr-fi...-alternative-and-here-is-why-you-should-care/

No, AMD’s FSR (FidelityFX Super Resolution) Is Not A DLSS Alternative, And Here Is Why You Should Care

While the feature will certainly help breathe new life into older GPUs, it is not a DLSS alternative and cannot be - any attempt to hype it up as a DLSS competitor would hurt the brand more than it helps.



TBH, it is possible that FSR is using some form of ML, and they have some patents, but overall their messaging has been mixed, and if there is a CNN at play the model complexity is going to be very low to reduce the overhead.

From AMD's released result on the 1060 in quality mode, it doesn't actually look like ML is being used at all; the blurring looks like standard spatial upscalers with sharpening on top. DLSS 1 would create details in most places but failed in cases where the base data was insufficient (e.g. re-creating text where no visible text is present in the low-res input). AMD's screenshots show a general lack of detail full stop, not just edge cases failing.

And the real trick that makes everything work so well in DLSS 2, and often better than native is the temporal accumulation of additional data combined with state of the art convolutional image reconstruction that requires significant processing resources.
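The temporal accumulation idea itself is simple to demonstrate in a toy form: blend each new (noisy, aliased) frame into a running history, and the effective sample count per pixel grows over time. Real implementations like DLSS 2 also reproject the history with motion vectors and run a learned reconstruction network; the sketch below shows only the accumulation ingredient, with illustrative noise levels.

```python
import numpy as np

def accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average: keep most of the history, blend in a little
    of the new frame. For iid noise, steady-state variance shrinks by a
    factor of alpha / (2 - alpha) versus a single frame."""
    return (1 - alpha) * history + alpha * current

rng = np.random.default_rng(0)
truth = np.full((64, 64), 0.5)            # the "real" image value
history = truth + rng.normal(0, 0.1, truth.shape)
for _ in range(50):
    frame = truth + rng.normal(0, 0.1, truth.shape)  # fresh noisy sample
    history = accumulate(history, frame)

# history now sits far closer to truth than any single frame does
print(float(np.sqrt(((history - truth) ** 2).mean())))
```

This is why a spatial-only approach is at a structural disadvantage: it sees exactly one noisy sample per pixel and has no history to converge with.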
 