What I meant by that is the comparison for assessing its worth should be between FSR and the alternative upscaling solutions.
So why would you need another upscaling method in UE5 games if it's that good? Just use TSR and that's it. The quality is good enough and no other method will give you more FPS.
It's not just about UE5 though, and the point is that if you can't even match their implementation then you cut out a large part of the market in terms of how worthwhile it is. For example, Nvidia decided they'd just be the best, so they put out DLSS as a quality alternative (albeit one that requires hardware buy-in). So why spend resources there at all? It's not like AMD is drowning in software devs; they have plenty of other things to work on, such as their abysmal support in most professional workloads, particularly anything AI-related. That's why I'm saying this is just a marketing stunt, no different from what DLSS 1.0 was, except there NV used that influence to buy time and then push out something worthwhile: DLSS 2.0. In Nvidia's case they could do that because an AI-based approach can keep being retrained and improved; in AMD's case, with what they've chosen to do, they're stuck, and nothing short of a complete rework (and abandoning a lot of other users, maybe including consoles) can compete. Except by doing it this way they've spent a lot of time badly and are further behind than they were initially.
And besides, they could've done it much better by putting forth a general TSR-like equivalent (and thus accessible to consoles too), which at least would've been helpful: that's a variant every dev has access to (not just Unreal devs), it would save them some time, and it's still useful tech. Instead, by choosing to restrict themselves to spatial information only, AMD picked literally the worst possible solution qualitatively, barely better than doing nothing and just telling devs to add CAS (à la the FidelityFX CAS upscale). This is basically FXAA 1.5 + sharpening.
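To make the spatial-vs-temporal point concrete, here's a toy sketch (my own illustration in Python/NumPy, not AMD's or Epic's actual code; real FSR is a shader pass with an edge-adaptive filter, not plain resampling): a spatial-only upscaler only ever sees the current low-res frame, so all it can do is resample and sharpen, while a temporal one also gets last frame's accumulated result reprojected via motion vectors, which is where the genuine extra detail comes from.

```python
# Toy illustration only - NOT AMD's or Epic's actual algorithms.
# Shows the structural difference: spatial-only upscaling vs temporal accumulation.
import numpy as np
from scipy.ndimage import zoom, uniform_filter

def spatial_upscale(frame, scale=2.0, sharpen=0.25):
    """FSR-1.0-style idea: upscale one low-res frame, then sharpen.
    No new information is available, so 'detail' is only crisper edges."""
    up = zoom(frame, scale, order=3)      # resample (FSR actually uses an edge-adaptive filter)
    blur = uniform_filter(up, size=3)     # cheap blur for an unsharp-mask-style sharpen (CAS stand-in)
    return up + sharpen * (up - blur)

def temporal_upscale(frame, history, motion, scale=2.0, blend=0.9):
    """TSR/DLSS-style idea: blend the upscaled current frame with last frame's
    accumulated result, reprojected via motion vectors, so sub-pixel samples
    from previous (jittered) frames accumulate into real extra detail."""
    up = zoom(frame, scale, order=3)
    reprojected = np.roll(history, shift=motion, axis=(0, 1))  # toy stand-in for motion-vector reprojection
    return blend * reprojected + (1.0 - blend) * up

# Usage (shapes and names are mine, purely illustrative):
lowres = np.random.rand(540, 960).astype(np.float32)   # 960x540 internal render
print(spatial_upscale(lowres).shape)                    # -> (1080, 1920)
print(temporal_upscale(lowres, np.zeros((1080, 1920), np.float32), motion=(2, -1)).shape)
```

The point of the sketch is just that the spatial path has nothing but the one frame to work with, which is why it tops out at "upscale + sharpen", whereas the temporal path keeps integrating samples over time.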
Is it just me or is history repeating itself here?
I am getting Freesync vs Gsync vibe.
I remember Gsync being the better choice because it had the expensive ~£100 hardware module.
Well look how that ship sailed.
If AMD can pull off FSR so it looks like native, or thereabouts (because, like DLSS, I'm sure it won't be perfect), then does it really matter what approach each graphics manufacturer chooses?
So long as all users get this upsampling feature, I couldn't care less that DLSS gives a slightly better image where you'd have to zoom in 10x to spot the detail on a leaf.
It's not the same at all. With Freesync vs Gsync, FS could be qualitatively equivalent to GS; the difference was that monitor vendors generally chose to skimp on QA, so it didn't always end up that way. Here, on the other hand, FSR will never be equivalent to DLSS either qualitatively or performance-wise. So when you weigh up that feature it will end up skewing disproportionately in favour of NV GPUs, and AMD will have to be that much faster without it (LOL, gl) if buying one is to make sense, shortages aside. Never mind how far behind they are in RT performance; if we add all that up, RDNA 3 vs Hopper/Lovelace will be even more of a slaughter than RDNA 2 vs Ampere has been, except here they've been lucky with the shortages (and with it still being a transition period between the previous gen and what's properly next-gen).
Tbh I don't know why I even keep paying attention to this; I need to get off my bum and arrange a sale for my 6800. I've just been hesitant to do anything in person, what with the bug and all. It's clear to me now that AMD is going to go into another coma period where they try to live off solely being in the console space while recalibrating for a future when they (hope to) catch up to NV, but right now they're making all the wrong moves on the GPU front, and having an AMD GPU will be a major mistake for the next two gens (at least).
Time to keep an eye out for an LHR 3060 Ti.