AMD's FidelityFX, thoughts?

It's good to have options. The other thread just went south with pro- and anti-NV posts. More options are never a bad thing.

I'm enjoying my FreeSync screen and sharpening, and hoping to make use of AMD's upscaling on my old hardware while waiting for reasonable prices to return.

I doubt it will match DLSS 2.0 quality, but if you don't want to spend £800+ on a mid-range card...
 
I think everyone would be surprised if there weren't any bugs or performance issues to iron out at launch. Still, it's commendable that it's non-proprietary and works on older cards.
 
It supposedly works with Nvidia cards by default, which makes me think it's not even up to Nvidia; it looks like it's up to developers.
I believe what they said is that even though it runs on Nvidia cards, Nvidia needs to make driver optimisations to get good performance out of it. If they do, they might even implement it better than AMD, since they have spare Tensor and RT cores to offload it to when a game doesn't already support DLSS.
 
Doesn't that make less of a case for buying an AMD GPU, though?
Actually the opposite. If FSR becomes the standard the way FreeSync did, and if the leaks about AMD planning hardware support for FSR are right, then Nvidia could find itself in a tricky position: it has spent a long time developing DLSS, and it would also have to spend money and time building on top of FSR to make it work well on Nvidia cards and adapting the Tensor cores to the FSR implementation if it wanted to offload it to hardware.
 
FSR being an open implementation will definitely give it an advantage over DLSS in terms of adoption.

I don't think it will match DLSS's visual quality at first, but if it gets adopted by other vendors (like Intel) it will definitely improve.

I'm hoping for a FreeSync situation, where it ends up adopted as more of a standard. Things always end up better for consumers that way.
 
I think it's very possible that AMD has pulled off another FreeSync here: something that will ultimately prove more popular than the Nvidia alternative. My reason for thinking that is development time; it seems far easier to integrate into games than DLSS. Dunno, we'll have to wait and see, but it's really exciting.
 
My take: I love that it works on both AMD and Nvidia cards, which is very cool, but from what I've seen the quality is a bit rough. Upscaling from a lower resolution to your target is one thing, but if the result is as blurry as it appears to be, that rather defeats the purpose. I guess it's the age-old debate: are you after visuals or performance? I currently have a 1080 Ti, so DLSS isn't an option for me. When FSR comes along I'll give it a go and see.
 
TBH, all the current consoles run on second-gen AMD tech, so if anything that raises the expectations and the likely level of support.
 
I found the announcement a little underwhelming. In one of the comparisons, the FidelityFX image looked outright blurry. I've used DLSS, and on the higher presets it's genuinely difficult to tell the difference. I think this is going to be very similar to DLSS 1.0.
 
DLSS is a click of a button in Unreal or Unity. Can't get easier than that. Even for other engines, as long as the game supports TAA, which basically every modern game does, manual DLSS integration is a few days' work.
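As an aside, the reason TAA support matters so much: a TAA-capable engine already produces per-pixel motion vectors, which is most of what any temporal technique needs to reuse data from previous frames. Here's a toy numpy sketch of motion-vector reprojection (purely illustrative; NVIDIA's actual implementation is of course far more involved and filters/validates the history sample):

```python
import numpy as np

def reproject(prev, motion):
    """Warp the previous frame toward the current one using per-pixel
    motion vectors. motion[y, x] = (dy, dx) says how far the content
    now at (y, x) moved since the last frame, so we fetch the previous
    frame at (y - dy, x - dx). Toy nearest-pixel version."""
    h, w = prev.shape
    ys, xs = np.indices((h, w))
    src_y = np.clip((ys - motion[..., 0]).round(), 0, h - 1).astype(int)
    src_x = np.clip((xs - motion[..., 1]).round(), 0, w - 1).astype(int)
    return prev[src_y, src_x]

# A bright pixel at (2, 2) that moved down by one row shows up at
# (3, 2) in the reprojected history buffer.
prev = np.zeros((5, 5))
prev[2, 2] = 1.0
motion = np.zeros((5, 5, 2))
motion[..., 0] = 1.0  # uniform downward motion of one pixel
warped = reproject(prev, motion)
```

That history buffer is the "extra data" temporal approaches get to blend in, which is exactly what a single-frame filter doesn't have.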
 
I'm on the fence about it. On one hand it's nice to see competition, because that drives innovation, but this doesn't seem like a direct competitor to DLSS. From what I understand, DLSS runs far earlier in the rendering pipeline, so it doesn't interfere with things like the UI. AMD's solution seems to be a filter or shader applied after the entire frame is assembled, and a lot of people have picked up on how it seems to blur the image. There's also the fact that they said it would be multi-manufacturer compatible at their event, but afterwards clarified that Nvidia would have to implement its own version if it wanted it. Seemed a little like a bait and switch to me :(
 
Here comes the DLSS squad. I thought this was covered 50,000 times in the other thread ;)
 
Now that the plugin is available on the Unreal Engine Marketplace, developers making projects in Unreal Engine 4 can “start using DLSS right away,” said Lin. Adding DLSS support to a game will probably take a bit more work than simply flipping a switch, but Lin said that Nvidia has “been able to fine tune the integration to work well out of the box for the vast majority of content.”

So not as simple as just flicking a switch, but getting better.
 
Uh oh, DLSS fan boys.
 
It doesn't blur the image as such; the data simply isn't there to not be blurry. The current implementation seems to be well behind temporal upscaling approaches, never mind AI/DL-driven ones.
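To make the "data isn't there" point concrete: a purely spatial upscaler can only interpolate between the pixels it was given and then sharpen the result. A toy numpy sketch of an upscale-then-sharpen pass (illustrative only, not AMD's actual FSR algorithm):

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Spatial upscaling: every output pixel is a weighted average of
    input pixels, so no detail absent from the low-res frame can
    appear in the output -- it can only get smoother."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img, amount=0.5):
    """Unsharp mask: boost each interior pixel's difference from its
    local average. This restores apparent edge contrast, but it still
    cannot recover detail the upscale never had."""
    blur = img.copy()
    blur[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:] +
                        img[1:-1, 1:-1]) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

low = np.array([[0.0, 1.0],
                [1.0, 0.0]])
high = sharpen(bilinear_upscale(low, 2))  # 4x4 output, interpolated
```

A temporal or ML approach can do better precisely because it draws on information beyond the current low-res frame, whether previous frames or a trained prior.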
 