
FidelityFX Super Resolution in 2021

Hard facts: "the tensors are hundreds of times faster..."
Here is the comparison with Pascal. Remember that unlike Pascal, RDNA 2 can also hardware-accelerate tensor operations.
[Image: NVIDIA Volta GV100 / Tesla V100 performance chart]


It will work on Nvidia cards no matter what. Maybe not as well (or as badly :D ) as it will work on AMD cards, but it will still work, even without their support.

So AMD are doing their bit to support Nvidia GPUs.
 
Like I said, blissfully blind when you do it.



Doesn't have any escape clauses, just a bold statement.

Nvidia wanted nothing to do with FreeSync until it did, yet you're happy to predict the future of FSR.

I would assume you are replying to D.P. because he has a habit of stating his opinion as fact, even when his opinion has been debunked. For example, stating FSR is pointless on consoles, yet I have posted evidence that MS have already confirmed they will use it for the Xbox Series X and S. Or he ignores that DLSS is proprietary and has limited application due to being compatible with RTX cards only.

Just ignore him and wait for actual reviews. Ironically, I am on the fence about whether it will be good or not, because I refuse to buy into AMD's marketing hype. So if it works then great; if not, we move on. Yet people like D.P. and Grim want it to fail because it came from AMD and not Nvidia.
 
I do remember a certain fanatical forum user putting a bet that Ryzen would never use 3200MHz RAM... he lost that bet, then didn't pay up either...
 
Do you have a time machine for that?

It's all well and good posters complaining about speculation on FSR, but then the biggest advocates of FSR are the ones not using facts.

FSR might have a bigger customer base, but we won't know yet. And this depends on many caveats, chiefly that FSR is actually half decent, even on Nvidia hardware.

I expect it will get better. Given some of the patents AMD have released and the state of the art, I expect AMD will be trying to keep their options open to allow temporal data in the future. This may not be possible on their current hardware as the ML models get much larger, but when AMD have their equivalent of Tensor Cores in RDNA3 they will then be in a position to do their own DLSS 1 to 2 type transition and ramp up the quality.

Here's one of those time machine posts presented as fact.
 
I would assume you are replying to D.P. because he has a habit of stating his opinion as fact, even when his opinion has been debunked. For example, stating FSR is pointless on consoles.


I never stated that, go quote me :rolleyes:


I almost forgot how terrible this part of the forum is
 
I think it's more to try and sideline DLSS rather than do Nvidia customers a favour.

I would say they took a negative and turned it into a marketing positive. FSR will be open source, so it was always going to be a thing on older Nvidia H/W, so why not play on that fact? Think of it as AMD saying "Nvidia forgot about you 1000-series owners, but we haven't". Some tech-tubers are already saying it was a good marketing ploy by AMD to show it on a GTX 1060.
 
I think it's more to try and sideline DLSS rather than do Nvidia customers a favour.


It is mostly to market FSR as a DLSS alternative and downplay DLSS as an advantage when buying an Nvidia GPU. They have absolutely no interest in making it work well on Nvidia GPUs, which is why AMD immediately walked back their remarks and said it would be up to Nvidia to implement it properly.
 
Here's one of those time machine posts presented as fact.
And where have I made a statement of a future event as if it were fact? I have provided a projection, and made that absolutely clear. You can see this by the repeated use of the phrase "I expect".


There is a clear difference between proclaiming something to be true, and making a prediction. Claiming that FSR will have greater game support than DLSS requires a time machine.

I have been extremely clear in stating what the current facts are, what is unknown, and what may happen in the future. These are three distinct discussion points.
 
Yeah, I definitely think it's a good thing. While DLSS has proved somewhat useful in the few games I have which support it, the fact that it's not in all games makes it much less appealing. If FSR was in all games, even if the image quality wasn't quite as good as DLSS, I would still find it a much more useful feature.
 
I'm honestly just curious to know if they sort out the blurring lol.


If you mean the blurring and ghosting from TAA, then no.

And for those who can only think AMD is god's gift to the gamer: AMD stated publicly to AnandTech that FSR does not use temporal accumulation and is purely spatial. Therefore it has to be applied after TAA, and all of TAA's blurring artifacts will still be present.
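To make the point concrete, here's a rough toy sketch (my own snippet, nothing to do with AMD's actual algorithm) of why a purely spatial upscaler can't undo softening that TAA has already baked into the frame:

```python
import numpy as np

# Toy illustration only: a purely spatial upscaler sees just the final
# TAA-resolved frame, so any blur TAA introduced is baked in before the
# upscaler ever runs.

def taa_soften(row, k=5):
    """Stand-in for TAA's softening: a simple box blur on one scanline."""
    kernel = np.ones(k) / k
    return np.convolve(row, kernel, mode="same")

def spatial_upscale(row, factor=2):
    """Stand-in spatial upscaler: linear interpolation, no temporal data."""
    x = np.arange(len(row))
    xi = np.linspace(0, len(row) - 1, len(row) * factor)
    return np.interp(xi, x, row)

sharp = np.zeros(64)
sharp[32:] = 1.0                      # a hard edge (e.g. an object silhouette)

soft = taa_soften(sharp)              # TAA runs first...
upscaled = spatial_upscale(soft)      # ...then the spatial upscaler

# The edge's steepest gradient shrinks after TAA and is NOT restored by
# spatial upscaling -- the high-frequency detail is already gone.
print(np.max(np.abs(np.diff(sharp))))     # 1.0 (hard edge)
print(np.max(np.abs(np.diff(upscaled))))  # much smaller (still soft)
```

Obviously real TAA and real upscalers are far more sophisticated, but the ordering problem is the same: whatever blur survives the TAA resolve is all the spatial pass has to work with.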
 
If you mean the blurring and ghosting from TAA, then no.

And for those who can only think AMD is god's gift to the gamer: AMD stated publicly to AnandTech that FSR does not use temporal accumulation and is purely spatial. Therefore it has to be applied after TAA, and all of TAA's blurring artifacts will still be present.
What did AMD ever do to you, D.P.? Can we mend this relationship? :p
 
Yeah, I definitely think it's a good thing. While DLSS has proved somewhat useful in the few games I have which support it, the fact that it's not in all games makes it much less appealing. If FSR was in all games, even if the image quality wasn't quite as good as DLSS, I would still find it a much more useful feature.


But here again is the caveat: "if FSR was in all games". AMD have stated publicly that FSR requires developer integration into the game engine. It is not a post-processing effect like RIS or the Sapphire TriXX thingy. Therefore it will depend entirely on whether developers want to support it, and there are many reasons why a developer may or may not.
 
What did AMD ever do to you, D.P.? Can we mend this relationship? :p


Nothing, I love AMD's CPUs, I think RDNA 2 is a fantastic leap forward, and the whole company is in a far healthier place than it was 5 years ago, which gives no end of positive benefits to absolutely everyone. I just don't like the circle jerk in the AMD fandom. I am more interested in the technology. DLSS is a fascinating advance; I work with deep learning professionally, so I love seeing this tech in consumer products. I am interested in FSR and what on earth AMD is up to, because frankly, if it doesn't use temporal accumulation or deep learning, I would be gobsmacked if the quality is anything it is cracked up to be, because information theory puts a hard limit on what is possible.
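For what it's worth, the "information theory hard limit" point is essentially the sampling theorem. A toy illustration (my own snippet, not any vendor's code): once a frame is sampled at low resolution, frequencies above the Nyquist limit alias, so two different ground-truth signals can produce the exact same low-res data, and no purely spatial upscaler can tell them apart even in principle:

```python
import numpy as np

# Toy aliasing demo: with only n samples, a signal with 1 cycle and a
# signal with n+1 cycles across the frame produce identical sample values.
n = 8                                            # low-res sample count
t = np.arange(n)                                 # sample positions

low_freq = np.cos(2 * np.pi * 1 * t / n)         # 1 cycle across the frame
high_freq = np.cos(2 * np.pi * (n + 1) * t / n)  # n+1 cycles: above Nyquist

# Sampled at only n points, the two are indistinguishable:
print(np.allclose(low_freq, high_freq))          # True -- aliasing
```

Temporal accumulation (jittering the sample positions frame to frame) and learned priors are the two known ways around this, which is exactly why DLSS 2 uses both and why a spatial-only approach has a ceiling.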
 
And where have I made a statement of a future event as if it were fact? I have provided a projection, and made that absolutely clear. You can see this by the repeated use of the phrase "I expect".


There is a clear difference between proclaiming something to be true, and making a prediction. Claiming that FSR will have greater game support than DLSS requires a time machine.

I have been extremely clear in stating what the current facts are, what is unknown, and what may happen in the future. These are three distinct discussion points.
"but when AMD have their equivalent of Tensor Cores in RDNA3 they will then be in a position to do their own DLSS 1 to 2 type transition and ramp up the quality."
 
I don't have high expectations. I think it will not look nice except on the highest setting, and even if by some miracle it looks fine, it will be trashed in many reviews because there is too much money involved in promoting DLSS.
On the other hand, I don't think AMD had any other option; those who talk about AMD making their own ML upscaler have no idea what they're talking about. OK, they could make one, spend a fortune, and then what? You still need to fight Nvidia trying to get your tech implemented in every game. And the ML would only work on RDNA 2.
Look at what happens in the laptop market and the dirty tricks Nvidia and Intel use there. And that market involves loads of money; it is much easier for Nvidia to control the PC gaming market.

Really, the only solution for AMD was to make an open-source feature that can also work on older generations. That can put a lot more pressure on the game devs than an RDNA 2-exclusive feature.
I don't think it is a gift; it is a battle in a big war, and I don't think AMD are happy promoting such a feature. The same goes for Nvidia: I don't think they are happy with AMD's "gift" to players, and they are not too happy with their own DLSS either, but it has helped them sell two generations already. Instead of investing harder in creating better cards, they paid far less money to the reviewers, and now there are a lot of gamers who think fake resolution is the future of gaming. :)
DLSS is good when you're selling a new gen, but it will not be as good when you need to convince owners of DLSS-capable cards to buy your new product. That's why I say Nvidia is not too happy with DLSS either.
 
Nothing, I love AMD's CPU's and I think RDNA 2 is a fantastic leap forwards and the whole company is in a far healthier place than it was 5 years ago, which gives no end of positive benefits to absolutely everyone. I just don't like to circle jerk in the AMD fandom. I am more interested in the technology. DLSS is a fascinating advance. I work with deep learning professionally so love seeing this tech in consumer products. I am interested in FSR and what on earth AMD is up to because frankly, if it doesn't use temporal accumulation or deep learning then I would be gobsmacked if the quality is actually anything it is cracked up to be because information theory puts a a hard limit on what is possible.
Fewer AMD users than Nvidia, so technically less circle jerking going on :D:p

PS: you use deep learning for architecture, if I recall correctly? Or am I remembering wrong (I often do)?
 