
FSR support for Nvidia graphics cards that are several years old?

You keep arguing against a point that nobody is making. Yes, G-Sync has some additional features that give it a slight edge, but ultimately the buying public (and most reviewers) feel those features were/are not always worth the asking price. Even mid-range FreeSync monitors give 95% of the experience that much more expensive G-Sync ones do. So being an open standard, cross platform, and as good or almost as good while making more financial sense is why FreeSync is now the de facto standard.

So we can apply the same logic to FSR and DLSS: if actual reviews of FSR show it is good tech that is as good or almost as good as DLSS, cross platform, and makes more financial sense for developers, then it will almost certainly become the de facto standard.
 
Can you really call DLSS a standard if it only works on a subset of Nvidia tech? It's more like proprietary tech. The role of a standard is to allow it to be used and to work across multiple different products, devices, manufacturers, etc.

A standard is a way to measure, a level of attainment.
 
A standard is a way to measure, a level of attainment.

DLSS was only a "standard" by virtue of the fact that it was the only option. So if (I stress if) FSR can provide most of the benefits while being an open standard and cross platform, then FSR will almost certainly become the de facto standard.
 
DLSS was only a "standard" by virtue of the fact that it was the only option. So if (I stress if) FSR can provide most of the benefits while being an open standard and cross platform, then FSR will almost certainly become the de facto standard.

Google the definition of standard.
 
My friend with a 1060 is not that happy though. Don't draw conclusions based on features alone without actual user opinions.

The image quality tradeoff will not be worth it for the majority of users; they would prefer upgrading instead. But I can see that with the stock issues it is also impossible to do that. This friend in particular wanted to get a 3060 but sadly he can't, thanks to stock issues. (In short, he would prefer buying an actual new GPU that is capable of rendering native 1080p at 60+ fps instead of upscaling with FSR, which will not be that "good" on anything besides RDNA 2. I'm pretty sure it will be specially tuned for RDNA 2 to use its computational power, but aside from Turing GTX cards, all GTX cards before that suck at any kind of INT8/FP16 throughput that AI scaling may require.)
 
Google the definition of standard.

Lol, clearly you haven't, because for something to be a standard there have to be alternatives.

Standard: something used as a measure, norm, or model in comparative evaluations

De Facto Standard: A de facto standard is a custom or convention that has achieved a dominant position by public acceptance or market forces. The term de facto standard is used to express the dominant voluntary standard, when there is more than one standard available for the same use.

If FSR is more widely adopted by developers and is cross platform in nature, then it will (by the above definition) become the De Facto Standard. It isn't a difficult premise to follow.
 
It will be the standard of making 4K games render at 1440p and making them look like 1512p in the best-case scenario XD
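For context on those numbers, here is a rough sketch of the internal render resolutions each FSR quality mode would upscale from, assuming AMD's published per-axis scale factors for FSR 1.0 (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x); this is a hypothetical illustration, not something from the thread itself:

# Hypothetical sketch: internal render resolutions implied by FSR 1.0's
# per-axis scale factors for a given output resolution.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,       # 3840x2160 output -> 2560x1440 internal render
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, mode):
    """Resolution FSR would render at internally before upscaling to the output."""
    scale = FSR_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

for mode in FSR_SCALE:
    print(mode, render_resolution(3840, 2160, mode))

The same arithmetic is why the 1080p case discussed further down looks less appealing: Quality mode at a 1920x1080 output means upscaling from roughly 1280x720, which gives a spatial upscaler much less detail to work with.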

DLSS 1.0 was truly woeful and I hated it and the utter gash IQ it gave, all in the name of some shiny shiny RT. But having said that, DLSS 2.0 and 2.1 look much better and actually work well in WDL and CP2077.
 
I'll narrow it down to make it easier.

You do realise I was using the "" to indicate I was being sarcastic? When FSR is released there will be two comparable solutions, and as such one will become more dominant and, by definition, the de facto standard.
 
1060 needs FSR

[Image: performance-3840-2160.png - a 3840x2160 performance comparison chart]
 
Tell me more about standards, I just can't get enough.

I have something else for you to consider: "What will be, will be".

Ultimately, if a game supports either DLSS or FSR, I'll be a happy bunny if it allows me to play at 4K on my RTX 3070.
 

Interesting, but I can't say I'm surprised. Hopefully Nvidia will optimise it in their driver updates, but I won't hold my breath; it's not in their interests to make it look close to DLSS 2.0 in image quality.

Isn't it in Sony and Microsoft's interests to encourage console developers to use FSR to enable high framerates at 4K resolution in new PS5 / Xbox Series X game releases? I would have thought this would be the easiest way to bring FSR to PC games, by porting FSR console releases.

There's some speculation about FSR on consoles here:
https://www.digitaltrends.com/computing/what-is-amd-fidelityfx-super-resolution/
 
More to it than just range though.
Of course there is; I'm outlining that open sync can do all, if not most, of what G-Sync can do. Funnily enough, G-Sync comes from open sync laptops ;)

Again, I'd rather have a choice of two options than no option at all. Regardless, whether one is better or not, it's still a choice.
 
A new perspective on the topic:

When a typical casual GTX user enables this crap in their game and sees how bad it is (and it will be, let's not kid ourselves), they will be prejudiced against it, and most of them won't deign to research and learn that it's better on RDNA 2 GPUs. Instead they will lean more towards DLSS, because no matter what happens, most people will still say "DLSS is still better, period".

This might make FSR look bad from a lot of gamers' perspectives. Just my 2 cents.
 
Of course there is; I'm outlining that open sync can do all, if not most, of what G-Sync can do.

As per my post here https://forums.overclockers.co.uk/posts/34839975, adaptive sync has some missing features and/or more limited capabilities - there are some things having the FPGA allows you to do that current monitor hardware implementations can't, as adaptive sync is largely based around pressing capabilities like panel self-refresh into use in ways they weren't originally intended.

Funnily enough, G-Sync comes from open sync laptops ;)

eDP variable refresh rate existed long before G-Sync, but the link you are inferring isn't really true in the way you mean it. It was used to show off a proof of concept, but feature-wise G-Sync isn't just a fork of it - it is a ground-up design, even though there are similarities and shared areas. (IIRC "G-Sync" on laptops is in many cases just regular adaptive sync with all the same limits relative to what I said above - not sure if any use the FPGA G-Sync approach.)

EDIT: The embedded/professional implementation of VRR seems to be part of the reason it took so long to come to desktop and/or why it is such a mess, as apparently some who do professional signage (such as air traffic control, where such features are used) were resistant to a standard in case it impacted their market.
 
As per my post here https://forums.overclockers.co.uk/posts/34839975, adaptive sync has some missing features and/or more limited capabilities - there are some things having the FPGA allows you to do that current monitor hardware implementations can't, as adaptive sync is largely based around pressing capabilities like panel self-refresh into use in ways they weren't originally intended.



eDP variable refresh rate existed long before G-Sync, but the link you are inferring isn't really true in the way you mean it. It was used to show off a proof of concept, but feature-wise G-Sync isn't just a fork of it - it is a ground-up design, even though there are similarities and shared areas. (IIRC "G-Sync" on laptops is in many cases just regular adaptive sync with all the same limits relative to what I said above - not sure if any use the FPGA G-Sync approach.)

EDIT: The embedded/professional implementation of VRR seems to be part of the reason it took so long to come to desktop and/or why it is such a mess, as apparently some who do professional signage (such as air traffic control, where such features are used) were resistant to a standard in case it impacted their market.

I think maybe you misread my post and should perhaps read it again; the hint is MOST. And again, the point is the same: most people want a choice, and that's what we have with the current offering. A choice of two bad choices vs no choice is STILL a choice...

And yes, eDP is different, but I'm still correct: we wouldn't have any sync, be it FreeSync or G-Sync, if not for eDP, which is effectively open sync in laptop form, regardless of whether it's called eDP or not. PS: note I didn't say G-Sync, I said open sync.

Quote below for you:

"Interestingly, G-Sync for laptops makes use of the embedded DisplayPort (eDP) standard, a standardised interface for hooking up display panels directly to internal graphics cards. On the desktop, G-Sync can only be used with compatible monitors that contain Nvidia's G-Sync module.

According to Nvidia, the reason desktop displays need a G-Sync module is that it provides a much more controllable end-to-end solution for consistent performance. However, for G-Sync laptops, there's no module. Instead, the display is directly controlled by the GPU, which pulls double duty as both scaler and graphics card. G-Sync exploits this connection and the variable timing and panel self-refresh functionality built into eDP, effectively implementing G-Sync in software.

The more technically minded out there will note that this is very similar to how AMD's FreeSync works on the desktop, the tech being based DisplayPort Adaptive-Sync, which was in turn based on eDP."
 
Yeah, that's only when gaming.
My friend with a 1060 is not that happy though. Don't draw conclusions based on features alone without actual user opinions.

The image quality tradeoff will not be worth it for the majority of users; they would prefer upgrading instead. But I can see that with the stock issues it is also impossible to do that. This friend in particular wanted to get a 3060 but sadly he can't, thanks to stock issues. (In short, he would prefer buying an actual new GPU that is capable of rendering native 1080p at 60+ fps instead of upscaling with FSR, which will not be that "good" on anything besides RDNA 2. I'm pretty sure it will be specially tuned for RDNA 2 to use its computational power, but aside from Turing GTX cards, all GTX cards before that suck at any kind of INT8/FP16 throughput that AI scaling may require.)

So he is not happy about something that is free and might or might not work? He has nothing to lose.



But to be fair, I don't think it will do anything for anyone at 1080p.
 