AMD FSR 3.0 has exposed the ugly truth about most PC gamers

Soldato
Joined
15 Oct 2019
Posts
11,938
Location
Uk
Wrong. You'd have FSR FG just as good as DLSS FG then, which is not the case. Just like with every AMD vs Nvidia feature, AMD's is the 'we have X at home' one.

Nvidia's features are built to certain standards, and that's why most people continue to buy Nvidia and completely ignore AMD. The numbers don't lie. I wish things were different, but maybe Intel can shake things up a bit?
I was talking about Nvidia enabling FG on its own RTX cards, not AMD's free-for-all take on it.
 
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
Wrong. You'd have FSR FG just as good as DLSS FG then, which is not the case. Just like with every AMD vs Nvidia feature, AMD's is the 'we have X at home' one.

Nvidia's features are built to certain standards, and that's why most people continue to buy Nvidia and completely ignore AMD. The numbers don't lie. I wish things were different, but maybe Intel can shake things up a bit?

Pretty much. Again, it's great what AMD do, but they literally have no other choice lol...

- show up months/years later with said features
- said features then take months/years to get better, and even then I don't think they ever end up being better?
- market share of < 10% still?

I still see people cite G-Sync as an example of FreeSync winning and of there being no need for the G-Sync module, claiming it was just a way to lock customers in. Whilst that's true to an extent (shock horror, a company wanting to make a profit by getting people into an ecosystem and upgrading!!!! :eek:), as evidenced by actual experts in the field, the module still has its place, as stated (unless, once again, people have something to prove otherwise?)


And a very recent post by blurbusters on the matter:


Native G-SYNC is currently the gold standard in VRR, and it handles VRR-range-crossing events much better (e.g. the framerate can briefly exit the VRR range with a lot less stutter).

Using G-SYNC Compatible can work well if you keep your framerate well within the VRR range.

The gold standard is to always purchase more VRR range (e.g. 360-500Hz) to make sure your uncapped framerate (e.g. CS:GO at 300fps) always stays inside the VRR range. Then you don't have to worry about how well G-SYNC versus G-SYNC Compatible handles stutter-free VRR-range enter/exit situations.

Also, NVIDIA performance on FreeSync monitors is not as good as AMD performance on FreeSync monitors, in terms of stutter-free VRR behaviors.

Eventually people will need to benchmark these VRR-range-crossing events.

If you have a limited VRR range and your game has ever hit the top of the VRR range, then the best fix is to use VSYNC ON with a framerate cap approximately 3fps below the maximum refresh rate (or even a bit more -- sometimes 10fps below for some displays). Use VSYNC OFF in-game, but VSYNC ON in NVIDIA Control Panel.
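A minimal sketch of that capping rule in Python (my own illustration, not Blur Busters code; the 3fps and 10fps margins are just the figures quoted above):

```python
def safe_fps_cap(max_refresh_hz: float, margin_fps: float = 3.0) -> float:
    """Return a framerate cap that keeps a VRR display inside its range.

    Hypothetical helper following the quoted advice: cap roughly 3fps
    below the maximum refresh rate, widening the margin (up to ~10fps)
    for displays that misbehave near the top of their range.
    """
    return max_refresh_hz - margin_fps

# e.g. a 144Hz G-SYNC Compatible panel -> cap at ~141fps
print(safe_fps_cap(144))        # 141.0
# a fussier display -> use a bigger margin
print(safe_fps_cap(60, 10.0))   # 50.0
```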

Free/adaptive sync didn't come out until like 1-2 years after G-Sync, and when it launched it had a whole heap of issues: black screening, no LFC and terrible ranges.

Yes, to get the best, you have to pay the premium to get said experience months/years before the competition, but what's new here? The same goes for anything in life. If AMD were first to market with quality solutions, you can be damn sure they would be locking them down or going about it differently to get people to upgrade. AMD's frame generation reveal at their event was literally nothing but a knee-jerk reaction to keep up the appearance that they aren't falling behind Nvidia, hence why we still have only 3 titles with official FSR 3 integration and a questionable injection method (one that will get you banned if used in online games).

So the article summed it up perfectly:

Well, we all know why a lot of PC gamers picked up their pitchforks. It wasn't due to the extra input latency and it wasn't due to the fake frames. It was because DLSS 3 was exclusive to the RTX 40 GPU series, and most of them couldn't enjoy it. And instead of admitting it, they were desperately trying to convince themselves that the Frame Generation tech was useless. But now that FSR 3.0 is available? Now that they can use it? Well, now everyone seems to be happy about it. Now, suddenly, Frame Generation is great. Ironic, isn't it?
So yeah, the release of AMD FSR 3.0 was quite interesting. And most importantly, so were the mods that allowed you to enable FSR 3.0 in all the games that already used DLSS 3.0. Those mods exposed the people who hadn't tested DLSS 3 and still hated it. Hell, some even found AFMF to be great (which is miles worse than both FSR 3.0 and DLSS 3). But hey, everything goes out the window the moment you get a free performance boost on YOUR GPU, right? Oh, the irony…

I was talking about Nvidia enabling FG on its own RTX cards, not AMD's free-for-all take on it.

DLSS 3 is built around the optical flow accelerator. Bryan and a couple of other engineers have stated they could enable it, but it wouldn't work well, and then people would trash how bad DLSS 3 is, so it's better to keep it locked in order to retain the "premium" look. Of course, they could take a different approach like AMD, but then they'd have two solutions to maintain, and again, from their POV, what benefit is there in that? Who knows though, maybe they will do a "DLSS 3 compatible" version.
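For context, here's a toy sketch of what optical-flow frame interpolation means (my own CPU-side illustration using OpenCV's Farnebäck estimator; Nvidia's actual pipeline runs on the dedicated OFA and also folds in game motion vectors and depth, so treat this purely as a concept demo):

```python
import cv2
import numpy as np

def interpolate_midframe(frame_a, frame_b):
    """Synthesise a rough in-between frame from two rendered frames.

    Toy approach: estimate dense optical flow A -> B, then backward-warp
    frame A half a step along that flow. Real frame generation also
    handles occlusions, disocclusions and UI, which this does not.
    """
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Sample frame A at positions displaced by half the estimated motion
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```

Even this crude version falls apart on camera cuts and sudden viewpoint changes, which is exactly the failure mode the slowed-down frame captures later in this thread are showing.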
 
Soldato
Joined
15 Oct 2019
Posts
11,938
Location
Uk
Nvidia's FG is hardware based. An Nvidia engineer explained it pretty well around Ada's launch: the OFA (optical flow accelerator) inside Ada is up to 4-5 times faster than the OFA inside Ampere, and even with that 4-5x increase in performance, latency is still increased.
That's what they'll tell you, but unlike with Pascal, where they were more than happy to show you RT running, they haven't shown FG running on Ampere or Turing, and until they do I frankly won't believe them.
 
Soldato
Joined
19 Feb 2007
Posts
14,485
Location
ArcCorp
That's what they'll tell you, but unlike with Pascal, where they were more than happy to show you RT running, they haven't shown FG running on Ampere or Turing, and until they do I frankly won't believe them.

If they enable it on the 3000 series then you and I both know we'll be seeing herds of people screaming and crying, having full-on nervous breakdowns: "This doesn't work properly on Ampere, OMG Nvidia scammed me reeeeeeeeeeeeeeeeeeeeeeeee"... it ain't worth their time.
 
Soldato
Joined
18 Oct 2002
Posts
14,438
Location
West Midlands
If they enable it on the 3000 series then you and I both know we'll be seeing herds of people screaming and crying, having full-on nervous breakdowns: "This doesn't work properly on Ampere, OMG Nvidia scammed me reeeeeeeeeeeeeeeeeeeeeeeee"... it ain't worth their time.

I think it more likely that it was smoke and mirrors. Let us not forget that at the time of release there was a huge glut of 3000 series cards still to be sold, so why would you buy the 4000 series if not for an exclusive new feature, which is what made the cards 'faster'?
Not sure why anyone would defend them not showing it working, or sorry, not working, on the 3000 series, just to prove how big the 4000 hardware jump was and why the cards were worth buying over the 3000 even if you already owned a 3000.
 
Soldato
Joined
19 Feb 2007
Posts
14,485
Location
ArcCorp
I think it more likely that it was smoke and mirrors. Let us not forget that at the time of release there was a huge glut of 3000 series cards still to be sold, so why would you buy the 4000 series if not for an exclusive new feature, which is what made the cards 'faster'?
Not sure why anyone would defend them not showing it working, or sorry, not working, on the 3000 series, just to prove how big the 4000 hardware jump was and why the cards were worth buying over the 3000 even if you already owned a 3000.

Not everything has to be a conspiracy; sometimes things are just simple. The 3000 series lacks the dedicated silicon for it, and enabling it on the 3000 series would create more problems than it's worth. That's it.
 
Soldato
Joined
30 Mar 2010
Posts
13,094
Location
Under The Stairs!
We all know NV/AMD would never lie about the capabilities of their cards...

Fully DX12 compliant, they said of Maxwell; they made a huge deal about having the only fully compliant DX12 GPUs at the time.

Then, when the 900 series was getting absolutely pumped in DX12, they released a statement saying they had never enabled asynchronous compute in the drivers.
 
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
Not everything has to be a conspiracy; sometimes things are just simple. The 3000 series lacks the dedicated silicon for it, and enabling it on the 3000 series would create more problems than it's worth. That's it.

Post from another thread but it sums it up well :D

Lol. So the reason you guys don't like Nvidia is because you don't like capitalism. Got it.
 
Soldato
Joined
18 Oct 2002
Posts
14,438
Location
West Midlands
Not everything has to be a conspiracy; sometimes things are just simple. The 3000 series lacks the dedicated silicon for it, and enabling it on the 3000 series would create more problems than it's worth. That's it.

I never said everything was, but protecting your operating profit is justification enough to mislead or generally misdirect; if you can't see that that could be the case then you must have insider knowledge.
 
Soldato
Joined
19 Feb 2007
Posts
14,485
Location
ArcCorp
We all know NV/AMD would never lie about the capabilities of their cards...

Fully DX12 compliant, they said of Maxwell; they made a huge deal about having the only fully compliant DX12 GPUs at the time.

Then, when the 900 series was getting absolutely pumped in DX12, they released a statement saying they had never enabled asynchronous compute in the drivers.

IMO I just don't think it's that deep. Ada has on-silicon hardware that Ampere doesn't. Enabling it on Ampere would inevitably start the screeching brigade ("why isn't my 3080 running the same as a 4080, this is BS reeeeeeeeeeeee"), and Nvidia likely cannot be bothered with that noise, so it's easier to leave AMD to pick up that slack with their hardware-agnostic FG implementation.
 
Soldato
Joined
30 Mar 2010
Posts
13,094
Location
Under The Stairs!
I never said everything was, but protecting your operating profit is justification enough to mislead or generally misdirect
This. With Pascal (launched May 2016) and VRR, I wonder how many G-Sync modules were sold before they backtracked in 2019 and enabled VRR support that could have been enabled at launch.
IMO I just don't think it's that deep. Ada has on-silicon hardware that Ampere doesn't. Enabling it on Ampere would inevitably start the screeching brigade ("why isn't my 3080 running the same as a 4080, this is BS reeeeeeeeeeeee"), and Nvidia likely cannot be bothered with that noise, so it's easier to leave AMD to pick up that slack with their hardware-agnostic FG implementation.

Ampere already has the on-silicon hardware too; at this point all we know is Nvidia said it's not fast enough. Complaining that a 3080 wasn't as fast as a 4080 would get slapped down and mocked; how a 3080 12GB performed against a 4070 would be a better metric to use.
 
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
IMO I just don't think it's that deep, Ada has on silicon hardware that Ampere doesn't, Enabling it on Ampere would inevitably start the screeching brigade "why isn't my 3080 running the same as a 4080 this is BS reeeeeeeeeeeee" and Nvidia likely cannot be bothered with that noise so it's easier to leave AMD to pickup that slack with their hardware agnostic FG implementation.

Sure, look at what happened with FG on the 40xx at launch: we had the usual suspects selectively picking scenes which purposely made DLSS 3/FG have a fit, and then people taking these "fake frames" to show how awful it was, when even the source of their hate spiel stated they had to slow the footage down in order to pick out the fake frame, which you wouldn't have noticed during normal gameplay, e.g.

[image: the garbled interpolated frame]


And the 2 real frames before and after the fake frame:

[image: real frame before the fake frame]

[image: real frame after the fake frame]


i.e. they changed the camera viewpoint in order to produce a garbled frame lol

So I can only imagine what it would have been like with Ampere running FG on worse hardware. Also, since DLSS 3/FG is closed source, Nvidia could very easily make it run worse and no one would know whether it was intentional or not.

It's a bit like DLSS and the tensor cores all over again ("you don't need tensor cores as they aren't being used for DLSS!!!!"), and here we are.
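(If anyone wants to repeat that kind of frame-by-frame inspection themselves, here's a hedged sketch: capture the FG output at the display rate, then dump alternate frames. With 2x frame generation, generated frames alternate with rendered ones; the filename and the even/odd phase below are my assumptions.)

```python
import cv2

# Hypothetical capture: a 120fps recording of a game running 60fps + 2x FG.
cap = cv2.VideoCapture("fg_capture_120fps.mp4")
idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if idx % 2 == 1:  # assumed phase: odd frames are the generated ones
        cv2.imwrite(f"generated_{idx:05d}.png", frame)
    idx += 1
cap.release()
```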

 
Soldato
Joined
15 Oct 2019
Posts
11,938
Location
Uk
Sure, look at what happened with FG on the 40xx at launch: we had the usual suspects selectively picking scenes which purposely made DLSS 3/FG have a fit, and then people taking these "fake frames" to show how awful it was, when even the source of their hate spiel stated they had to slow the footage down in order to pick out the fake frame, which you wouldn't have noticed during normal gameplay, e.g.

[image: the garbled interpolated frame]


And the 2 real frames before and after the fake frame:

[image: real frame before the fake frame]

[image: real frame after the fake frame]


i.e. they changed the camera viewpoint in order to produce a garbled frame lol

So I can only imagine what it would have been like with Ampere running FG on worse hardware. Also, since DLSS 3/FG is closed source, Nvidia could very easily make it run worse and no one would know whether it was intentional or not.

It's a bit like DLSS and the tensor cores all over again ("you don't need tensor cores as they aren't being used for DLSS!!!!"), and here we are.

I think by locking FG to Ada, Nvidia has tried to hide the fact that it isn't actually that great in its current state and is more of a beta feature.

People who then go and buy the new Ada cards will say it's good because they don't want to admit they paid a lot of money for GPUs which, in most cases, are barely faster, if at all, than last gen for the price, FG aside.
 
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
I think by locking FG to Ada, Nvidia has tried to hide the fact that it isn't actually that great in its current state and is more of a beta feature.

People who then go and buy the new Ada cards will say it's good because they don't want to admit they paid a lot of money for GPUs which, in most cases, are barely faster, if at all, than last gen for the price, FG aside.

Well, that depends entirely on where you look and what you choose to ignore. There are plenty of people who say DLSS 3 is great and works well, not just on this forum but several reputable sources too, even bang 4 buck (who owns all the top-end hardware and values it), i.e. people who have no need to justify their purchase as they either can afford it and/or didn't pay anything for it in the first place. Now, if you tell me that you're trying to run FG on a 4060 at 4K where the base fps is like 20, well yeah, FG isn't a miracle worker... Depending on which Ada GPU they bought, Ampere owners will have got an upgrade overall, even if they bought the same tier of GPU. @TNA how much faster is your 4070 Ti compared to your previous 3080 Ti at your res again? Not even factoring in frame gen.

Also, if the recent polls on native rendering, upscaling etc. are anything to go by, it turns out people don't even use or want software solutions to improve fps, so surely it isn't a factor when buying these new GPUs? :confused:

I do agree though, FG as a whole is still somewhat too new/beta, as in, for best results you need to be getting 50-60 fps already. My experience with the 4080 on GeForce Now was fantastic; even the lag wasn't a huge issue, and that was over cloud streaming... AMD's official FSR 3 integrations have been awful, and being locked to FSR upscaling makes it a no-go for many, even AMD owners, as shown. The mod injector method is hit and miss, so again, much like other technology, to experience the best as of "now" you have to pay that premium. Personally, for me it's akin to Turing when DLSS and RT came about: there simply aren't enough demanding games to really justify the need for FG yet. CP 2077 is the main one, and I've already completed that 3 times now, so meh, and AW 2 ran fairly well given it's a slow-paced game. By the time there are more games where FG will be necessary, we'll have the next-gen series out, which will be even better and will probably have something else extra.

EDIT:

Also, Nvidia made it very clear in their benchmark/PR slides what the performance was without FG, so it wasn't even a case of misleading the crowd, unless, again, people only look at the bars and nothing else. AMD on the other hand, when it comes to their PR/slides, well... the less said the better.
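(A rough back-of-envelope on why that 50-60fps base matters, as a toy model under my own simplifying assumption that 2x FG holds back one real frame for interpolation; these are not measured numbers:)

```python
def fg_estimates(base_fps: float) -> tuple[float, float]:
    """Toy model of 2x frame generation: presented fps doubles, while
    input-to-photon latency grows by roughly one extra real frame that
    is buffered so the in-between frame can be interpolated."""
    frame_time_ms = 1000.0 / base_fps
    presented_fps = 2.0 * base_fps
    added_latency_ms = frame_time_ms  # one buffered real frame
    return presented_fps, added_latency_ms

for base in (20, 30, 60):
    fps, lat = fg_estimates(base)
    print(f"{base:>3} fps base -> ~{fps:.0f} fps presented, "
          f"+~{lat:.1f} ms latency")
# 20fps base: smooth-looking 40fps but +50ms of lag -> feels awful
# 60fps base: 120fps presented and only ~17ms extra -> feels fine
```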
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,362
Location
Greater London
Well, that depends entirely on where you look and what you choose to ignore. There are plenty of people who say DLSS 3 is great and works well, not just on this forum but several reputable sources too, even bang 4 buck (who owns all the top-end hardware and values it), i.e. people who have no need to justify their purchase as they either can afford it and/or didn't pay anything for it in the first place. Now, if you tell me that you're trying to run FG on a 4060 at 4K where the base fps is like 20, well yeah, FG isn't a miracle worker... Depending on which Ada GPU they bought, Ampere owners will have got an upgrade overall, even if they bought the same tier of GPU. @TNA how much faster is your 4070 Ti compared to your previous 3080 Ti at your res again? Not even factoring in frame gen.

Also, if the recent polls on native rendering, upscaling etc. are anything to go by, it turns out people don't even use or want software solutions to improve fps, so surely it isn't a factor when buying these new GPUs? :confused:

I do agree though, FG as a whole is still somewhat too new/beta, as in, for best results you need to be getting 50-60 fps already. My experience with the 4080 on GeForce Now was fantastic; even the lag wasn't a huge issue, and that was over cloud streaming... AMD's official FSR 3 integrations have been awful, and being locked to FSR upscaling makes it a no-go for many, even AMD owners, as shown. The mod injector method is hit and miss, so again, much like other technology, to experience the best as of "now" you have to pay that premium. Personally, for me it's akin to Turing when DLSS and RT came about: there simply aren't enough demanding games to really justify the need for FG yet. CP 2077 is the main one, and I've already completed that 3 times now, so meh, and AW 2 ran fairly well given it's a slow-paced game. By the time there are more games where FG will be necessary, we'll have the next-gen series out, which will be even better and will probably have something else extra.

EDIT:

Also, Nvidia made it very clear in their benchmark/PR slides what the performance was without FG, so it wasn't even a case of misleading the crowd, unless, again, people only look at the bars and nothing else. AMD on the other hand, when it comes to their PR/slides, well... the less said the better.

If you believe Joxeon it is 1% faster than a 3080 :p
 
Soldato
Joined
15 Oct 2019
Posts
11,938
Location
Uk
I've tried FG at 4K on my mate's 4090 (so a best-case scenario) at different points over the past year. In A Plague Tale it would introduce noticeable artifacts and randomly cause crashes, in The Witcher 3 it would artifact really badly, especially in fast motion, in Cyberpunk it would also cause noticeable artifacts, and in ASA it would cause flickering and regular crashes.

So no, I would not say it's a great feature right now; it has potential but also has a long way to go before I'd consider it a good feature worth paying extra for.
 
Caporegime
OP
Joined
4 Jun 2009
Posts
31,372
I've tried FG at 4K on my mate's 4090 (so a best-case scenario) at different points over the past year. In A Plague Tale it would introduce noticeable artifacts and randomly cause crashes, in The Witcher 3 it would artifact really badly, especially in fast motion, in Cyberpunk it would also cause noticeable artifacts, and in ASA it would cause flickering and regular crashes.

So no, I would not say it's a great feature right now; it has potential but also has a long way to go before I'd consider it a good feature worth paying extra for.

Is this based on launch? Because if so, most of the issues, i.e. UI issues and so on, have been resolved.

I didn't see any artifacts in CP 2077 with the 4080, nor did I have any crashing; I only briefly tried it in A Plague Tale and Darktide, and it seemed good there too.

Ark Survival Ascended is a buggy, awful game in itself that crashes and just runs poorly regardless of FG, although I find the modded-in FG actually works very well for it on my 3080.

Daniel Owen's coverage of it is the best overall. It's a feature worth having, as it does make a positive impact on what it sets out to resolve, i.e. motion fluidity, a smoother experience, and overcoming bottleneck issues, especially where CPU optimisation is awful.


 