
AMD FSR 3.0 has exposed the ugly truth about most PC gamers


Very good article imo.

It's obviously great getting more choice, and people who don't have the latest and greatest GPUs get new tech to play with and improve their gaming experience, but it really does highlight the hypocrisy that goes on, especially from the bigger tech YouTube channels like HUB. Obviously they do it to get the views and thus more money, but I'm sure they could take a more neutral stance and still do well, i.e. like Daniel Owen on the FG topic.
 
lol, what a load of tripe that article is. Oh boohoo people are mean to poor Nvidia.

Are the points he made wrong though?

This summed it up perfectly for those who don't want to read the full article, and it's not even necessarily an Nvidia/AMD thing; it is just pointing out the hypocrisy.

Well, we all know why a lot of PC gamers picked their pitchforks. It wasn’t due to the extra input latency and it wasn’t due to the fake frames. It was because DLSS 3 was exclusive to the RTX40 GPU series, and most of them couldn’t enjoy it. And instead of admitting it, they were desperately trying to convince themselves that the Frame Generation tech was useless. But now that FSR 3.0 is available? Now that they can use it? Well, now everyone seems to be happy about it. Now, suddenly, Frame Generation is great. Ironic, isn’t it?
So yeah, the release of the AMD FSR 3.0 was quite interesting. And most importantly, the mods that allowed you to enable FSR 3.0 in all the games that already used DLSS 3.0. Those mods exposed the people who hadn’t tested DLSS 3 and still hated it. Hell, some even found AFMF to be great (which is miles worse than both FSR 3.0 and DLSS 3). But hey, everything goes out the window the moment you get a free performance boost on YOUR GPU, right? Oh, the irony…
 
Really, what is it with this forum and people who love to take things out of context to suit a narrative? It literally goes against the very things you deem to be "fanboy" behaviour, "having bias" and so on. Or is it just baiting/trolling? In which case, maybe read the stickied post at the top of this sub forum.

[Screenshot of Daniel Owen: "native res does produce the best output"]

A video about upscaling in reply to a comment specifically about "frame generation"? Let's look at another video of his about upscaling and play the "out of context" game:

[screenshot]


See how easy it is.....

Maybe see this:

In which case, maybe read the stickied post at the top of this sub forum.

Keep to thread topic.

Maybe this gamer always dreamed of using fake frames, but couldn't afford an RTX 4000 GPU? Or maybe it's a different person?

It's in reference to people who said DLSS 3 was fake/**** because of its unusable input lag, UI issues, ghosting, artifacts and so on, and called it a way of cheating, who then got to use frame gen with FSR 3 and proclaimed it the best thing ever, even though, as shown, it has all those issues but worse than DLSS 3, i.e. hypocrisy, exactly as the article excerpt quoted above describes.

 
My example isn't out of context; he's generalising that native is better, which was the intended point.

It's the same as Tim at HUB stating 'I wouldn't say DLSS looks better than native in general', which is similar to my preference on the whole matter, be it upscaling or FG.



A picture tells a story; however, he stated DLSS looks better 'here', as in that instance, in one game. It's plain old English, mate, that's how easy it is.

The thread/article is about PC gamers being hypocrites, i.e. the passage from the article quoted above.


Not necessarily about frame gen and certainly not about upscaling being better/worse than native.

Maybe you need to watch that video then, as he even said in the video you refer to that "frame gen is for another video":

[screenshot]


The entire video is about upscaling vs native.....

It's pretty obvious what your post is intending to do:

[screenshot]


So best to drop it and keep to the topic.









As for my take on the whole thing, I've never had a problem with any of these technologies as a way to achieve better performance. I loved checkerboarding on the PS4 Pro with the exclusives; at the time the PC had nothing close to it, so it's great to see new techniques that surpass checkerboard rendering and the usual console method of adaptive resolution to maintain fps targets. Frame gen is another step towards a more efficient way of increasing performance.

At the end of the day, MLID is somewhat right: until there is another breakthrough, we have to embrace new ways of doing things or simply accept running at lower fps and/or lower graphical settings.

It's the end result I care about; I couldn't care less how things are done because, at the end of the day, every single frame in a game is "fake", and I imagine we'll see more of this in the future. AMD's frame gen/FSR 3 is in a much better spot than FSR 2: as shown, with FSR 3's generated frames it is nigh on impossible to pick them out, compared to the issues introduced by poor TAA methods and/or FSR upscaling in certain games, which plague every frame. Nvidia's frame gen is ahead, but then they use dedicated hardware and had a head start, and it still suffers the same issues as FSR 3, i.e. UI, latency, needing a good base fps and so on. Whilst frame gen is somewhat usable now from both camps, it will be next gen where it becomes much more viable imo.
 
When I see videos like this:


Then articles like this.


Yesterday, “fake frames” was meant to refer to classical black-box TV interpolation. It is funny how the mainstream calls them “fake frames”;
But, truth to be told, GPU’s are currently metaphorically “faking” photorealistic scenes via drawing polygons/triangles, textures, and shaders. Reprojection-based workflows is just another method of “faking” frames, much like an MPEG/H.26X video standard of “faking it” via I-Frames, B-Frames and P-Frames.
That’s why, during a bit of data loss, video goes “kablooey” and turns into garbage with artifacts — if a mere 1 bit gets corrupt in a predicted/interpolated frame in a MPEGx/H26x video stream. Until the next full non-predicted/interpolated frame comes in (1-2 seconds later).
Over the long-term, 3D rendering is transitioning to a multitiered workflow too (just like digital video did over 30 years ago out of sheer necessity of bandwidth budgets). Now our sheer necessity is a Moore’s Law slowdown bottleneck. So, as a shortcut around Moore’s Law — we are unable to get much extra performance via traditional “faking-it-via-polygons” methods.
The litmus test is going lagless and artifactless, much like the various interpolated frame subtypes built into your streaming habits, Netflix, Disney, Blu-Ray, E-Cinema, and other current video compression standards that use prediction systems in their compression systems.
Just as compressors have original knowledge of the original material, modern GPU reprojection can gain knowledge via z-buffers and between-frame inputreads. And “fake it” perceptually flawlessly, unlike year 1993’s artifacty MPEG1. Even the reprojection-based double-image artifacts disappear too!
TL;DR: Faking frames isn’t bad anymore if you remove the “black box” factor, and make it perceptually lagless and lossless relative to other methods of “faking frames” like drawing triangles and textures
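
To make the delta-frame point above concrete, here is a rough toy sketch in Python (nothing like real MPEG; every name and number in it is made up): predicted frames are stored as differences against the previous frame, so corrupting one delta garbles the picture until the next keyframe arrives.

```python
# Toy illustration (not real MPEG) of the point above: predicted frames are
# stored as deltas against the previous frame, so one corrupted delta keeps
# polluting the picture until the next full keyframe ("I-frame") arrives.

def encode(frames, keyframe_interval=4):
    """Store a full frame every `keyframe_interval` frames, deltas otherwise."""
    stream, prev = [], None
    for i, frame in enumerate(frames):
        if i % keyframe_interval == 0 or prev is None:
            stream.append(("I", list(frame)))                               # full frame
        else:
            stream.append(("P", [c - p for c, p in zip(frame, prev)]))      # delta frame
        prev = frame
    return stream

def decode(stream):
    decoded, prev = [], None
    for kind, data in stream:
        frame = list(data) if kind == "I" else [p + d for p, d in zip(prev, data)]
        decoded.append(frame)
        prev = frame
    return decoded

frames = [[i, i + 1, i + 2] for i in range(8)]   # pretend 3-pixel "frames"
stream = encode(frames)
stream[1] = ("P", [999, 0, 0])                    # corrupt one predicted frame
for original, got in zip(frames, decode(stream)):
    print(original, "->", got)                    # garbage persists until the next I-frame
```
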

I think frame gen is going to be a far bigger deal than upscaling.

But obviously combining both upscaling and frame gen will net the biggest improvement when used together.
 
It's literally in the name: frame "generation". It's making up extra frames that wouldn't otherwise be there. So Nvidia have the cheek to vastly increase prices for their hardware and then have you rely on software to get the performance you should be getting out of the hardware at the price you are paying. And you all fell for it. Nvidia marketing is genius.

And AMD followed, and it's being claimed as the next best thing now....

Regardless of who does it, what does it matter if the positives outweigh the negatives?

Making software is not cheap/free; something has to fund the R&D, and the money to pay the developers working on these technologies isn't being pulled from nowhere.

[screenshot]


[screenshot]


GPU prices are a joke though and shouldn't be what they are even when accounting for inflation; that goes for both Nvidia and AMD, and we can thank the mining boom, and people paying silly money for GPUs back then, for setting the precedent.
 
Wrong. You'd have FSR FG just as good as DLSS FG then, which is not the case. Just like with every AMD vs Nvidia feature, AMD's is the 'we have X at home' one.

Nvidia's features are built to certain standards, and that's why most people continue to buy Nvidia and completely ignore AMD. The numbers don't lie. I wish things were different, but maybe Intel can shake things up a bit?

Pretty much. Again, it's great what AMD do, but literally they have no other choice lol....

- show up months/years later with said features
- features then take months/years to get better, and even then I don't think said features ever end up being better?
- market share of < 10% still?

I still see people use G-Sync as an example of FreeSync winning and there being no need for the G-Sync module, claiming it was just a way to lock customers in. Whilst that's true to an extent (shock horror, a company wanting to make a profit by getting people into an ecosystem and upgrading!!!! :eek:), as evidenced by actual experts in the field, the module still has its place now, as stated (unless, once again, people have something to prove otherwise?)


And a very recent post by blurbusters on the matter:


Native G-SYNC is currently the gold standard in VRR at the moment, and it handles VRR-range-crossing events much better (e.g. framerate range can briefly exit VRR range with a lot less stutter).

Using G-SYNC Compatible can work well if you keep your framerate range well within VRR range.

The gold standard is to always purchase more VRR range (e.g. 360-500Hz) to make sure your uncapped framerate range (e.g. CS:GO 300fps) always stays permanently inside the VRR range. Then you don't have to worry about how good G-SYNC versus G-SYNC Compatible handles stutter-free VRR-range enter/exit situations.

Also NVIDIA performance on FreeSync monitors is not as good as AMD performance on FreeSync monitors, in terms of stutter-free VRR behaviors.

Eventually people will need to benchmark these VRR-range-crossing events.

If you have a limited VRR range and your game has ever touched maximum VRR range in frame rate before, then the best fix is to make sure you use VSYNC ON and a framerate cap approximately 3fps below (or even a bit more -- sometimes 10fps below for some displays). Use VSYNC OFF in-game, but VSYNC ON in NVIDIA Control Panel.
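
If it helps, the capping advice above boils down to a one-liner; a minimal sketch assuming a simple fixed margin (3 fps per the quote, or 10 fps for fussier displays):

```python
# Minimal sketch of the capping advice quoted above: keep the framerate a few
# fps below the display's maximum refresh so it never leaves the VRR range.
# The 3 fps default and 10 fps option come straight from the quote; the
# function name is just for illustration.

def recommended_fps_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    """Framerate cap to stay inside the VRR range (e.g. 144 Hz -> 141 fps)."""
    return max_refresh_hz - margin_fps

for hz in (60, 120, 144, 240):
    print(f"{hz} Hz panel -> cap at {recommended_fps_cap(hz)} fps "
          f"(or {recommended_fps_cap(hz, 10)} fps for fussier displays)")
```
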

FreeSync/Adaptive-Sync wasn't out until like 1-2 years after G-Sync? And when it launched it had a whole heap of issues with black screening, no LFC and terrible ranges.

Yes, to get the best you have to pay a premium to get that experience months/years before the competition, but what's new here? The same goes for anything in life. If AMD were first to market with quality solutions, you can be damn sure they would be locking them down or going about it differently to get people to upgrade. AMD's reveal of frame generation at their event was nothing but a knee-jerk reaction to keep up the appearance that they aren't falling behind Nvidia, hence why we still have only 3 titles with official FSR 3 integration and a questionable injection method (which will get you banned if used in online games).

So the article summed it up perfectly (see the excerpt quoted earlier in the thread).


I was talking about Nvidia enabling FG on its RTX cards, not AMD's free-for-all take on it.

DLSS 3 is built around the optical flow accelerator. Bryan and a couple of other engineers have stated they could enable it, but it wouldn't work well, and people would then trash how bad DLSS 3 is, so it's better to keep it locked in order to retain the "premium" look. Of course, they could take a different approach like AMD, but then they would have two solutions to maintain, and again, from their POV, what benefit is there in that? Who knows though, maybe they will do a "DLSS 3 compatible" version.
 
Not everything has to be a conspiracy. Sometimes things are just simple: the 3000 series lacks the dedicated silicon for it, and enabling it on the 3000 series would create more problems than it's worth. That's it.

Post from another thread but it sums it up well :D

Lol. So the reason you guys don't like nvidia is because you don't like capitalism. Got it.
 
IMO I just don't think it's that deep. Ada has on-silicon hardware that Ampere doesn't, and enabling it on Ampere would inevitably start the screeching brigade: "why isn't my 3080 running the same as a 4080, this is BS reeeeeeeeeeeee". Nvidia likely cannot be bothered with that noise, so it's easier to leave AMD to pick up that slack with their hardware-agnostic FG implementation.

Sure, look at what happened with FG on the 40xx at launch: we had the usual suspects selectively picking scenes which purposely threw DLSS 3/FG into a fit, and then people taking these "fake frames" to show how awful it was, even though the source of their hate spiel stated they had to slow the footage down in order to pick out the fake frame, which you wouldn't have noticed during normal gameplay, e.g.

[screenshot: the generated frame]


And the 2 real frames before and after the fake frame:

[screenshot]


[screenshot]


i.e. they changed the camera viewpoint in order to produce a garbled frame lol

So I can only imagine what it would have been like with Ampere running FG on worse hardware. Also, since DLSS 3/FG is closed source, Nvidia could very easily make it run worse and no one would know whether it was intentional or not.

It's a bit like DLSS and the tensor cores all over again: "you don't need tensor cores as they aren't being used for DLSS!!!!" and here we are:

 
I think by locking FG to Ada, Nvidia has tried to hide the fact that it isn't actually that great in its current state and is more of a beta feature.

People who then go and buy the new Ada cards will say it's good because they don't want to admit they paid a lot of money for GPUs which, in most cases, are barely or no better than last gen for the price, FG aside.

Well, that depends entirely on where you look and what you choose to ignore. There are plenty of people who say DLSS 3 is great and works well, not just on this forum but according to several reputable sources too; even bang 4 buck (who owns all the top-end hardware) values it, i.e. people who have no need to justify their purchase as they either can afford it or didn't pay anything for it in the first place. Now, if you tell me that you're trying to run FG on a 4060 at 4K where the base fps is like 20, well yeah, FG isn't a miracle worker.... Depending on which Ada GPU you bought, Ampere owners will have got an upgrade overall, even if they bought the same tier of GPU. @TNA how much faster is your 4070 Ti compared to your previous 3080 Ti at your res again? Not even factoring in frame gen.

Also, if the recent polls on native, upscaling etc. are anything to go by, it turns out people don't even use or want software solutions to improve fps, so surely it isn't a factor when buying these new GPUs? :confused:

I do agree though that FG as a whole is still somewhat too new/beta, as in, for best results you need to be getting 50-60 fps already. My experience with the 4080 on GeForce Now was fantastic; even the lag wasn't a huge issue, and that was over cloud streaming.... AMD's official FSR 3 integrations have been awful, and being locked to FSR upscaling makes it a no-go for many, even AMD owners, as shown. The mod injector method is hit and miss, so again, much like other technology, to experience the best as of now you have to pay that premium. Personally for me it's akin to Turing when DLSS and RT came about: there simply aren't enough demanding games to really justify the need for FG yet. CP 2077 is the main one, and I've already completed that 3 times now, so meh, and AW 2 ran fairly well given it's a slow-paced game, so by the time there are more games where FG will be necessary, we'll have the next-gen series out, which will be even better and probably have something else extra.

EDIT:

Also, Nvidia made it very clear in their benchmarks/PR slides what the performance was without FG, so it wasn't even a case of misleading the crowd, unless again people only look at bars and nothing else. AMD on the other hand, when it comes to their PR/slides, well... the less said the better.
 
I've tried FG at 4K on my mate's 4090 (so best-case scenario) at different points over the past year: on A Plague Tale it would introduce noticeable artifacts and would randomly cause crashes, on The Witcher 3 it would artifact really badly, especially in fast motion, on Cyberpunk it would also cause noticeable artifacts, and on ASA it would cause flickering and regular crashes.

So no, I would not say it's a great feature right now; it has potential, but it also has a long way to go before I'd consider it a good feature that is worth paying extra for.

Is this based on launch? Because if so, most of the issues, i.e. the UI issues and so on, were resolved.

I didn't see any artifacts in CP 2077 with the 4080, nor had any crashing; I only briefly tried it on Plague Tale and Darktide and it seemed good there too.

Ark Survival Ascended is a buggy, awful game in itself that crashes and just runs poorly regardless of FG, although I find the modded-in FG actually works very well for it on my 3080.

Daniel Owen's coverage of it is the best overall. It's a feature worth having as it does make a positive impact on what it sets out to resolve, i.e. motion fluidity, a smoother experience and overcoming bottleneck issues, especially where CPU optimisation is awful.


 
You may think you're making an insightful point, but you're really not.

Whilst rendering techniques have always been a complex illusion tailored to create the best visual fidelity at the best performance, the number of frames a given engine can generate on a hardware platform and the associated input latency are very real and very measurable.

Just because an engine might render shadows at a low resolution and filter them, or calculate ambient occlusion at a low fidelity and mask it with textures and normal maps, it doesn't diminish the fact that an increase in frames per second has always resulted in a corresponding improvement in input latency. Higher framerates have typically been desired for both of those qualities, of which frame generation, or 'fake frames', only provides the former.

Now, personally I don’t care much, since I’ve only tinkered with DLSS-FG and don’t presently have much use for it – I get 90+ (real) fps in all the games I play at the resolution and quality settings I play at.

I do think it’s important though to separate what frame generation does (improve apparent visual smoothness) from the claims (mostly by Nvidia) that it offers improved ‘performance’ (which it doesn’t – genuine performance gains = more frames and less latency).

Does it matter? To some people yes, and to some no, but to dismiss the specifics of the discussion with the trite ‘it’s all fake anyway’ argument is swallowing Nvidia’s marketing BS hook, line and sinker.
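
A rough back-of-the-envelope model of that distinction (purely illustrative; the half-frame hold-back is a simplifying assumption, not a measured figure for DLSS FG or FSR 3):

```python
# Rough toy model of the point above: frame generation halves the *presented*
# frame interval but doesn't reduce the render latency, and in practice the
# real frame is held back a bit so an interpolated one can be shown first.
# The half-frame hold-back is an assumption for the sketch, not a measurement.

def smoothness_and_latency(base_fps: float, frame_gen: bool):
    render_ms = 1000.0 / base_fps                  # time to produce a real frame
    presented_interval = render_ms / 2 if frame_gen else render_ms
    hold_back = render_ms / 2 if frame_gen else 0  # assumed delay to slot in the fake frame
    latency = render_ms + hold_back                # very rough input-to-display figure
    return presented_interval, latency

for fg in (False, True):
    interval, latency = smoothness_and_latency(60, fg)
    print(f"frame gen {'on ' if fg else 'off'}: "
          f"presented every {interval:.1f} ms (~{1000 / interval:.0f} fps), "
          f"~{latency:.1f} ms latency")
```
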

Blurbusters describes it best (see the passage quoted earlier in the thread).



At the end of the day the gaming experience matters, and if frame generation helps with that experience then so be it. Honestly, I have no problem with either technology, or with future technologies that enhance my gameplay experience. If it's a fake frame, who cares? Are you able to identify a "fake frame" that passes faster than you can visually register, without artificially slowing the footage down? Of course you can't, but you would probably notice the jump in smoothness. This is one of those technologies, like FreeSync/G-Sync, that will just take time to be adopted and then becomes mainstream to the point that we don't actually notice or even care.

If enabling frame generation decreases quality and you can see that to the point that it detracts from your gaming experience then you have a choice to not use it. I don't see any negatives to having the option to use frame generation if you so choose.

Exactly.

Even with AMD's FSR 3 injected into CP 2077, the ghosting/trailing on the car tail and so on is obvious, and whilst it is somewhat distracting/annoying, it makes path tracing playable on older/weaker GPUs where it otherwise would not be, unless you like 10-50 fps depending on your GPU, resolution and choice of upscaling.
 
Not saying that it isn't a useful or worthwhile feature - just that the 'fake frames' moniker is actually an appropriate shorthand given its shortcomings.

It's also handy to keep AMD and Nvidia honest - don't forget Nvidia advertised the 4070 Ti as having double the 'performance' of the 3090 Ti.

*Edit* BTW - I agree with Blur Busters' statement quoted earlier in the thread.



The devil is in the details though - current methods of frame generation aren't perceptually lagless relative to the latency improvements you'd get by simply drawing triangles and textures faster ('lossless' is somewhat more subjective).

Aren't there shortcomings to everything though? Even native has its shortcomings, and imo arguably more of them than this "fakery" stuff. It is always a case of picking your poison.

Latency can be, and definitely is, an issue to an extent, and there are various factors which impact it, i.e.

- whether you use M+KB or a controller
- what your base fps is; if the base fps is 60, it's not a huge issue imo
- what type of games you play; obviously anything FPS/PvP will be a no-go, but for SP, especially games like MSFS and AW 2, I wouldn't worry about latency personally
- Nvidia does lead in this regard, which makes for a better experience, although I found Avatar's FSR 3 latency to be very good and it felt snappier than without (but I had to stop using it for other reasons)

The one thing I really like about frame gen, which I don't see many people mention, is how it lifts the 0.1% and 1% lows. This is something that can bug me a lot of the time, e.g. in Hogwarts; even though there's no need to use FG there, those lows are much better with it.
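
For anyone unfamiliar with the metric, a quick sketch of one common way 1% / 0.1% lows are computed (methods vary between tools; the frame-time numbers below are made up purely for illustration):

```python
# Sketch of what "1% / 0.1% lows" mean: take the slowest 1% (or 0.1%) of frame
# times in a run and report the average fps over just those frames.

def percent_low_fps(frame_times_ms, percent):
    worst = sorted(frame_times_ms, reverse=True)           # slowest frames first
    n = max(1, int(len(frame_times_ms) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)                    # average fps of the worst n

frame_times = [16.7] * 980 + [33.3] * 15 + [66.7] * 5       # mostly 60 fps, a few stutters
print(f"average fps: {1000 * len(frame_times) / sum(frame_times):.0f}")
print(f"1% low:      {percent_low_fps(frame_times, 1):.0f} fps")
print(f"0.1% low:    {percent_low_fps(frame_times, 0.1):.0f} fps")
```
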
 
As an additional feature (like upscaling, Reflex etc.) I think frame generation is an obvious win for PC gamers - if you like it, use it - if you don't, you don't have to.

But as we've seen, Nvidia wants to blur the line between what's a genuine generational performance uplift (say, 4090 vs 3090) and what's a feature improvement (3060 Ti vs 4060 Ti) - they want to sell you less actual hardware and make up the performance deficit with software features that may or may not be supported by the games you play.

Had they been completely transparent in their marketing about what frame generation was, I think they'd have gotten a lot less backlash over the feature - instead, they attempted to obfuscate the lack of any real generational improvement in the low-to-midrange of the 40-series product line, and their customers were rightly ticked off about it.

AMD's own FG technology has also highlighted that clearly Nvidia could've shipped a software-only version of DLSS-FG for 20/30-series owners and chose not to. Is it any wonder the technology is regarded with derision and scepticism by some now?

One positive I'll note though is that whether you use DLSS-FG or not, it's sped up the adoption of Reflex in games that otherwise wouldn't have it and that's a big win for all Nvidia users.

Their PowerPoint slides have always shown which DLSS preset is used as well as whether FG is on or off, so I wouldn't say it was entirely misleading; AMD's, on the other hand, were beyond misleading. Nvidia's PR/marketing material also describes the technology extremely well, and the DF promotional video outlined it too, so they were very transparent about how the performance improvement was being achieved:


And also, again, making software solutions is not cheap/free either; it probably costs more than improving hardware, especially when you factor in the ongoing cost of developer time (it's closed source, so only Nvidia can improve it). Nvidia could very well do what AMD did (although, as shown, that still isn't quite as good as DLSS 3, not to mention the lack of uptake), but it would then mean maintaining another solution that works completely differently from their current one, unless they did a basic version based on FSR 3, which I doubt they'll want to, as there's not much point when people can enable FSR 3 or just mod it in themselves. I.e. from Nvidia's POV it doesn't make much sense; again, AMD has no other choice but to go about it the way they do. You can't be last to market with an inferior version and expect to lock it down.

I agree it's not ideal, but at the end of the day MLID is somewhat right, and these companies have to look at other, more efficient ways to get that much better performance. Without them, we wouldn't be able to enjoy "real-time" ray tracing right now, for example (unless you gimp it, which then makes it not worthwhile for gamers).

Agree on the Reflex part, although I believe the quicker/wider adoption is because of Streamline, so devs are able to integrate all the RTX tech in one swoop now.

That article looks like it was written by an Nvidia fanboy. Calling people 'whiny little kids' is immature and poor journalism. Maybe he should wake up and look at the reality: people were shown DLSS 3 FG as a great new feature, but the caveat was that you needed to purchase a very expensive 4000-series GPU. The majority of gamers do not have the money to spend a large amount of cash just to experience what is essentially a fake-frame interpolation technique similar to the motion-smoothing feature on countless TVs.

Now that AMD has let people experience it for FREE, many will have changed their opinion, and that is absolutely fine and a sign of maturity. Only a butthurt fanboy would complain that gamers are appreciating AMD and its open-source FSR 3 FG. I'm pretty sure everyone would have been very happy if Nvidia had done the same and allowed everyone to experience frame generation instead of locking the majority out.

So you don't like capitalism then?

The article is entirely about the irony and hypocrisy that happens.
 
Thanks Steve!


How do you measure performance in the grand scheme of "PC gaming"? Is it not FPS-related?

Those charts are always exaggerated too; this is nothing new (from any brand).

I didn't realise they had that chart where it didn't show FG etc., which is bad, unless they followed up in the video by specifically saying DLSS 3 was used?
 
Every legit tech-tuber at the time called them out for their deceptive marketing and that '3x' they quoted was only for a single game - the (at that time, unreleased) path-traced version of CP2077.

I'd also add that an asterisk on a PowerPoint slide isn't being transparent with your customers - it's the bare minimum you do to avoid being sued for misrepresenting your products. The entire Ada launch was slimy.

Just watched the video from Nvidia and yeah, it's much more deceptive than the original 4090/4080/DLSS 3 video, but he did specifically say "with DLSS 3" when talking about the increased performance. Still, they should have at least shown at the bottom of the chart that FG/DLSS 3 was on.

I don't disagree it was pretty poor; AMD's was even worse, especially where the "power efficiency" claims they boasted about turned out to be a complete and utter lie. Same for Intel and their CPUs when compared to AMD's.
 
When it comes to input lag, TFT Central's figures are pretty accurate in my experience:

  • Class 1) Less than 16ms / 1 frame lag – should be fine for gamers, even at high levels
  • Class 2) A lag of 16 – 32ms / One to two frames – moderate lag but should be fine for many gamers
  • Class 3) A lag of more than 32ms / more than 2 frames – Some noticeable lag in daily usage, not suitable for high end gaming
I would personally still take 40-50 ms with 80+ fps over, say, 20-30 ms at 40 fps though, as to me the experience of playing at the higher fps is much more enjoyable. Obviously for PvP it would be a no-go, but then you'd be reducing settings and/or only using upscaling there.
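
Same idea in a few lines, converting a measured lag figure into frames and bucketing it per the classes listed above (60 Hz assumed just for the frames conversion; the function name is illustrative):

```python
# Quick sketch of the classes above, converting a measured lag figure into
# frames at a given refresh rate and bucketing it the same way.

def classify_lag(lag_ms: float, refresh_hz: float = 60.0) -> str:
    frames = lag_ms / (1000.0 / refresh_hz)
    if lag_ms < 16:
        cls = "Class 1: fine for gamers, even at high levels"
    elif lag_ms <= 32:
        cls = "Class 2: moderate lag, fine for many gamers"
    else:
        cls = "Class 3: noticeable lag, not suited to high-end gaming"
    return f"{lag_ms} ms ≈ {frames:.1f} frames at {refresh_hz:.0f} Hz -> {cls}"

for ms in (10, 25, 45):
    print(classify_lag(ms))
```
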
 
I find it amusing but also in some ways insulting that no matter what actual proof is posted, someone will always be there to go "oh that was cherry picked", or come up with some other excuse to try to throw shade onto something.

Yup, it's very tiring tbh. Like I said before, there should be a rule enforced now whereby if people are going to make sweeping claims that go against several reputable sources, as well as the likes of yourself and me who post evidence to back up our statements, then those who oppose the posts should be posting something of substance to back up their points too. Otherwise it just comes across as fingers-in-ears and almost baiting/trolling; the onus is on them to debunk said points, really, and those who just post one-liners should have said posts deleted instantly if they aren't willing to engage in the discussion. Head over to X/Twitter for that level of discussion if that's your thing imo.

It's a bit like the other thread with the poll, where I posted my imgsli comparing native and DLSS Quality, which showed DLSS better overall than native in terms of rendering text and some other details, but alas the fingers-in-ears continued:

[screenshot]


And DLDSR with DLSS Perf for comparison:

[screenshot]
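
For context on what that combo actually renders, a rough sketch of the resolution maths, assuming DLDSR 2.25x (pixel-count factor) and DLSS Performance's 50%-per-axis internal resolution (illustrative numbers only):

```python
# Rough sketch of the resolution maths behind the DLDSR + DLSS Performance
# combo mentioned above. The 2.25x and 50% figures are stated assumptions.

def dldsr_plus_dlss(native_w, native_h, dldsr_factor=2.25, dlss_axis_scale=0.5):
    scale = dldsr_factor ** 0.5                        # per-axis scale from the pixel-count factor
    target_w, target_h = round(native_w * scale), round(native_h * scale)
    render_w, render_h = round(target_w * dlss_axis_scale), round(target_h * dlss_axis_scale)
    return (render_w, render_h), (target_w, target_h)

render, target = dldsr_plus_dlss(2560, 1440)
print(f"internal render: {render[0]}x{render[1]}")     # what the GPU actually draws
print(f"DLSS output / DLDSR target: {target[0]}x{target[1]}, downsampled back to 2560x1440")
```
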


At the end of the day, we can all pick areas where DLSS and even FSR look better, or pick areas where native (even with TAA) looks better. The key thing is how often a given method looks better, i.e. if native is only better say 40% of the time whereas DLSS/FSR/XeSS is better 60% of the time, then I know what I'll pick. But again, it all comes down to what one values for IQ: sharpness/clarity, fewer jaggies, less shimmering, less softness, less ghosting. This is where Alex/DF's video will be very good (inb4 shill).

[screenshot]
 