Say you ran a game at 120 fps, and I then duplicated every frame and called it 240 fps. Would you actually call that 240 fps?
DLSS 3 is doing essentially that, just with interpolation. It uses no additional game information to render those frames. Your mouse click will not register in them, the game's physics will not update in them, and if you actually inspect those frames, they are heavily degraded.
Let's say FG takes a 60 fps game to 120 fps. What if FG had a setting to push it to 180 fps by generating two extra frames per real frame (Nvidia could do this if they wanted)? Is that now a 180 fps gaming experience, where two out of every three frames are just a blend of the real ones?
It should give you the input lag of roughly 120 fps in that game and the movement fluidity of 240 fps.
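Here's a rough back-of-the-envelope sketch of that trade-off in Python (my own simplified model with made-up names, not Nvidia's actual pipeline; real FG also adds a little extra latency from holding back a rendered frame to interpolate against, which is ignored here):

    # Frame generation multiplies displayed frames, but input and physics
    # still only update on the rendered ("real") frames.
    def frame_gen_estimate(rendered_fps, generated_per_real):
        displayed_fps = rendered_fps * (generated_per_real + 1)
        input_rate_hz = rendered_fps                       # clicks only land on real frames
        real_frametime_ms = round(1000 / rendered_fps, 2)  # responsiveness is still tied to this
        return displayed_fps, input_rate_hz, real_frametime_ms

    # 120 real fps with one generated frame per real frame: "240 fps" on screen,
    # but input is still sampled ~120 times a second.
    print(frame_gen_estimate(120, 1))   # (240, 120, 8.33)

    # 60 real fps with two generated frames per real frame: "180 fps" on screen, 60 Hz input.
    print(frame_gen_estimate(60, 2))    # (180, 60, 16.67)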
Like I've said, maybe I'm fine even with 60 fps from FG (real fps around 30, maybe a bit over), as long as I can v-sync it to that 60 fps in something like Hogwarts, or in gameplay that doesn't involve 180 no-scope type moments. As long as I don't notice the inserted frames during gameplay, I don't care. If I need more precision, like in a competitive game, or the input lag gets too big, then yeah, I'll disable FG and adjust settings to get the FPS up.
Because as much as people want to believe the marketing, these are "recreated frames" and not "native frames", just like smooth motion and similar features on AV gear add non-native frames. It's the same sort of logic. Those inserted frames are not as good quality as the primary frames.
It's like all the hype about Nvidia/AMD/Intel doing upscaling/reconstruction. The PCMR crowd used to mock consoles for these sorts of tricks, variable resolution and other rendering shortcuts, and proclaimed that PC never needed them and was a pure gaming experience. Now Ap... sorry, Nvidia does it and suddenly it's the second coming and a big marketing point. Consoles only did it because they are devices built to a cost that need to hit their "marketed" performance targets.
These new dGPUs need to do it from launch because they are simply not powerful enough to hit their "marketed performance", which increasingly seems to mean selling subpar chips for more and more money.
All these add-ons should come on top of the basic rasterised/RT performance. I don't like that they seem to be part of the advertised launch-day performance, so Nvidia/AMD can sell more and more trash-tier dGPUs for more money by getting people to accept more and more image quality compromises. Both Nvidia and ATI tried these tricks in the past, i.e. rendering benchmarks slightly below native quality to boost scores, and the tech press caught them out for it.
What concerns me is whether, in a few years, they start to hardware-lock these features to newer generations. That would age dGPUs even quicker, meaning they can cut down on the hardware they need to sell you and sell it to you more often.
This totally fits into the thinking of the accountants now running these companies (think dGPUs-as-a-service type model). You can start to see some of this elsewhere.
Did consoles have the same quality as DLSS? I guess not. I did not like DLSS in version 1.x, as it was crap. Considering how it is now, to me it's better to run with DLSS on than to lower details or resolution, if the resulting image is better. And it normally is. That's the difference: when you're forced to either reduce details and resolution to hit a certain FPS or use DLSS, and the resulting image is better with DLSS, then I'll take that over some purist view of only playing native, in which case I'd be worse off. I don't notice the inserted frames on TVs, so I'm not affected. I haven't tried FG yet, as I don't own a 4xxx card, but if it works the same it could be OK for me.
Again, console upscaling doesn't have the same quality as DLSS; that's why it was considered crap.
The "crappy" Turing in my PC runs the 8GB "monsters" of Hogwarts and TLOU on 3 displays (5760x1080) thanks to that - sure, TLOU has black borders on the side since doesn't know to display beyond ultra wide, but still a lot higher than 1080p and all that
at 60fps.
All these add-ons are on top of what you already get. The current price hikes just make the current models from both sides a bit "meh", but that's secondary.
I still buy cards based on their native performance, and everyone should do that! At the same time, I look at bonuses like DLSS and FG, and those will always influence my decision.
People need to stop trumpeting the RTX 4090. The cost per frame is better on the RTX 4080, when comparing the cheapest models I could find.
RTX 4090 @ ~£1500 / 171 min FPS = £8.77 at 1440p.
RTX 4080 @ ~£1100 / 148 min FPS = £7.43 at 1440p.
RTX 4090 @ ~£1500 / 113 min FPS = £13.27 at 4K.
RTX 4080 @ ~£1100 / 89 min FPS = £12.36 at 4K.
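For what it's worth, the arithmetic above is just price divided by minimum FPS; here's a quick sketch using the figures quoted (the function name is mine):

    # Cost per frame = price / minimum FPS, using the figures quoted above.
    def cost_per_frame(price_gbp, min_fps):
        return round(price_gbp / min_fps, 2)

    print(cost_per_frame(1500, 171))  # RTX 4090 at 1440p -> 8.77
    print(cost_per_frame(1100, 148))  # RTX 4080 at 1440p -> 7.43
    print(cost_per_frame(1500, 113))  # RTX 4090 at 4K    -> 13.27
    print(cost_per_frame(1100, 89))   # RTX 4080 at 4K    -> 12.36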
4090 - 38/frame
4080 - 38/frame
The difference will be higher or lower depending on what you compare the card against. Not to mention that even in your case the difference (in pounds per frame) is too small to matter, which is why in my example they come out about the same. Moreover, the 4090 will get you closer to that 60 fps or beyond than the 4080 can, and by no small margin. For a top-of-the-line card that's good, but it's only good because the lower offerings are very bad. In this case, I'll either get the 4090 or look for something else in a lower tier.
This resonated with me in a scary way.
We lease cars, we lease motorbikes, you can get Geforce Now. It wouldn't surprise me if a company tries a leasing model.
Phones are probably an example to look at. The top end has skyrocketed in price; someone on an average wage still wants the top-of-the-line iPhone, but a month's pay is out of reach, so most high-end phones are bought on contract.
Phones are not a good example, because the companies that make them are like mushrooms after the rain: plentiful, and most importantly, they do compete amongst themselves!
Why would I buy a top-end iPhone? What would it get me beyond the roughly 360 euros I've paid for a very good phone with 8GB RAM, 256GB storage, 67W fast charging and decent cameras? Nothing I would notice in day-to-day life other than bragging rights, of which I'm not a fan. So buying the less expensive phone, which offers virtually the same experience, doesn't hurt me, the buyer.
The GPU side, on the other hand, is different. You can feel the difference between a midrange card and a top-end card, and basically you have only 2 worthwhile players at the moment, and they don't compete amongst themselves! Maybe 3, if Intel hangs on and delivers quality products. The answer to the situation is easy: we should stop defending these practices and stop making excuses like "OMG, inflation!", or basically "if they don't make a 50-60% profit margin overall and at least 1 billion in profits, it's not worth it for them, they will go bankrupt!"... plus other silly arguments. Some fanboys see these companies as "my company good, no matter what, THAT company BAAAAAAAAD", so they end up defending the bad practices.