
NVIDIA 4000 Series

Because it does when you're actually just playing it and seeing the motion on an OLED monitor.

I reckon you'd equally be tricked by black frame insertion or how LFC works.


On my OLED, just adding 60 black frames for every 60 real frames makes it look much smoother.

 
No, because I don't have low framerate issues, so it isn't an experience I can comment on.

They aren't just of benefit at low frame rates. They are of benefit whenever the frame rate is below your display's refresh rate.

So unless you are saying you are maxing out your refresh rate, it is "low".
 
Probably because it is only truly efficient at high enough frame rates. However, depending on how sensitive you are, it could be fine even below 60fps (that's the final FPS with frame gen active).

Because, as much as people want to believe the marketing, they are "recreated frames" and not "native frames", just like the smooth-motion modes etc. in AV gear add non-native frames. It's the same sort of logic. Those inserted frames are not as good quality as the primary frames.

It's like all the hype about Nvidia/AMD/Intel doing upscaling/reconstruction. The PCMR crowd mocked consoles for doing all these sorts of tricks, using variable resolution and other rendering shortcuts, and proclaimed that PC never needed to do it and was a pure gaming experience. Now Ap...sorry, Nvidia does it and suddenly it's the second coming, and suddenly it's a big marketing point. Consoles only did it because they are devices built to a cost that need to hit the "marketed" performance targets.

These new dGPUs need to do it from launch because they are simply not powerful enough to hit the "marketed performance", which increasingly seems to mean selling subpar chips for more and more money.

All these add-ons should be on top of the basic rasterised/RT performance. I don't like that they seem to be part of the advertised launch-day performance, so Nvidia/AMD can sell more and more trash-tier dGPUs for more money by making people accept more and more image quality compromises. Both Nvidia and ATI tried these tricks in the past, i.e. rendering benchmarks at slightly below native quality to boost scores. The tech press caught them out for doing it.

The issue that concerns me is whether, in a few years, they start to hardware-lock these features to newer generations. That would age dGPUs even quicker, meaning they can cut down on the hardware they need to sell you and sell it more often.

This totally fits the thinking of the accountants now running these companies (think dGPUs-as-a-service type models). You can start to see some of this elsewhere.
 
The only game I've tried frame generation on so far is Cyberpunk and it's utterly broken with motion blur so I promptly turned it off again :(
 
I think frame generation has its place, though it needs a decent baseline or the input latency sucks. Reminds me of the AFR SLI-induced latency in a way, but only needing a single card. :p As for motion blur, that is the first thing that gets turned off in game settings for me! BFI is good for blur reduction, but the image is dimmer, and my newer 42" C2 doesn't have the 120Hz BFI support the old CX did, which is a shame.
 
The only game I've tried frame generation on so far is Cyberpunk and it's utterly broken with motion blur so I promptly turned it off again :(
System/display?

For reference, here's a playthrough with action/fast-paced movement etc. Other than the video being recorded at 60fps while the framerate is 100+, so there may be some video stuttering because of that, it all looks like my non-FG'd Cyberpunk gameplay videos... I suppose someone will be along shortly to point out the fake frames clearly visible in the video :p


As mentioned before though, there's no problem with the input latency; it's only about 20ms over non-FG mode, and Reflex seems to have it well under control. You can tell there is some latency introduced, but it's not something that is an issue in this type of game.
 
I also think the RTX 4070 would have a reasonable cost per frame at 1440p relative to last gen cards (and higher tier cards), if priced at £500.

£500 / 104 min FPS ≈ £4.81 per frame.
Figures from here:

I think it would be more reasonable to charge the same as the RTX 3070 FE for the RTX 4070, considering the performance is not a big improvement.
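
For anyone who wants to plug in their own prices, here's a minimal sketch of that cost-per-frame arithmetic (the £500 price and the 104 minimum FPS are just the figures quoted above; the function name is my own):

```python
def cost_per_frame(price_gbp: float, min_fps: float) -> float:
    """Pounds paid per frame of minimum FPS (lower is better)."""
    return price_gbp / min_fps

# Hypothetical £500 RTX 4070 with the 104 min FPS 1440p figure quoted above.
print(f"RTX 4070: £{cost_per_frame(500, 104):.2f}/frame")  # ≈ £4.81
```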
 
I also think the RTX 4070 would have a reasonable cost per frame at 1440p relative to last gen cards (and higher tier cards), if priced at £500.

£500 / 104 min FPS ≈ £4.81 per frame.
Figures from here:
Yes, it's all about the price to performance. I got a good deal on my 6950XT; that comes in at £4.66/frame. Admittedly I had to jump through a few hoops to get down to that, but it can be done.
 
The 4070 would be a very good card at $449 to $499, which is where it should be. What also elevates that card among its peers is the efficiency: you lose basically no performance by undervolting to bring the max power draw down to 130W. So that many frames for $449 or $499, and at 130W, would be a fantastic buy and would force AMD to do something.
 
Yeah, I guess the problem is that the RTX 4060 TI is still to come (in May?), and they will want to charge £400+ for that.

Also, RTX 3070s are still selling for ~£500...
 
This totally fits the thinking of the accountants now running these companies (think dGPUs-as-a-service type models). You can start to see some of this elsewhere.
This resonated with me in a scary way.

We lease cars, we lease motorbikes, and you can get GeForce Now. It wouldn't surprise me if a company tries a leasing model.

Phones are probably an example to look at. The top end has skyrocketed in price; someone on an average wage still wants the top-of-the-line iPhone, but a month's pay is out of reach, so most high-end phones are on contract.
 
Please don't give them ideas. Zero chance I would ever lease or do a phone-like contract. I have not even had a phone contract in probably a decade or so now; I just do SIM-only contracts for less than a tenner a month.

I would probably end up going used and not upgrading as often if they did that. They would simply **** off way too many people by going down such a route.
 
System/display?

For reference, here's a playthrough with action/fast-paced movement etc. Other than the video being recorded at 60fps while the framerate is 100+, so there may be some video stuttering because of that, it all looks like my non-FG'd Cyberpunk gameplay videos... I suppose someone will be along shortly to point out the fake frames clearly visible in the video :p


As mentioned before though, there's no problem with the input latency; it's only about 20ms over non-FG mode, and Reflex seems to have it well under control. You can tell there is some latency introduced, but it's not something that is an issue in this type of game.
Like you, I'm very impressed with frame generation.
I was a bit sceptical until I tried it myself in Cyberpunk, and it really is a game changer in that game.
As long as your base framerate is relatively high, the experience is very good and the fake frames are not visible unless you look hard for them.
That being said, if your base framerate isn't high enough it all starts to fall apart.
The threshold will obviously be different for everyone, depending on how sensitive they are to latency and visible artifacts.
Another thing that breaks frame generation for me is inconsistent frame times (Cyberpunk is very good in this regard, which is why frame gen works so well in it); frame generation seems to exaggerate micro stutter.
Anyway, reading people's opinions around here, it seems to me that the ones who have tried it are very impressed (with a few exceptions, like the guy above who tried it in Cyberpunk with in-game motion blur enabled, which defeats the purpose), and the ones who diss it haven't seen it in person and have only watched YouTube videos, which are 60fps.
Now, before people jump on me with "but but fake frames, but ngreedia is advertising it as real performance and brainwashing people": well, no, in all the slides I've seen it was marked DLSS3, so if people fall for that kind of stuff without reading the small print then I think they have bigger problems in their lives than fake frames in their GPUs.
I don't take it as extra performance, only as an additional feature that works really well when implemented correctly and the base framerate and frametimes are good.
 
Please don't give them ideas. Zero chance I would ever lease or do a phone-like contract. I have not even had a phone contract in probably a decade or so now; I just do SIM-only contracts for less than a tenner a month.

I would probably end up going used and not upgrading as often if they did that. They would simply **** off way too many people by going down such a route.
I mean, realistically I see more chance of high-end gaming getting more and more niche and becoming a rich-person-only (or single-hobby-only) town than of GPU leasing.
 
Record your gameplay and then look at the frames created by DLSS3.
That's exactly the point. If I have to go looking for a problem, there isn't a problem.

If we are really going down that road, then how about the fact that the 60-90 FPS on an OLED has visual judder? It's very evident. Any kind of frame generation will always look visually smoother than the real frames.
 
Say you ran a game at 120fps. If I then inserted a duplicate of every frame and called it 240fps, would you call it 240fps?

DLSS3 is doing that with some interpolation. It is not using any additional information to render those frames. Your mouse click will not register in that frame. The game's physics will not update in that frame. If you actually look at those frames, they are heavily degraded.

Let's say that in a 60fps game, FG increases it to 120fps. What if FG had a setting to increase it to 180fps by generating 2 extra frames (Nvidia could do this if they wanted)? Is that now a 180fps gaming experience where two of every three frames are a blur of the real frames?
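
To illustrate the interpolation point, here's a toy sketch (not Nvidia's actual method, which uses game motion vectors and an optical flow accelerator rather than a plain blend; every name here is made up): the generated frame is computed purely from the two rendered frames around it, so no new input or physics state can appear in it, and it can only be shown once the next real frame already exists, which is where the extra latency comes from.

```python
import numpy as np

def render_frame(game_state: float) -> np.ndarray:
    """Stand-in for the engine: only real frames sample input and physics (game_state)."""
    return np.full((4, 4), game_state, dtype=np.float32)

def generate_frame(prev: np.ndarray, nxt: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Interpolated frame: a function of two already-rendered frames only.
    A mouse click or physics update that happens between them cannot show up here,
    because the only inputs are pixels that already exist."""
    return (1.0 - t) * prev + t * nxt

frame_a = render_frame(game_state=1.0)       # real frame: input sampled
frame_b = render_frame(game_state=2.0)       # real frame: input sampled
inserted = generate_frame(frame_a, frame_b)  # "fake" frame: 1.5 everywhere, no new information
```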

It should have the input lag of roughly 120fps in that game and the movement fluidity of 240fps.
Like I've said, maybe I'm fine even with 60fps with FG (real fps around 30, maybe 30fps+), as long as I can V-Sync it to that 60fps in something like Hogwarts, or in gameplay that doesn't involve 180 no-scope type of things. Of course, as long as I don't notice the inserted frames during gameplay, I don't care. If I need more precision, like when I play a competitive game, or the input lag is too big, then yeah, I disable FG and adjust settings to get the FPS up.

Because, as much as people want to believe the marketing, they are "recreated frames" and not "native frames", just like the smooth-motion modes etc. in AV gear add non-native frames. It's the same sort of logic. Those inserted frames are not as good quality as the primary frames.

It's like all the hype about Nvidia/AMD/Intel doing upscaling/reconstruction. The PCMR crowd mocked consoles for doing all these sorts of tricks, using variable resolution and other rendering shortcuts, and proclaimed that PC never needed to do it and was a pure gaming experience. Now Ap...sorry, Nvidia does it and suddenly it's the second coming, and suddenly it's a big marketing point. Consoles only did it because they are devices built to a cost that need to hit the "marketed" performance targets.

These new dGPUs need to do it from launch because they are simply not powerful enough to hit the "marketed performance", which increasingly seems to mean selling subpar chips for more and more money.

All these add-ons should be on top of the basic rasterised/RT performance. I don't like that they seem to be part of the advertised launch-day performance, so Nvidia/AMD can sell more and more trash-tier dGPUs for more money by making people accept more and more image quality compromises. Both Nvidia and ATI tried these tricks in the past, i.e. rendering benchmarks at slightly below native quality to boost scores. The tech press caught them out for doing it.

The issue that concerns me is whether, in a few years, they start to hardware-lock these features to newer generations. That would age dGPUs even quicker, meaning they can cut down on the hardware they need to sell you and sell it more often.

This totally fits the thinking of the accountants now running these companies (think dGPUs-as-a-service type models). You can start to see some of this elsewhere.
Did consoles have the same quality as DLSS? I guess not. I did not like DLSS in version 1.x, as it was crap. Considering how it is now, to me it's better to run with DLSS on than to lower details or resolution, if the resulting image is better. And it normally is. That's the difference: when you're forced either to reduce details and resolution to hit a certain FPS or to use DLSS, and the resulting image is better with DLSS, then I'll take that over some purist view of only playing native, in which case I'd be worse off. I don't notice the inserted frames on TVs, so I'm not affected. I haven't tried FG yet, as I don't own a 4xxx card, but if it works the same it could be OK for me.

Again, consoles don't have the same quality as DLSS; that's why their approach was considered crap.
The "crappy" Turing card in my PC runs the 8GB "monsters" Hogwarts and TLOU on 3 displays (5760x1080) thanks to that. Sure, TLOU has black borders on the sides since it doesn't know how to display beyond ultrawide, but it's still a lot higher than 1080p, and all that at 60fps.

All these add-ons are on top of what you already get. The current hike in prices just makes current models from both sides a bit "meh", but that's secondary. I still buy cards based on their native performance, and everyone should do that! At the same time I look at bonuses like DLSS and FG, and that will always influence the decision I make.
People need to stop trumpeting the RTX 4090. The cost per frame is better on the RTX 4080, when comparing the cheapest models I could find.

RTX 4090 @ ~£1500 / 171 min FPS = £8.77 at 1440p.
RTX 4080 @ ~£1100 / 148 min FPS = £7.43 at 1440p.
Chart:

RTX 4090 @ ~£1500 / 113 min FPS = £13.27 at 4K.
RTX 4080 @ ~£1100 / 89 min FPS ≈ £12.36 at 4K.
Chart:
[Chart: cyberpunk-2077-rt-3840-2160.png]


4090 - 38/frame
4080 - 38/frame

The difference will be higher or lower depending on what you compare the card against. Not to mention that even in your case the difference is too small to matter (in pounds per frame), which is why in my example they come out about the same. Moreover, the 4090 will get you closer to that 60fps (or above it) than the 4080 can, and by no small margin. For a top-of-the-line card that's good, but it is only good because the lower offerings are very bad. In this case, I'll either get a 4090 or look for something else in a lower tier.
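
For reference, the same simple division applied to both cards, using only the prices and minimum FPS figures from the quoted post, just to show how small the £/frame gap actually is:

```python
# £/frame from the figures quoted above (lower is better).
cards = {
    "RTX 4090 (~£1500)": {"price": 1500, "1440p": 171, "4K": 113},
    "RTX 4080 (~£1100)": {"price": 1100, "1440p": 148, "4K": 89},
}

for name, specs in cards.items():
    for res in ("1440p", "4K"):
        print(f"{name} at {res}: £{specs['price'] / specs[res]:.2f}/frame")
# 4090: £8.77 (1440p), £13.27 (4K); 4080: £7.43 (1440p), £12.36 (4K)
```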

This resonated with me in a scary way.

We lease cars, we lease motorbikes, and you can get GeForce Now. It wouldn't surprise me if a company tries a leasing model.

Phones are probably an example to look at. The top end has skyrocketed in price; someone on an average wage still wants the top-of-the-line iPhone, but a month's pay is out of reach, so most high-end phones are on contract.
Phones are not a good example, because the companies that make them are like mushrooms after the rain: there are plenty of them and, most importantly, they actually compete amongst themselves!

Why would I buy a top-end iPhone? What would it get me beyond the roughly 360 euros I've paid for a very good phone with 8GB RAM, 256GB storage, 67W fast charging and decent cameras? Nothing I would notice in day-to-day life other than bragging rights, of which I'm not a fan. So buying the less expensive phone, which offers virtually the same experience, doesn't hurt me, the buyer.

On the GPU side, on the other hand, it's different. You can feel the difference between a midrange card and a top-end card, and basically you have only 2 worthwhile players at the moment, and they don't compete amongst themselves! Maybe 3 if Intel hangs on and delivers quality products. The answer to the situation is easy: we should stop defending these practices and stop finding excuses like "OMG, inflation!", or basically "if they don't make a 50-60% profit margin overall and at least 1 billion in profits, it's not worth it for them and they'll go bankrupt!"... plus other silly arguments. Some fanboys of these companies see them as "my company good, no matter what, THAT company BAAAAAAAAD", so they end up defending the bad practices.
 