CES 2025 AMD & Nvidia Keynote Live Chat [06/01/25]

Can use the original frame generation, which is supposedly also getting improvements.

The 5070 isn't going to be 4090 performance if the 4090 user is also using all the upscaling technologies available to it as well.
But not the DLSS frame gen one.

It's moot anyway, because my point is RASTER: if it's not equal in raster performance, then I don't conclude that a 5070 is as good as a 4090.
 
You are the one that said "5070 is 4090 when u add dlss4 frame gen stuff to it", so I was just advising that it isn't really, if the 4090 user is also using upscaling technologies.

Jensen has already said that it's impossible for the performance of a 5070 to be the same as a 4090 without AI.
 
They are unable to use the exact same new DLSS 4 goodies. That's my point.

Second point is, I don't care; I care about raster performance without all this AI and frame gen. That is where the 5070 fails miserably, I bet.
 
Agreed, I think pure rasterisation across all 5000 series cards is going to be quite a small increase over their equivalent predecessors, especially if you look at the graph where it shows Far Cry 6 with only RT and no upscaling at all.
 
Most importantly, FG does not increase performance - it never did, and it still doesn't. It increases the fluidity of the displayed image, but it actually hurts responsiveness by introducing input latency. In other words, games look more fluid but feel more laggy (though by how much depends on the input FPS - running FG from a 30 FPS base is a disaster).
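To put illustrative numbers on that, here's a minimal Python sketch, assuming input latency stays tied to the base render rate plus a small FG buffering overhead (the 6 ms figure is an assumption for illustration, not a measurement):

```python
# Toy frame-generation model: displayed FPS scales with the FG factor,
# but input latency tracks the real (base) render rate, plus a small
# buffering overhead. The 6 ms overhead is illustrative, not measured.
def fg_model(base_fps: float, fg_factor: int, overhead_ms: float = 6.0):
    displayed_fps = base_fps * fg_factor  # how fluid the image looks
    latency_ms = 1000.0 / base_fps        # responsiveness set by real frames
    if fg_factor > 1:
        latency_ms += overhead_ms         # cost of holding a frame to interpolate
    return displayed_fps, latency_ms

for base in (30, 60):
    for factor in (1, 2, 4):
        fps, lat = fg_model(base, factor)
        print(f"{base} fps base @ {factor}x FG -> {fps:.0f} fps shown, ~{lat:.1f} ms latency")
```

In this toy model a 30 fps base at 4x FG displays 120 fps but still carries ~39 ms of input latency, which is the "looks fluid, feels laggy" gap.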
 
Their own slides suggest no IPC improvements worth mentioning, so it's just more performance for more power use and a higher price - not good hardware progress, very disappointing even. It might be better at 4K with higher VRAM throughput, but generally it feels like this architecture was designed primarily for data centers, hence we got very few hardware improvements for gaming. At least their software stack improvements for previous generations look interesting, much more so than the GPUs IMHO.
 
Yeah. Looking at the relative performance charts I'm really not that excited for the 5090 now. Not a big fan of frame gen, but maybe MFG irons out some of the issues.
FG gets the same improvements on the 4000 series too, just with fewer frames generated. So the only improvement the 5000 series has over the 4000 series seems to be a bigger number (but also bigger input latency with MFG).
 
The more I read and research, the more it seems like people are going to be paying for literally just Multi Frame Generation, as the raster performance seems to be only a small increase... I guess we will see, but it seems more compelling to skip this gen for sure.
 

I miss the days when this DLSS and FSR nonsense did not exist; it's like buying a diamond ring only to be given a glass one because it looks similar, but it's not really the same. Nvidia really are a great marketing team. If raster performance is not 50-60% better without this DLSS malarkey, then there's no point ditching the 4090 for one.
 
We're getting to the point, though - with the graphical demands of games, the constant chase for higher frame rates, and more advanced monitor tech - that it sort of has to exist. You can only brute-force raster so much, and if the goal is always more, then they'll find a way to do that and then hopefully improve how it works each generation.
 
I agree with this. I'd love it if raster performance could keep climbing like it used to, but it simply can't. The R&D costs are already astronomical and are getting worse as process nodes shrink, let alone the damage inflation has done. I think of it like a car: I don't care, per se, "how" the frames are being created. If my experience overall is better, I'm willing to pay for that (I'm not excusing the pricing - I think it's nuts). Even if Nvidia wasn't ripping us off, there is some truth to the idea that the days of GPUs getting significantly better value for money each generation are probably over, or at least not coming back like they were. I'll happily be proven wrong, but for the next 5-10 years this is just the reality I see.
 
Also worth noting: Digital Foundry got hands-on with the 5080 and said that despite two additional frames being generated (2x FG vs 4x FG), the latency wasn't noticeably worse:

6.4 ms of added latency for 71% extra frames being generated
50.9 ms average (2x FG) vs 57.3 ms average (4x FG)
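Those two averages line up with the headline delta; a quick arithmetic check in Python (the relative percentage is derived here, not a Digital Foundry figure):

```python
# Sanity-check the quoted Digital Foundry latency averages.
avg_2x = 50.9  # ms average latency at 2x frame generation
avg_4x = 57.3  # ms average latency at 4x frame generation

added = avg_4x - avg_2x          # 6.4 ms, matching the quoted delta
relative = added / avg_2x * 100  # ~12.6% over the 2x baseline
print(f"added latency: {added:.1f} ms ({relative:.1f}% over 2x FG)")
```

So roughly 12-13% more latency in exchange for the extra generated frames, which squares with "not noticeably worse".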
 