> They can't use DLSS 4 frame gen, it's only available on the 50 series.

Can use the original frame generation, which is supposedly also getting improvements.
> Can use the original frame generation, which is supposedly also getting improvements.
> The 5070 isn't going to be 4090 performance if the 4090 user is also using all the upscaling technologies available to it as well.

Plus, whenever the 5070 needs more than 12 GB of VRAM it's going to hit a brick wall that a 4090 won't.
> Can use the original frame generation, which is supposedly also getting improvements.
> The 5070 isn't going to be 4090 performance if the 4090 user is also using all the upscaling technologies available to it as well.

But not the DLSS frame gen one.
> But not the DLSS frame gen one.

You are the one that said "5070 is 4090 when u add dlss4 frame gen stuff to it", so I was just advising that it isn't really, if the 4090 user is also using upscaling technologies.
It's moot anyway, because my point is RASTER. If it's not equal in raster performance, then I do not conclude that a 5070 is as good as a 4090.
> I thought the Nvidia CES was boring, first one I have watched. Only a small part was about the new GPUs, all the rest was about AI.

Nvidia said "AI" 205 times, which is 60 more than AMD, therefore Nvidia is 40% better.
I also like the tiny bit they showed of the new Virtua Fighter.
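For anyone checking the maths on that joke, a quick sketch (the mention counts are just the poster's numbers above, nothing I've verified):

# The "AI" mention counts are the poster's claims, not verified.
nvidia_mentions = 205
amd_mentions = nvidia_mentions - 60            # 145
better = 100 * (nvidia_mentions - amd_mentions) / amd_mentions
print(f"Nvidia is {better:.0f}% better")       # ~41%, so "40%" roughly checks out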
> You are the one that said "5070 is 4090 when u add dlss4 frame gen stuff to it", so I was just advising that it isn't really, if the 4090 user is also using upscaling technologies.

They are unable to use the exact same new DLSS 4 goodies. That's my point.
Jensen has already said that it's impossible for the performance of a 5070 to be the same as a 4090 without AI.
> They are unable to use the exact same new DLSS 4 goodies. That's my point.

Agreed, I think pure rasterisation across all 5000 series cards is going to be quite a small increase over their equivalent predecessor, especially if you look at the graph where it shows Far Cry 6 with only RT and no upscaling at all.
Second point is, I don't care; I care about raster performance without all this AI and frame gen. That is where the 5070 fails miserably, I bet.
> Can use the original frame generation, which is supposedly also getting improvements.
> The 5070 isn't going to be 4090 performance if the 4090 user is also using all the upscaling technologies available to it as well.

Most importantly, FG is not increasing performance - never did, still doesn't. It increases the fluidity of the displayed image, but it actually decreases performance by introducing input latency. In other words, games look more fluid but they feel more laggy (though how much depends on the input FPS - running FG on 30 is a disaster).
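To put rough numbers on the latency point, here's a minimal back-of-the-envelope sketch. It assumes interpolation-style frame gen has to hold back one real frame so it can blend between two known frames; the actual pipeline overhead will differ, and Reflex claws some of it back:

# Toy model (my own simplification, not Nvidia's published pipeline):
# holding back one real frame to interpolate adds roughly one native
# frame time of input latency. Generation cost itself is ignored.

def frame_gen_estimate(base_fps: float, multiplier: int):
    """Return (displayed_fps, added_latency_ms) for a given FG multiplier."""
    displayed_fps = base_fps * multiplier      # 2x for FG, up to 4x for MFG
    added_latency_ms = 1000.0 / base_fps       # the held-back native frame
    return displayed_fps, added_latency_ms

for base in (30, 60, 120):
    fps, lag = frame_gen_estimate(base, 4)
    print(f"{base:>3} fps native -> {fps:4.0f} fps displayed, ~{lag:.0f} ms extra latency")

Which is exactly why 30 fps input is a disaster (an extra ~33 ms on top of an already sluggish frame time) while 120 fps input barely notices it (~8 ms).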
> Agreed, I think pure rasterisation across all 5000 series cards is going to be quite a small increase over their equivalent predecessor, especially if you look at the graph where it shows Far Cry 6 with only RT and no upscaling at all.

Their own slides suggest no IPC improvements worth mentioning, hence it's just more performance for more power use and price - not good hardware progress, very disappointing even. It might be better in 4K with higher vRAM throughput, but generally it feels like this architecture was designed for data centres primarily, hence we got very little hardware improvement for gaming. At least their software stack improvements for previous generations look interesting, much more so than the GPUs IMHO.
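As a rough sanity check on the "more performance for more power" point: the 575 W vs 450 W board powers are the announced figures, but the ~30% raster uplift is my own eyeballed guess from that Far Cry 6 bar, not a benchmark:

# Back-of-the-envelope perf-per-watt check, 5090 vs 4090.
# 575 W and 450 W are the announced board powers; the ~30% raster
# uplift is an assumption read off the no-DLSS Far Cry 6 slide.
assumed_uplift = 1.30
power_ratio = 575 / 450                        # ~1.28
print(f"perf/W change: {assumed_uplift / power_ratio:.2f}x")   # ~1.02x

If that guess is anywhere near right, perf per watt is essentially flat generation on generation.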
> Yeah. Looking at the relative performance charts I'm really not that excited for the 5090 now. Not a big fan of frame gen, but maybe MFG irons out some of the issues.

FG gets the same improvements on the 4000 series too, just with fewer frames generated. So the only improvement the 5K series has over the 4K series seems to be a bigger number (but also bigger input latency with MFG).
> I miss the days when this DLSS and FSR nonsense did not exist.

We're getting to the point, though, with the graphical demands from games, the constant chase for higher frame rates, and more advanced monitor tech, that it sort of has to exist. You can only brute-force raster so much, and if the goal is always more, then they'll find a way to do that and then, each generation, hopefully improve on how that works.
> We're getting to the point, though, with the graphical demands from games, the constant chase for higher frame rates, and more advanced monitor tech, that it sort of has to exist. You can only brute-force raster so much, and if the goal is always more, then they'll find a way to do that and then, each generation, hopefully improve on how that works.

I agree with this. I'd love it if raster performance could keep climbing like it used to, but it simply can't. The R&D costs are already astronomical and are getting worse as process nodes shrink, let alone what damage inflation has done. I think of it like a car: I don't care, per se, "how" the frames are being created; if my experience overall is better, I'm willing to pay for that. (I'm not excusing the pricing, I think it's nuts.) Even if Nvidia wasn't ripping us off, there is some truth to the fact that the days of GPUs getting significantly better value for money each generation are probably over, or not coming back like they were. I'll happily be proven wrong, but for the next 5-10 years this is just the reality I see.
> I thought the Nvidia CES was boring, first one I have watched. Only a small part was about the new GPUs, all the rest was about AI.

They aren't targeting gamers anymore; they forgot about you now that the big money is coming.