
NVIDIA 5000 SERIES

The original 4090 connector clearly didn’t have sufficient ‘redundancies’ in its design to account for micro-mistakes made by users.

By redundancies, I mean in the engineering sense - you design something so that it stays safe if multiple things fail.

The fact that the cable connector was redesigned with different-length pins, and included on some mid-cycle 4090s, tells you there was in fact a needless risk that was engineered away.

This is a good explanation of the changes:

https://www.corsair.com/uk/en/explo...teA8M7JwxRTIlLmLKBRZULCHzpfzsIUgH9aziV2u7QcK2
I might still bite if one became available at a reasonable price. When you know what to look out for, it usually becomes trivial. The 4090 is a serious piece of kit.

EDIT: Ninja edit due to brain fog :P
 
When @Absolutely Clueless leaves you a like and it is the only like on the said post, I wonder if it is a compliment or if he is trying to say something else.
He sums up my thoughts exactly. DLSS upscaling is great. I use DLSS Q in most games where it is available, even if it is just to help lower power consumption or try new RT settings. Frame gen, on the other hand, is absolutely pants if you are latency sensitive like me and primarily play first-person games on K&M. Nothing more to it.

It is also definitely nothing to do with the fact the post is 15mins old, hence only one like….
 
He sums up my thoughts exactly. DLSS upscaling is great. I use DLSS Q in most games where it is available, even if it is just to help lower power consumption or try new RT settings. Frame gen, on the other hand, is absolutely pants if you are latency sensitive like me and primarily play first-person games on K&M. Nothing more to it.

It is also definitely nothing to do with the fact the post is 15mins old, hence only one like….

Lol. Absolutely clueless. I know mate :cry:
 
Where has the drive for frame generation come from? Is it just so Nvidia can say they've got bigger & better gains each generation? Or because people love to say 'well I get 200fps in Cyberpunk so :p'? Or a combo of both?

I totally get upscaling with DLSS/FSR/XeSS and the improvements made here. If you can render a game at 60% of the output resolution and have it look 99.9% as good as native then that's fantastic, sign me up. You get better performance, you can push certain things (RT etc.) harder and you make components relevant for longer.
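For a sense of why that trade is so attractive, here's a quick back-of-envelope sketch of the pixel counts. The scale factors are assumptions for illustration (DLSS Quality is commonly described as around two thirds of the output resolution per axis), not official figures:

```python
# Back-of-envelope pixel math for upscaling. Scale factors here are
# illustrative assumptions, not official values for any vendor.
def rendered_fraction(scale_per_axis):
    """Fraction of native pixels actually shaded at a given per-axis scale."""
    return scale_per_axis ** 2

print(rendered_fraction(0.667))  # ~0.44: under half the shading work of native
print(rendered_fraction(0.6))    # 0.36: the 'render at 60%' case, per axis
```

So even a modest per-axis reduction cuts the shaded pixel count dramatically, which is where the performance headroom for RT and the like comes from.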

Frame Gen doesn't seem to have any of those benefits. You're not making the game feel like it's playing at a higher frame rate; you're just pretending it is. The benefit of actual higher frames is smoother-feeling gameplay and lower input latency. High frames plus stability of those frames is honestly the main reason I stick to PC gaming.

Frame Gen (on both AMD and Nvidia) has always felt so odd to me: it looks smoother but it doesn't feel any smoother, and I really struggle with using it on mouse and keyboard. For me it's only really been usable on controller.

Eventually we'll get to a point where frame generation has minimal to no latency, probably courtesy of asynchronous reprojection or some other workarounds.
 
Eventually we'll get to a point where frame generation has minimal to no latency, probably courtesy of asynchronous reprojection or some other workarounds.
You mean the game automating / trying to predict what someone will do?

Even the Wooting and Razer keyboard hardware feature was a bannable thing in CSGO and Valorant.

You realise not every game is offline right?

The biggest games on PC all have an online connection, and I'm not even talking about competitive stuff: this isn't possible and will flag anti-cheats hard, not to mention the can of worms it opens up.

Also, I don't want the game trying to assume what I do. I even recall an Nvidia feature to use AI to play the game for you; just what are people in these spaces thinking?
 
Eventually we'll get to a point where frame generation has minimal to no latency, probably courtesy of asynchronous reprojection or some other workarounds.
Latency is derived from the base frame rate, so the only way to reduce latency is to increase the base fps or push more aggressive upscaling.

If anything, frame gen seems to hinder the natural performance progression, as it's easier and cheaper for Nvidia to sell software performance rather than beefing up the hardware, which is why the new cards are not seeing much of an uplift.
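A rough sketch of that latency point, with hypothetical numbers: the displayed fps goes up with frame gen, but input latency still tracks the base (rendered) frame rate, and interpolation typically has to hold a frame back, adding roughly one extra base frame of delay on top:

```python
# Back-of-envelope latency maths (hypothetical numbers, not measured figures).
# Frame generation multiplies displayed fps, but input latency follows the
# base frame rate; holding a frame back to interpolate adds ~1 base frame.
def frame_time_ms(fps):
    return 1000.0 / fps

base_fps = 60
multiplier = 2                                   # 2x frame gen
displayed_fps = base_fps * multiplier            # 120 "fps" on screen
latency_no_fg = frame_time_ms(base_fps)          # ~16.7 ms per rendered frame
latency_with_fg = frame_time_ms(base_fps) * 2    # ~33.3 ms: base frame + held frame
print(displayed_fps, round(latency_no_fg, 1), round(latency_with_fg, 1))
```

Which is why a higher displayed number can actually feel worse than the raw base frame rate.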
 
Latency is derived from the base frame rate, so the only way to reduce latency is to increase the base fps or push more aggressive upscaling.

If anything, frame gen seems to hinder the natural performance progression, as it's easier and cheaper for Nvidia to sell software performance rather than beefing up the hardware, which is why the new cards are not seeing much of an uplift.
I think Nvidia will start pushing controllers for PC gaming alongside the 60 series in order to hide the additional latency of 8x MFG. Can’t wait.
 
 
Was reading some great back and forth at the Beyond3D forums, the crux being that we really need a new definition of "performance", and that using these new features is essentially robbing Peter to pay Paul, e.g. more FPS for more/worse latency.

It's a Wild West currently, with people generally ignorant of the advantages and disadvantages of DLSS/frame gen... and nV is capitalizing on this ignorance fully because, for so very long, FPS was the be-all and end-all.
 
Was reading some great back and forth at the Beyond3D forums, the crux being that we really need a new definition of "performance", and that using these new features is essentially robbing Peter to pay Paul, e.g. more FPS for more/worse latency.

It's a Wild West currently, with people generally ignorant of the advantages and disadvantages of DLSS/frame gen... and nV is capitalizing on this ignorance fully because, for so very long, FPS was the be-all and end-all.
It's a really interesting problem, and I'm curious to see how some of the more technical media counter this in their reviews/analysis. How long will the 'just turn all these features off' approach work for?

Even switching to something as 'simple' as latency isn't flawless, given there are so many different ways to measure it on a PC. Everyone understands FPS, at least until frame gen came along. I don't think the same can be said for latency.
 
I agree with you that the image quality is OK, but the latency is truly awful. Or I'm just very sensitive. It feels terrible with it enabled in any first-person game, no matter how high my base fps is. On the flip side, it really is magic in MSFS 2020/2024. Works very well.
Might be because you are at 240Hz? Won't a 4090 struggle at 4K at such a high refresh? What I'm getting at is your monitor is refreshing 240 times per second but your GPU is only outputting, say, 200 frames per second, so you have a 40fps difference. Try setting your monitor to 120Hz and see if you feel latency is better with FG on. Irrespective of AI technologies being on or off, if you want smooth gameplay your FPS needs to be matching or very close to your monitor refresh rate, otherwise you are going to experience problems.
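A minimal sketch of the mismatch being described, assuming a fixed refresh rate with no VRR (G-Sync/FreeSync would change this picture):

```python
# Hypothetical pacing check (assumes fixed refresh, no VRR): any refresh
# cycle that arrives without a fresh GPU frame repeats the previous one,
# which reads as judder rather than smoothness.
refresh_hz = 240
gpu_fps = 200
stale_refreshes_per_second = refresh_hz - gpu_fps
print(stale_refreshes_per_second)  # 40 refreshes/second show a repeated frame
```

With VRR enabled the monitor instead waits for each frame, which is why the fps/Hz mismatch matters far less on adaptive-sync displays.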
 
I'm very interested in 5090 temps in general, after that Chinese reviewer saying it is power limited at 575W, hence the lower clocks. How can it be power limited at 575W? If you can't power limit/underclock it like the 4090, then I think that may be the nail in the coffin for me. The 4090 already chucks out enough heat in the summer at 350W.
I know, I've had my 4090 pulling over 600W for stability testing. Power limited 5090 at 575W??!
 