
NVIDIA 5000 SERIES

Nvidia GeForce RTX 5090 appears in first Geekbench OpenCL, Vulkan leaks


Does anyone more intelligent than me know what these benchmarks mean for different real-world workloads? I assume the Vulkan benchmark is a pretty accurate representation of the improvement we may expect from games using Vulkan.
 
So, in summary:

* No real improvements in the cores; any extra performance comes from having more of them

* The card's clocks don't boost high because it's power limited - 575 watts is not enough and the GPU would like more power

* Games averaged 20% faster performance without using fake frames


3nm can't come fast enough, Nvidia is cooked, this architecture has nowhere else to go, Nvidia is gassed. If the RTX 6000 series isn't on 3nm, the 6090 will be a 1000W card
Those shunt mods are gonna be crazy for the 5090
 
I was also confused about the discussion about the cable on the 4090; I too haven't had one single problem since day of release and I thought all that was debunked with the Gamers Nexus video etc.
Wasn't it also people just not pushing it in correctly?
I guess people like to hold onto old news regardless of whether it's true or not
All users needed to do was watch part 1 of that GN video and come to the conclusion that it wasn't a connector issue; he literally was up all night for hours trying to get one to melt under extreme circumstances.

It just needs a decent cable and to be plugged in properly, that's it.

The other 0.0001% of issues not related to those dodgy adapters were connectors not plugged in properly, or people sending 600W to and overclocking their 450W cards.

I used Nvidia's adapter for a couple of months, then Seasonic's 12VHPWR-to-12VHPWR cable for over a year, and then decided to get one of the newer Seasonic 12V-2x6 cables when they were released a few months ago. Not for fear of the original cable; it was just that I plugged and unplugged the original so many times that I wanted a new tight fit.
 
Personally I think Frame Gen is great, but it depends on your use case. (...) Clearly latency is bad with FG enabled. To my eyes there's no difference with image quality. (...)

With games where you have 85+ FPS, the time normal FG frames are visible for is too short to see most issues with them. But with MFG, where you get multiple generated frames in a row, they persist for much longer and hence it should be much easier to spot issues in movement, especially with faster camera movement. I can see haloing, ghosting, broken objects etc. on my 4090, with my old eyes, if I enable FG below 60 FPS - easily enough - and latency is then also unacceptable to me. Even in single-player games it causes such input lag at times that my brain (I suspect) interprets it like poisoning and I start to physically feel sick. It's a very individual thing, each person reacts differently, but many people have these issues even in VR if latency isn't perfectly low. It's just not for everyone, especially with a big UW monitor that makes one feel much more immersed in an FPS game.
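To put rough numbers on how long generated frames actually sit on screen, here's a quick back-of-the-envelope sketch (the base frame rates and FG factors below are just illustrative values, not measurements from any specific game):

```python
# How long any single frame is visible, and how much consecutive "generated" time
# sits between two real frames, for 2x FG vs 4x MFG. Base FPS values are illustrative.

def frame_gen_timing(base_fps, factor):
    output_fps = base_fps * factor
    frame_time_ms = 1000 / output_fps                 # how long each displayed frame is shown
    generated_run_ms = frame_time_ms * (factor - 1)   # run of generated frames between real ones
    return frame_time_ms, generated_run_ms

for base_fps, factor in [(85, 2), (60, 2), (40, 4), (30, 4)]:
    frame_ms, run_ms = frame_gen_timing(base_fps, factor)
    print(f"{base_fps:>3} FPS base, {factor}x: each frame ~{frame_ms:.1f} ms on screen, "
          f"~{run_ms:.1f} ms of generated frames between real ones")
```

At a high base frame rate a single generated frame flashes by in roughly 6 ms, but 4x MFG from a low base means 18-25 ms of consecutive generated content between real frames, which is plenty of time to notice artefacts during fast camera movement.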
 
All users needed to do was watch part 1 of that GN video and come to the conclusion that it wasn't a connector issue; he literally was up all night for hours trying to get one to melt under extreme circumstances.

It just needs a decent cable and to be plugged in properly, that's it.

The other 0.0001% of issues not related to those dodgy adapters were connectors not plugged in properly, or people sending 600W to and overclocking their 450W cards.

I used Nvidia's adapter for a couple of months, then Seasonic's 12VHPWR-to-12VHPWR cable for over a year, and then decided to get one of the newer Seasonic 12V-2x6 cables when they were released a few months ago. Not for fear of the original cable; it was just that I plugged and unplugged the original so many times that I wanted a new tight fit.
And then you can watch der8auer's video, where he was able to easily show in practice multiple issues with these connectors (sudden crashes etc.) if the cable wasn't set at a perfect angle and locked in place so it never moves. He also described exactly why it's a bad design that should never have been released in this state, and how it could be fixed. Fun fact: it's not been fixed and still has the same issues in the newest generation, apparently.
 
All the leaks point to about 30% in pure performance over the 4090. RT may be closer to 40%.

The 5080 and below (if rumours on performance are accurate) are clearly Nvidia moving the lower stack up one tier and pretending to cut prices. So we could effectively be getting the 5080 as a proper 4070Ti replacement for a $200 price increase.

So an actual drop in price/performance across the entire stack.
 
I've tested Cyberpunk with FG with a base FPS of 90+ and the latency is so noticeable that I would rather game at the base 90+ FPS than the 180 FPS I got with FG. Important note: my testing was done with AMD's FG using Anti-Lag, so it may well be that Nvidia's implementation is better on the latency side. I highly doubt, however, that it would be good enough for me not to notice. The connected feel, to quote PCM2, is just worse with FG in my experience.
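For anyone wondering why 180 "FG FPS" can still feel worse than 90 real FPS: interpolation-style frame generation has to hold back the newest rendered frame until the in-between frame has been displayed, so it adds somewhere around half to one base frame time of latency on top of the normal pipeline. A minimal sketch of that maths (the figures are illustrative, and the actual overhead varies by implementation and by Reflex/Anti-Lag settings):

```python
# Rough latency cost of interpolation-based frame generation: the newest real frame
# is held back so the interpolated frame can be shown first. Typical penalty is in
# the range of ~0.5 to ~1.0 base frame times, before any generation cost is added.

def base_frame_time_ms(base_fps):
    return 1000 / base_fps

def fg_latency_penalty_ms(base_fps):
    ft = base_frame_time_ms(base_fps)
    return 0.5 * ft, 1.0 * ft   # (optimistic, pessimistic) added latency

for base_fps in (90, 60, 45):
    low, high = fg_latency_penalty_ms(base_fps)
    print(f"{base_fps} FPS base (frame time {base_frame_time_ms(base_fps):.1f} ms): "
          f"FG adds roughly +{low:.1f} to +{high:.1f} ms")
```

So even from a 90 FPS base you're paying an extra ~6-11 ms of input lag for frames that only exist visually, which lines up with the "connected feel" being worse.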
 
All the leaks point to about 30% in pure performance over the 4090. RT may be closer to 40%.

Actually, not exactly, no:

There's pretty much no IPC difference; all improvements stem just from higher power use and the CUDA core count. It's like a 4090 Super, really. The only real improvement is in AI processing, it seems.
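A quick sanity check on that: if IPC is flat, raw shader throughput should scale roughly with CUDA core count times clock speed. The sketch below uses the commonly reported spec figures (the 5090 numbers were still leak-based at the time, so treat them as assumptions):

```python
# Naive scaling estimate assuming zero IPC improvement: throughput ~ cores x clock.
# Spec figures are the commonly reported ones; 5090 values are leak-based assumptions.

rtx_4090 = {"cuda_cores": 16384, "boost_clock_mhz": 2520, "tdp_w": 450}
rtx_5090 = {"cuda_cores": 21760, "boost_clock_mhz": 2407, "tdp_w": 575}

def raw_throughput(gpu):
    """Cores x clock: a crude proxy for shader throughput at unchanged IPC."""
    return gpu["cuda_cores"] * gpu["boost_clock_mhz"]

uplift = raw_throughput(rtx_5090) / raw_throughput(rtx_4090) - 1
power_increase = rtx_5090["tdp_w"] / rtx_4090["tdp_w"] - 1

print(f"Uplift from cores x clock alone: {uplift:.1%}")        # ~27%
print(f"Power budget increase:           {power_increase:.1%}")  # ~28%
```

That lands right around the ~30% the leaks suggest, which is why it looks like "more of the same cores at more power" rather than an architectural jump.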
 
And then you can watch der8auer's video, where he was able to easily show in practice multiple issues with these connectors (sudden crashes etc.) if the cable wasn't set at a perfect angle and locked in place so it never moves. He also described exactly why it's a bad design that should never have been released in this state, and how it could be fixed. Fun fact: it's not been fixed and still has the same issues in the newest generation, apparently.
Yes, but users shouldn't have been trying to cram them into cases where they didn't fit properly.

Going off that logic, any cable connector that you deliberately bend to the point where it struggles to make contact is going to cause issues on any electronics.

That's not to say the connector probably shouldn't have been placed at the end of the card, upright, but that doesn't mean it's a bad design when you're deliberately bending the cable or using a cheap cable like CableMod's with their thin sense wires.
 
Actually, not exactly, no:

There's pretty much no IPC difference; all improvements stem just from higher power use and the CUDA core count. It's like a 4090 Super, really. The only real improvement is in AI processing, it seems.
The 4090D was 6% slower in FPS than the 4090, so I'm not sure if that's the same case here.
 
Frame gen and MFG are just marketing tools employed by Nvidia to inflate the numbers.

No doubt, come the 60 series, Nvidia will add an x8 MFG mode artificially locked to the new cards so they can claim they're twice as fast again, despite only putting in the minimum effort to increase the real performance metrics, and then make off like bandits.
 
Frame gen and MFG are just marketing tools employed by Nvidia to inflate the numbers.

No doubt, come the 60 series, Nvidia will add an x8 MFG mode artificially locked to the new cards so they can claim they're twice as fast again, despite only putting in the minimum effort to increase the real performance metrics, and then make off like bandits.

They're a publicly traded company, so their mission is to make money for shareholders. As long as they make a profit, they're happy. They'll make huge profits from the 5000 series, especially now that AMD is not competing.
 
Where has the drive for frame-generation come from? Is it just so Nvidia can say they've got bigger & better gains each generation? Or because people love to say 'well I get 200fps in Cyberpunk so :p'. Or a combo of both?

I totally get upscaling with DLSS/FSR/XeSS and the improvements made here. If you can render a game at 60% of the output resolution and have it look 99.9% as good as native then that's fantastic, sign me up. You get better performance, you can push certain things (RT etc.) harder and you make components relevant for longer.

Frame Gen doesn't seem to have any of those benefits. You're not making the game feel like it's playing at a higher frame rate, you're just pretending it is. The benefit of actual higher frames is the improvement to how smooth a game will feel and decreasing input latency. High frames + stability of those frames is honestly the main reason I stick to PC gaming.

Frame Gen (on both AMD and Nvidia) has always felt so odd to me: it looks smoother but it doesn't feel any smoother - and I really struggle with using it on mouse and keyboard. For me it's only really been usable on a controller.
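To put the upscaling point into numbers: if "60% of the output resolution" means 60% per axis, you're only shading about a third of the pixels. A small sketch; the 4K output and 0.6 scale factor here are just example values, not tied to any specific DLSS/FSR/XeSS preset:

```python
# Pixel-work saving from rendering at 60% of the output resolution per axis.
# Output resolution and scale factor are example values.

output_w, output_h = 3840, 2160
scale = 0.6

render_w, render_h = int(output_w * scale), int(output_h * scale)
native_pixels = output_w * output_h
render_pixels = render_w * render_h

print(f"Native:   {output_w}x{output_h} = {native_pixels:,} px")
print(f"Internal: {render_w}x{render_h} = {render_pixels:,} px "
      f"({render_pixels / native_pixels:.0%} of the shading work)")
```

That roughly-36%-of-native shading cost is where the headroom for RT and longer component relevance comes from, while the upscaler fills the gap back up to the output resolution.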
 
I was also confused about the discussion about the cable on the 4090; I too haven't had one single problem since day of release and I thought all that was debunked with the Gamers Nexus video etc.
Wasn't it also people just not pushing it in correctly?
I guess people like to hold onto old news regardless of whether it's true or not

The original 4090 connector clearly didn’t have sufficient ‘redundancies’ in its design to account for micro-mistakes made by users.

By redundancies, I mean in the engineering sense - you design something so that it stays safe if multiple things fail.

The fact the cable connector was redesigned with different length pins, and included from some mid-cycle 4090s, tells you that there was in fact a needless risk that was engineered away.

This is a good explanation of the changes:

https://www.corsair.com/uk/en/explo...teA8M7JwxRTIlLmLKBRZULCHzpfzsIUgH9aziV2u7QcK2

Compared to the original 12VHPWR connector, the new 12V-2x6 connector has shorter sensing pins (1.5mm) while the conductor terminals are 0.25mm longer. This might not sound like a huge difference, but it matters in ensuring that the power cable has been properly connected to whatever device is going to be pulling power from your system's power supply.
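The reason seating matters so much is how little per-pin headroom there is in the first place. 12VHPWR / 12V-2x6 delivers up to 600W over six 12V conductors, and the commonly quoted per-terminal rating is around 9.5A, so the margin is thin even when everything is perfect. A rough sketch of the numbers (the 9.5A rating and the "only 4 pins carrying current" case are assumptions for illustration):

```python
# Per-pin current on a 12VHPWR / 12V-2x6 connector at various loads, and the effect
# of a poorly seated connector sharing current over fewer pins.
# The ~9.5 A per-terminal rating is the commonly quoted figure (an assumption here).

VOLTAGE = 12.0
POWER_PINS = 6
PIN_RATING_A = 9.5

def per_pin_current(load_w, pins_carrying=POWER_PINS):
    return load_w / VOLTAGE / pins_carrying

for load_w in (450, 575, 600):
    ideal = per_pin_current(load_w)
    degraded = per_pin_current(load_w, pins_carrying=4)
    print(f"{load_w} W: {ideal:.2f} A/pin with all 6 pins (rating ~{PIN_RATING_A} A), "
          f"{degraded:.2f} A/pin if only 4 pins carry the load")
```

At 575-600W a couple of high-resistance or unseated pins is enough to push the rest past their rating, which is why the shorter sense pins (cutting power before the conductors can lose contact) are a meaningful fix rather than a cosmetic one.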
 
Where has the drive for frame-generation come from? Is it just so Nvidia can say they've got bigger & better gains each generation? Or because people love to say 'well I get 200fps in Cyberpunk so :p'. Or a combo of both?

I totally get upscaling with DLSS/FSR/XeSS and the improvements made here. If you can render a game at 60% of the output resolution and have it look 99.9% as good as native then that's fantastic, sign me up. You get better performance, you can push certain things (RT etc.) harder and you make components relevant for longer.

Frame Gen doesn't seem to have any of those benefits. You're not making the game feel like it's playing at a higher frame rate, you're just pretending it is. The benefit of actual higher frames is the improvement to how smooth a game will feel and decreasing input latency. High frames + stability of those frames is honestly the main reason I stick to PC gaming.

Frame Gen (on both AMD and Nvidia) has always felt so odd to me: it looks smoother but it doesn't feel any smoother - and I really struggle with using it on mouse and keyboard. For me it's only really been usable on a controller.

When @Absolutely Clueless leaves you a like and it is the only like on the said post, I wonder if it's a compliment or if he is trying to say something else.
 