NVIDIA 5000 SERIES

I haven't ever seen an example of it doing anything, but for reference I did just get this from the AI in my web browser:

"Supplemental PCIe power provides additional power to components like GPUs, ensuring they can achieve peak performance and operate stably under heavy loads. This feature is particularly beneficial for high-end GPUs that demand more power than what the motherboard can provide through the standard PCIe slot alone. For instance, MSI’s X870(E) series motherboards come equipped with a supplemental 8-pin PCIe power connector, which can deliver up to 252W of extra power, ensuring that GPUs can run at their full potential without power shortages or system instability. This extra power helps in handling intensive tasks such as gaming, AI computations, and other GPU-intensive applications smoothly."

Yeah, that's nonsense. Yes, the PCIe 8-pin connector is rated for 252W, but the motherboard PCIe slot is only rated for 75W, and in testing cards pull much less than that, as GPUs are configured to pull all of their power over their own 8-pin PCIe connectors (or 12VHPWR, as they are now switching to). GPUs capable of using more power have their own extra connectors.
An 8-pin PCIe connector on the motherboard physically cannot supply the GPU with 252W of extra power; the slot would start to melt.
It might take a little heat off the ATX 24-pin connector IF you had all your PCIe slots populated with network cards, Optane drives etc., but supplying more power to the GPU? Nonsense.

Back when SLI was a thing and GPUs did draw a little power over the PCIe slots it made sense, but now it doesn't.
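
For reference, here's a rough back-of-the-envelope sketch of the per-card power budget using the PCIe spec limits (slot 75W, 8-pin 150W, 12VHPWR up to 600W) rather than MSI's 252W marketing figure; the function is just illustrative:

Code:
# Back-of-the-envelope PCIe power budget using spec limits
# (PCIe CEM: slot = 75 W, each 8-pin = 150 W, 12VHPWR = up to 600 W).
# The 252 W number is MSI's claim for their supplemental header,
# not something the x16 slot itself can pass through.
SLOT_W = 75
AUX_8PIN_W = 150
AUX_12VHPWR_W = 600

def card_budget(n_8pin=0, has_12vhpwr=False):
    """Max in-spec draw for a card: slot plus its own aux connectors."""
    return SLOT_W + n_8pin * AUX_8PIN_W + (AUX_12VHPWR_W if has_12vhpwr else 0)

print(card_budget(n_8pin=3))          # 525 W, e.g. a triple-8-pin card
print(card_budget(has_12vhpwr=True))  # 675 W, a 12VHPWR card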
 
Yeah, that's nonsense. Yes, the PCIe 8-pin connector is rated for 252W, but the motherboard PCIe slot is only rated for 75W, and in testing cards pull much less than that, as GPUs are configured to pull all of their power over their own 8-pin PCIe connectors (or 12VHPWR, as they are now switching to). GPUs capable of using more power have their own extra connectors.
An 8-pin PCIe connector on the motherboard physically cannot supply the GPU with 252W of extra power; the slot would start to melt.
It's hilarious. I have the MSI X870 Tomahawk and even after reading the official paperwork about the extra 8-pin connector... I still don't know what the hell it does. lol

"Usually, the motherboard’s 24-pin 12V power connector can only supply a maximum of 168W. While this might be enough for basic operations, it falls short when trying to drive everything to its peak—especially with fans, RGB lighting, and a beastly GPU like the 4090 connected."

But that's why the GPU has its own power supply. I don't get it.
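
For what it's worth, that 168W figure does have a derivation: the 24-pin only has two +12V pins, and those pins are commonly rated around 7A each (the exact rating depends on the terminals and wire gauge, so treat this as a sketch):

Code:
# Where the 168 W figure comes from: the ATX 24-pin has two +12V
# pins, commonly rated ~7 A each (assumption: typical Mini-Fit Jr
# terminal rating; it varies with terminal type and wire gauge).
pins_12v = 2
amps_per_pin = 7
volts = 12
print(pins_12v * amps_per_pin * volts)  # 168 W on the 12 V rail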
 
It's hilarious. I have the MSI X870 Tomahawk and even after reading the official paperwork about the extra 8-pin connector... I still don't know what the hell it does. lol

"Usually, the motherboard’s 24-pin 12V power connector can only supply a maximum of 168W. While this might be enough for basic operations, it falls short when trying to drive everything to its peak—especially with fans, RGB lighting, and a beastly GPU like the 4090 connected."

But that's why the GPU has its own power supply. I don't get it.
Generally, the additional PCIe power connector on the motherboard lets it provide a more stable power supply when you have multiple PCIe cards drawing power (especially cards that don't have their own power connectors).

It's not really needed in instances where you have a single PCIe slot populated with a single GPU.
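
A quick sketch of why that matters: every bus-powered card can draw up to 75W through its slot, and without a supplemental header all of that has to come through the 24-pin's two 12V pins (illustrative spec maxima; real cards usually draw far less):

Code:
# Aggregate slot demand vs what the 24-pin's 12 V pins are rated for.
SLOT_MAX_W = 75     # per-slot spec limit
ATX24_12V_W = 168   # ~2 pins x 7 A x 12 V

for n_cards in (1, 2, 3):
    demand = n_cards * SLOT_MAX_W
    ok = "fine" if demand <= ATX24_12V_W else "over the 24-pin rating"
    print(f"{n_cards} bus-powered cards: up to {demand} W -> {ok}")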
 
I’ve always found “designed for xxxx resolution” to be a weird statement (although I get your point, don’t get me wrong).

Folk said the same about the 4090, but I've got a 1440p 240Hz monitor, so I'd say the 4090 was designed for high-refresh 1440p gaming.
But if you look at benchmarks, below 4K the gap between the 5090 and other cards starts to shrink, so the FPS/£ ratio gets worse. The extra performance is about 25% at 1440p vs 50% at 4K. Double the price for 50% extra performance is still bad, but it's not as bad as double the price for 25%.
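
Putting that in numbers (a rough sketch using the percentages above, not real benchmark data):

Code:
# FPS-per-pound comparison, assuming the 5090 costs ~2x the card
# it's being compared against (illustrative, from the post above).
price_ratio = 2.0
perf_4k = 1.50    # ~50% faster at 4K
perf_1440 = 1.25  # ~25% faster at 1440p

print(perf_4k / price_ratio)    # 0.75  -> 75% of the FPS/£ at 4K
print(perf_1440 / price_ratio)  # 0.625 -> 62.5% of the FPS/£ at 1440p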
 
I don't understand why most cards have 3 DisplayPorts and 1 HDMI. More HDMI ports would be much more beneficial for me.
More and more people are using high-end TVs as monitors nowadays.
Typically PC users run monitors over DisplayPort, so it's a market-driven thing. Most manufacturers do models with 2 HDMI ports, but you have to check model by model to find them.
 
That's not how it will work. The US is a massive market, so losses there cannot simply be diverted to the rest of the world. To make up for those losses they will raise prices elsewhere. They could even subsidise a lower US MSRP by increasing MSRP everywhere else.

Oh look. He already did. Lol.

Like clockwork. I knew it was coming as I was catching up reading the thread.
 
I don't understand why most cards have 3 DisplayPorts and 1 HDMI. More HDMI ports would be much more beneficial for me.
More and more people are using high-end TVs as monitors nowadays.

I'd wager that, overall, only a tiny fraction of people are using a TV as a monitor and, for those that do, the single HDMI port is perfectly adequate.
 
I don't understand why most cards have 3 DisplayPorts and 1 HDMI. More HDMI ports would be much more beneficial for me.
More and more people are using high-end TVs as monitors nowadays.
Yer, granted it's a somewhat niche use case, but I run 2 monitors via DP, then HDMI to my TV and another to my AV receiver. That gets around the fact my receiver doesn't support eARC, so I'd only get 2.1 sound over ARC.
 
I'd wager that, overall, only a tiny fraction of people are using a TV as a monitor and, for those that do, the single HDMI port is perfectly adequate.

Those spending £3k on a GPU are more likely to have one or more large OLEDs with HDMI ports, IMO.
 
But if you look at benchmarks, below 4K the gap between the 5090 and other cards starts to shrink, so the FPS/£ ratio gets worse. The extra performance is about 25% at 1440p vs 50% at 4K. Double the price for 50% extra performance is still bad, but it's not as bad as double the price for 25%.

Diminishing returns for total FPS, but I've seen a number of cases where the 1% lows of the 5090 are greater than the maximum FPS of the 4090 at 1440p.
 
I don't understand why most cards have 3 DisplayPorts and 1 HDMI. More HDMI ports would be much more beneficial for me.
More and more people are using high-end TVs as monitors nowadays.
It's so annoying; my monitor has 3 HDMI and 1 DisplayPort. I use the DisplayPort with my work laptop so I can get full res at 120Hz. The 50 series has HDMI 2.1, so hopefully 7680 x 2160 at 240Hz should work.
 
I don't understand why most cards have 3 DisplayPorts and 1 HDMI. More HDMI ports would be much more beneficial for me.
More and more people are using high-end TVs as monitors nowadays.
Think this also comes down to bandwidth. For quite some time DisplayPort offered the highest bandwidth, and a DisplayPort output could easily be converted to HDMI with an adapter, whereas if you were stuck with a lower-spec HDMI port (1.4, for example) you couldn't convert up to a higher DisplayPort standard. Only since HDMI 2.1 has HDMI surpassed even the DisplayPort 1.4 spec.
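
The published raw link rates back this up (spec figures, before encoding overhead):

Code:
# Raw link rates from the published specs (before encoding overhead).
# DP 1.4 beat every HDMI version until HDMI 2.1 arrived.
link_rates_gbps = {
    "HDMI 1.4": 10.2,
    "HDMI 2.0": 18.0,
    "DP 1.2":   21.6,
    "DP 1.4":   32.4,
    "HDMI 2.1": 48.0,
}
for link, gbps in link_rates_gbps.items():
    print(f"{link}: {gbps} Gbps")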
 
It's so annoying; my monitor has 3 HDMI and 1 DisplayPort. I use the DisplayPort with my work laptop so I can get full res at 120Hz. The 50 series has HDMI 2.1, so hopefully 7680 x 2160 at 240Hz should work.
The 4090 also has HDMI 2.1, but Nvidia's implementation of it doesn't allow that resolution at 240Hz, while it's possible on the HDMI 2.1 ports of the AMD 7900 XTX, for example. Not sure if it's down to DSC, but there was quite a lot of kick-up about it at the time.
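
A quick sanity check on the bandwidth shows why DSC has to be involved at all (spec figures; whether it explains the Nvidia/AMD difference is a guess):

Code:
# Uncompressed bandwidth for 7680x2160 @ 240 Hz, 8-bit RGB,
# ignoring blanking overhead - so this is a lower bound.
h, v, hz, bpp = 7680, 2160, 240, 24
gbps = h * v * hz * bpp / 1e9
print(f"{gbps:.1f} Gbps uncompressed")  # ~95.6 Gbps
print(f"{gbps / 48:.1f}x HDMI 2.1's 48 Gbps link")  # ~2.0x
# So the mode only exists with DSC (~3:1 "visually lossless"
# compression), and vendors differ in which DSC modes they expose.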
 