
NVIDIA 5000 SERIES

It's hilarious, I have the MSI X870 Tomahawk and even after reading the official paperwork about the extra 8-pin slot... I still don't know what the hell it does. lol

"Usually, the motherboard’s 24-pin 12V power connector can only supply a maximum of 168W. While this might be enough for basic operations, it falls short when trying to drive everything to its peak—especially with fans, RGB lighting, and a beastly GPU like the 4090 connected."

But that's why the GPU has its own power supply, I don't get it.

I did not populate that extra 8 pin slot. All running great regardless. Even my beastly GPU, the RTX 4090.
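Out of curiosity, here's a hypothetical 12 V budget sketched against the 168 W figure from the quoted manual text. The fan and RGB numbers are pure guesses for illustration, not measurements:

```python
# Hypothetical 12 V budget vs the 168 W the quoted text says the
# 24-pin can supply. Loads marked "guess" are illustrative only.
loads_w = {
    "PCIe slot (GPU)": 75,  # PCIe CEM slot ceiling
    "front USB-C PD": 28,   # Quest 3 charging, per a post in this thread
    "fans + pumps": 30,     # guess
    "RGB + misc": 15,       # guess
}
total = sum(loads_w.values())
print(total, "W ->", "within budget" if total <= 168 else "needs the extra 8-pin")
```

Even with everything guessed on the generous side it stays under 168 W, which would square with the extra 8-pin doing nothing noticeable in a typical build.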
 
Some people guessed it was to supply more power to the USB 4 ports, but it didn't seem to, at least not at the levels I was hoping for. I use the front USB-C port, which provides 28 watts, to power my Quest 3, and I leave the extra 8-pin connected at the moment but don't really think it is doing anything.

Edit - Noticed yesterday that MSI have released an X870E version of the Tomahawk, which seems identical apart from having fewer restrictions on the NVMe slots. A bit strange. I wonder if the X870 version has been discontinued, because it is never available.
 
But the motherboard's PCIe slot is only rated for 75 W, and in testing GPUs actually pull much less than that, as they are configured to draw almost all of their power over their own 8-pin PCIe connectors (or 12VHPWR, as they are now switching to). GPUs capable of using more power have their own extra connectors.

That depends on the GPU and its configuration/features - some still pull 50-60 W over the PCIe slot. I can't remember the specific details now, but some of the 2000 or 3000 series cards had a dedicated power circuit from the slot for certain components, and there was an issue with some drawing too much power from the slot.

EDIT: My 4080 Super, in comparison, rarely draws more than 6 W from the slot, with the bulk coming from the 12VHPWR connector.
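The split being described works out like this - a rough sketch assuming the commonly cited 75 W CEM slot ceiling and a ~450 W 4090 board power (neither measured here):

```python
# How a GPU's board power splits between the PCIe slot and its own
# connector(s), assuming the 75 W slot ceiling from the PCIe CEM spec.
SLOT_LIMIT_W = 75

def connector_draw_w(board_power_w, slot_draw_w):
    """Watts the 8-pin/12VHPWR connector(s) must cover."""
    assert 0 <= slot_draw_w <= SLOT_LIMIT_W, "slot draw out of spec"
    return board_power_w - slot_draw_w

# A ~450 W 4090 pulling only ~20 W from the slot leaves ~430 W
# for the 12VHPWR connector:
print(connector_draw_w(450, 20))
```

Which is why the slot's 75 W limit barely matters: the card's own connector does nearly all the work.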
 
For the 5070ti watchers.


Considering that's around the time stock is expected to start appearing for other GPUs, I wonder if the 5070 Ti might actually be a real (if limited) launch and not just an immediate pre-order.
 
What's interesting is that many X870E boards share PCIe lanes: yes, there's a PCIe 5.0 x16 slot, but as soon as you add an M.2 drive it drops to x8, which is basically PCIe Gen 4 x16 bandwidth.

So PCIe lane sharing is a non-issue.
That's why I went for an Asus B650E-E - you get the graphics and M.2 slots at full PCIe 5.0 speed, but the SATA port speeds are halved, which is no big deal.
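The "x8 Gen 5 ≈ x16 Gen 4" point checks out on paper - a quick sketch using the standard per-direction bandwidth formula (transfer rate × lanes × 128b/130b encoding efficiency):

```python
# Per-direction PCIe bandwidth in GB/s: GT/s x lanes x 128b/130b
# encoding efficiency (Gen 3 and later), divided by 8 bits/byte.
def pcie_gb_s(gt_per_s, lanes):
    return gt_per_s * lanes * (128 / 130) / 8

gen4_x16 = pcie_gb_s(16, 16)
gen5_x8 = pcie_gb_s(32, 8)
print(round(gen4_x16, 1), round(gen5_x8, 1))  # identical: ~31.5 GB/s each
```

So dropping to x8 when an M.2 steals lanes costs nothing versus a Gen 4 board's full x16.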
 
The 4090 also has HDMI 2.1, but Nvidia's implementation of it doesn't allow that resolution at 240 Hz, while it's possible on the HDMI 2.1 ports on the AMD 7900 XTX, for example. Not sure if it's down to DSC, but there was quite a lot of kick-up about it at the time.

Ah yeah, that was it - it was Nvidia's vs Samsung's interpretation of DSC.
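For anyone wondering why DSC enters into it at all, a back-of-envelope (active pixels only - real HDMI timings add blanking, so the true requirement is even higher):

```python
# Uncompressed video bit rate from active pixels only; real HDMI
# timings add blanking overhead, so this understates the need.
def raw_gbit_s(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

need = raw_gbit_s(3840, 2160, 240, 30)  # 10-bit RGB = 30 bpp
print(round(need, 1))        # ~59.7 Gbps
print(need > 48)             # True: over HDMI 2.1's 48 Gbps FRL cap,
                             # so 4K 240 Hz needs DSC either way
```

Since the raw stream already exceeds the link, it comes down to how each vendor's DSC implementation behaves - hence the Nvidia vs Samsung disagreement.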

On nvidia's site it says this:

5090
[spec screenshot: 6eZkfqG.png]

4090
[spec screenshot: 1lK21bc.png]
 
Whilst we're on the topic, shouldn't Nvidia be putting HDMI 2.2 on these £3000 graphics cards? They are obsolete before they're even released

DP 2.1 is what you should be using...

It would make sense if TVs and monitors already supported it, but they don't.
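For context on the DP 2.1 suggestion, a rough effective-payload comparison using the encoding overheads from the respective specs (HDMI 2.1 FRL uses 16b/18b coding, DP 2.0/2.1 uses 128b/132b):

```python
# Max effective payload after line-code overhead, in Gbps.
links_gbps = {
    "HDMI 2.1 FRL 48G": 48 * 16 / 18,  # 16b/18b coding
    "DP 2.1 UHBR20": 80 * 128 / 132,   # 128b/132b coding
}
for name, gbps in links_gbps.items():
    print(name, round(gbps, 1))  # ~42.7 vs ~77.6
```

A full-rate DP 2.1 port already carries far more than HDMI 2.1 - the catch, as noted above, is finding a display that supports it.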
 
For the 5070ti watchers.


Cheers for that.

Cynically, do we think a review embargo on the same day as release suggests an expected poor review? Guess we will get an idea when we see the 5080 reviews...

There's no FE for the 5070 Ti, and with the comments about Nvidia not leaving partners much margin to sell at MSRP, I'd guess 19th Feb options/reviews will be limited?

All Nvidia's 5070 Ti comparisons I've seen are against the 4070 Ti (presumably the non-Super version). Given the 4070 Ti Super is approx 10% faster than the non-Super, and Nvidia's non-DLSS4 charts show the 5070 Ti at about 10-15% over the 4070 Ti, do we just assume the 5070 Ti will be approx 4070 Ti Super level, and nowhere near the 4080?
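Chaining those rough percentages together (all the numbers below are the guesses from the paragraph above, not benchmarks):

```python
# Ratios relative to the plain 4070 Ti, per the estimates above.
uplift_5070ti_lo, uplift_5070ti_hi = 1.10, 1.15  # Nvidia's non-DLSS4 charts
uplift_4070tis = 1.10                            # Super over non-Super

vs_super_lo = uplift_5070ti_lo / uplift_4070tis
vs_super_hi = uplift_5070ti_hi / uplift_4070tis
print(round(vs_super_lo, 3), round(vs_super_hi, 3))  # 1.0 to ~1.045
```

i.e. anywhere from dead level with the 4070 Ti Super to ~4.5% ahead - which would indeed leave it well short of the 4080.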
 
It's odd that there's a 5070 Ti MSRP embargo date without an FE being made, which makes me wonder whether there will be an AIB model selling for MSRP?
 
All Nvidia's 5070 Ti comparisons I've seen are against the 4070 Ti (presumably the non-Super version). Given the 4070 Ti Super is approx 10% faster than the non-Super, and Nvidia's non-DLSS4 charts show the 5070 Ti at about 10-15% over the 4070 Ti, do we just assume the 5070 Ti will be approx 4070 Ti Super level, and nowhere near the 4080?
I expect it to be near the 4080, given the same 300 W power budget. Unlikely to have a regression in perf/watt.
A decent uplift in memory bandwidth, plus headroom to hit higher clocks on fewer SMs, might translate to some surprises.
Should know for sure after the 5080 reviews.
 
The 5090's GDDR7 memory clock is 1750 MHz, and the FE reviews mentioned it was quite toasty, so perhaps the 5080 at 1875 MHz will already be at its limit. It looks like Nvidia have already overclocked the 5080's memory to gain a bit of performance, which is probably a bad sign for its expected performance, and would mean the 5070 Ti with its 1750 MHz memory clock could probably catch up bandwidth-wise to the 5080 with an overclock.

Or do you think the 5080 uses better binned GDDR7 than the 5090/5070 Ti?
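Putting numbers on that: GDDR7's effective data rate is commonly reported as 16× the base memory clock, and the public bus widths are 512-bit on the 5090 and 256-bit on the 5080 and 5070 Ti, so:

```python
# Memory bandwidth: (clock MHz x 16) Gbps-per-pin x bus width / 8.
# The 16x multiplier is the commonly reported GDDR7 figure; bus
# widths are from public spec sheets.
def bandwidth_gb_s(mem_clock_mhz, bus_bits):
    data_rate_gbps = mem_clock_mhz * 16 / 1000
    return data_rate_gbps * bus_bits / 8

print(bandwidth_gb_s(1750, 512))  # 5090:    1792.0 GB/s
print(bandwidth_gb_s(1875, 256))  # 5080:     960.0 GB/s
print(bandwidth_gb_s(1750, 256))  # 5070 Ti:  896.0 GB/s
```

So a ~7% memory overclock on the 5070 Ti (896 → 960 GB/s) would close the bandwidth gap to a stock 5080, as suggested.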
 