
Blackwell GPUs

The power draw figures are almost certainly the maximum allowed by the BIOS, as with the 4090, which means the dual-slot cooler design holds more merit: the actual draw will be far lower, just like the 4090's, and maybe lower still thanks to the updated process and design. I suspect the main thing to cool will be the VRAM, while the core stays very efficient.

As the comments state, and as we all know, you can drop a 4090's power limit by up to 30% in MSI Afterburner and lose barely 5% performance. Undervolting plus overclocking is the best way to go about it, of course, since you get the power drop and a framerate increase.
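For what it's worth, the "-30% power for ~5% performance" claim quoted above implies a big efficiency gain. A quick back-of-envelope sketch (the 450W figure and percentages are from the thread, not measurements):

```python
# Rough perf-per-watt arithmetic for the 4090 power-limit claim above.
stock_power = 450.0            # W, 4090 stock board power
stock_perf = 100.0             # normalized performance

limited_power = stock_power * 0.70   # 30% power-limit drop in Afterburner
limited_perf = stock_perf * 0.95     # ~5% performance loss

stock_eff = stock_perf / stock_power
limited_eff = limited_perf / limited_power
gain = limited_eff / stock_eff       # relative perf/W improvement

print(f"perf/W gain: {gain:.3f}x")   # about 1.36x, i.e. ~36% better perf/W
```

So even taking the thread's numbers loosely, a power-limited card comes out roughly a third more efficient per watt.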
 
AMD needs to innovate harder; Nvidia is holding back, I guess. A downgraded CUDA count, if true, is the biggest dampener leading up to launch.

Why does it need to be blamed entirely on AMD? AMD has always been primarily a CPU company that makes dGPUs on the side. Its fortunes always improved when the CPU side did well, and that is its focus. AMD and Intel are not going to save pricing - it's clear their priorities are elsewhere now.

In the end, Nvidia is also selling to a lot of its own customers who don't care about AMD.

You had the same thing with people making excuses for Zen 3 and Zen 4 pricing being high because Intel wasn't competing. Apple fans keep making excuses about Samsung, etc., for why their latest iProduct is this or that. No, it was 100% on AMD or Apple thinking their customers were mugs.

People have forgotten that Nvidia launched one of its greatest mainstream cards, the 8800 GT, nearly a year after ATI failed miserably with the 2900 XT. If Nvidia, AMD or Intel chooses to milk its customers, that is on them.

They can choose not to. Just as people can choose to buy, or not buy, at the price.

Nobody "needs" to run their games at 4K max settings every generation. The fact that gamers can't help themselves is why all these companies try to pull a fast one every generation.
 
AMD needs to innovate harder; Nvidia is holding back, I guess. A downgraded CUDA count, if true, is the biggest dampener leading up to launch.
The weird thing is... the direction Nvidia are allegedly taking with the 5090 is absolutely not what I would have expected to happen when AMD stopped competing at the top end. I'd always understood that the ridiculous power draw on some recent GPUs and CPUs, which often yields very little performance gain (my 4090 is capped at 85% and barely loses any performance as a result) was about pushing those last few frames for the review benchmarks. Given the 5090 probably won't be subject to any meaningful benchmark comparisons against AMD cards (at least until closer to the launch of the 6000-series), I'm actually surprised that they'd go so far in that direction.
 
A 2-slot 32GB card would be incredible for 3D artists like myself (as long as it doesn't sound like a jet engine), given what a comparative faff it is to run 2 x 4090s. Shame about the linear power/CUDA core increase, but it might still be worth it for the form factor and VRAM alone.
 
The power draw figures are almost certainly the maximum allowed by the BIOS, as with the 4090, which means the dual-slot cooler design holds more merit: the actual draw will be far lower, just like the 4090's, and maybe lower still thanks to the updated process and design. I suspect the main thing to cool will be the VRAM, while the core stays very efficient.

As the comments state, and as we all know, you can drop a 4090's power limit by up to 30% in MSI Afterburner and lose barely 5% performance. Undervolting plus overclocking is the best way to go about it, of course, since you get the power drop and a framerate increase.

The 4090 is 450W and 3-4 slots.

You're trying to say 600W will actually use something like 250W, which is what it would need to be to justify a 2-slot cooler.

Also, hate to break it to you, but GPU coolers are built to deal with the worst case, not the average - so if a GPU can pull 600W, it needs to be able to cool 600W, otherwise you get tons and tons of warranty claims.

Undervolting is irrelevant; everyone knows Nvidia and AMD overvolt their cards - they will never undervolt. So again, if it's a 600W TDP, the card can and will, in certain cases, close in on that number.
 
The thing is, the xx60-class cards are now absolutely terrible value for money considering just how gimped they are. The disparity between the top end and the bottom end has never been greater, and it's a deliberate strategy: it pushes people who probably shouldn't be spending £1.5-2k on a GPU towards one, so they can tell themselves they're actually getting more for their money.
That's correct, and that is also why AMD claim they will dominate that part of the market with much better value products in the future, and why they don't want to aim at the high end. We'll see. It's still the price bracket that the huge majority aim at (per Steam stats, retailers and AMD's own words).

I personally know people who bought 4090s who are/were on minimum wage or close to it, and I know they were already in debt. Nvidia is capitalising on the haves and have-nots divide.
Yeah, luxury products are often purchased by people who can't really afford them at all - it's not just the case with GPUs but with everything.
 
The 4090 is 450W and 3-4 slots.

You're trying to say 600W will actually use something like 250W, which is what it would need to be to justify a 2-slot cooler.

Also, hate to break it to you, but GPU coolers are built to deal with the worst case, not the average - so if a GPU can pull 600W, it needs to be able to cool 600W, otherwise you get tons and tons of warranty claims.

That is assuming the 600W rumour is true - it might end up being a load of nonsense too.

That's correct, and that is also why AMD claim they will dominate that part of the market with much better value products in the future, and why they don't want to aim at the high end. We'll see. It's still the price bracket that the huge majority aim at (per Steam stats, retailers and AMD's own words).

The problem is that AMD is capacity-limited too, which is another issue, with the same capacity vying for CPUs, consoles and their commercial dGPUs as well. I really hope they can actually explore avenues for second sourcing.
 
That is assuming the 600W rumour is true - it might end up being a load of nonsense too.
The 600W figure hasn't been clarified - it could just be the maximum power limit in the BIOS, as on the 4090. However, with more transistors on pretty much the same process, it has to use more power - we aren't going to break physics here.
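To put a rough number on the physics point: at the same process node and similar clocks, dynamic power scales roughly with the amount of active silicon. A first-order sketch, where the scaling ratio is a placeholder and not a leaked spec:

```python
# First-order power scaling at iso-process, iso-clock:
# dynamic power grows roughly in proportion to active transistor count.
# The ratio below is a hypothetical illustration, not a 5090 spec.
p_old = 450.0          # W, 4090-class board power
scale = 1.3            # hypothetical increase in cores/transistors
p_new_est = p_old * scale

print(f"naive estimate: {p_new_est:.0f} W")  # 585 W, before any efficiency gains
```

Any real efficiency improvements (design tweaks, GDDR7, binning) would pull the actual number below this naive estimate, which is the point being argued above: the chip gets bigger, but it can't get bigger for free.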
 
The 600W figure hasn't been clarified - it could just be the maximum power limit in the BIOS, as on the 4090. However, with more transistors on pretty much the same process, it has to use more power - we aren't going to break physics here.

The GDDR7 memory subsystem should, in theory, use less power - unless, of course, GDDR7 has issues we are unaware of.
 
Given these leaks I'm REALLY tempted to pull the trigger on buying a 4080 Super. Getting more than a 50% performance increase for the same power draw seems like a bargain right now... I don't know a lot about the subject, but it feels like companies are beginning to reach the limits of what they can draw out of the current generation of GPUs without just demanding more power for the GPU/CPU to get higher performance. Maybe in the 6000 generation? lmao
 