Nvidia now at 90% market share.

AMD and 4090 400W?

3090 Ti: Hold my beer.

In summer the heat was unbearable. I am not a summer/hot-weather person, so you can imagine what it was like with a 12900K and a 13900K, each driving a 3090 Ti. The 12900K was not so bad by itself, but the 3090 Ti... I can't remember if the Suprim X 3090 Ti had a 500W limit, but the ASUS Strix 3090 Ti had a whopping 600W limit.

The 4090 Strix has a 600W limit, but you never get even remotely close to it. With the 3090 Ti version, though, there were many times you easily hit 520W, and when you overclocked it... it was like the Infinity Gauntlet, but for all the wrong reasons. Insane power draw, and ironically the most insane coil whine you have ever heard. I have not heard any card from AMD or NVIDIA, from any other brand or specific model, with coil whine like this one had. I have posted in the past about how the Strix 4090 and Suprim X had bad coil whine for flagship cards, but as bad as they were, they weren't even close. In the end I had to RMA it, and when the retailer phoned to ask why I was returning it, I asked whether they had tested it; they said yes, and that it worked fine. I asked the technician if they had ever heard coil whine like that, they said never, and my RMA was approved.

Tbh the card worked fine and ran cool, but that coil whine was unlike any I have heard before or since.
 
I run a 4080 with an adapter from a 525W PSU with no issues, alongside a 5800X3D. A close-to-200W Intel CPU plus a 7900 XTX bordering on 400W would not have been possible, not to mention heating up the room and the noise to cool it. I know, no one cares, and yet 90% do to some extent. :)
I snipped the post to save space. People argued this to death in the CPU/GPU subforum, so people do care, and as you pointed out there are multiple areas to consider with power usage. When I was deciding between a 3060 Ti and a 6700 XT a year ago, although the average power consumption was close, the power spikes of the 6700 XT were nearly 330W vs the 3060 Ti's 220W, and the price sealed the deal as I got the 3060 Ti used for £190 vs the 6700 XT's £250+ used.

Upgrade-wise, I could manage now with a 4070, or a 4070 Super at a push, on my 500W PSU. Any AMD upgrade that may be cheaper at the moment needs a near-£100 new power supply to go with it, which is why I'm seeing what RDNA4/Blackwell options surface, though the rumours suggest Blackwell is turning the power dial up, so I'll wait and see. There's a rough headroom sketch below for anyone curious how I'm judging it.
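
For anyone wondering how I'm judging the headroom, here's a back-of-envelope sketch in Python. The CPU figure, the 75W allowance for the rest of the system and the 80% sustained-load margin are my own assumptions rather than anything official; the spike numbers are the ones I quoted above.

# Rough PSU headroom check. The margin and 'other' allowance are
# illustrative assumptions, not vendor guidance.
def psu_ok(psu_watts, cpu_watts, gpu_watts, gpu_spike_watts,
           other_watts=75, margin=0.8):
    """True if sustained draw fits within margin and spikes fit the rating."""
    sustained = cpu_watts + gpu_watts + other_watts
    worst_case = cpu_watts + gpu_spike_watts + other_watts
    return sustained <= psu_watts * margin and worst_case <= psu_watts

# The 3060 Ti vs 6700 XT call above, with a ~125W CPU assumed:
print(psu_ok(500, 125, 200, 220))  # 3060 Ti: True
print(psu_ok(500, 125, 230, 330))  # 6700 XT: False, over budget either way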

To bring this back on topic, I think this is why Nvidia's market share is where it is: there have nearly always been multiple upgrade options for me with them, and apart from the RX 6600 I had, AMD have not been a viable option for a couple of generations now without having to upgrade, or consider upgrading, other parts of my computer at the same time.
 
Pages of power consumption arguments again...

Yet last generation, when Nvidia went for the cheaper Samsung process for their consumer cards and hence had worse perf/watt, power consumption was barely mentioned?

In other words, when AMD had better power consumption it was ignored. And the Nvidia 8GB and 10GB cards vastly outsold AMD's 12GB and 16GB cards despite worse power efficiency and less VRAM, often laughably so.

Yes, in the UK AMD's supply of RDNA2 was poor, but there is more going on.
 
Since it was the crypto boom, they've probably sold everything they could make anyway.
 
In all fairness, the power consumption difference between RDNA 2 and Ampere was quite a bit smaller than between RDNA 3 and Ada.

Plus the energy pricing crisis hadn't hit us yet back then.
 
People only seem to care when it's AMD that has the higher consumption; when it's Nvidia, there's suddenly a deathly silence, which is why I regard all these arguments as nonsense, endless whataboutism.
Plus I guess most people undervolt as well, so book figures don't really mean anything at the end of the day compared to how people actually use them. I guess it's nice to aim for and achieve efficiency though. Case in point: my GPU was reviewed hoovering up nearly 400W, and the recommended PSU is 1kW. I run it undervolted and it now sips 200W. :)
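
To put a number on what that halving is worth, here's a quick back-of-envelope in Python; the weekly gaming hours and the unit price are assumptions for illustration, so swap in your own.

# Yearly running-cost difference from the undervolt above
# (400W stock vs 200W undervolted). Hours and tariff are assumed.
stock_w, undervolted_w = 400, 200
hours_per_week = 20    # assumed gaming time
price_per_kwh = 0.28   # assumed UK-ish tariff, GBP

def yearly_cost(watts):
    return watts / 1000 * hours_per_week * 52 * price_per_kwh

saving = yearly_cost(stock_w) - yearly_cost(undervolted_w)
print(f"~£{saving:.0f} saved per year")  # ~£58 with these assumptions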
 
Not just undervolting but also downclocking, with next to no performance regression. Undervolting has benefits on both Nvidia and Radeon GPUs, but I'm not sure how Nvidia GPUs react to downclocking the core. RDNA2 and 3, however, ffing love it. Once you know this you can get some pretty amazing results, even if your GPU is a dud like mine when it comes to undervolting.
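
For what it's worth, the reason a small downclock pays off so disproportionately is that dynamic power scales roughly with frequency times voltage squared. A tiny Python sketch; the 5% clock / 10% voltage figures are illustrative assumptions, not measurements from my card.

# Dynamic power scales roughly as P ~ f * V^2, so a small frequency
# drop that unlocks a lower stable voltage compounds nicely.
def relative_power(freq_scale, volt_scale):
    return freq_scale * volt_scale ** 2

# Give up ~5% clock, which lets the core hold ~10% less voltage:
print(f"{relative_power(0.95, 0.90):.0%} of stock power")  # ~77%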
 
Wonder what would happen if AMD just stopped bothering with the PC GPU market entirely. Would Nvidia then jack their already ridiculous prices up even higher? And wouldn't that make PC gaming even more of a niche hobby for rich people than it already is?
 
Really? Given all the people that buy pre-builts and laptops?
Personally, I'd guess that outside of enthusiasts (who I imagine make up a small percentage of PC owners), most people don't tweak anything.
99.99999% don't touch a thing, and that includes BIOS updates.
 
The question is: as it stands, do Nvidia base their prices on anything AMD do?
Nvidia release first, and then AMD do a paper launch where they knock £1 off Nvidia's prices. And then, a few months later when AMD drop their prices, do Nvidia respond? The 4080 SUPER was probably the closest we got to that, but was that in response to AMD's prices or to poor sales?

But you never know; maybe they're enough to stop Nvidia setting really high prices. But then Nvidia must know that, even without a rival, consumers wouldn't go for silly prices in any reasonable numbers. Still, maybe that would allow them to dedicate more chips to their business-orientated stuff.
 