
Nvidia now at 90% market share.

I wonder if we're paying too much attention to gaming performance and not enough to everything else? Nvidia GPUs also support things like RTX Voice, Nvidia Broadcast, NVENC, CUDA, and more. I think the first two are especially important for the home worker. CUDA is a de facto standard, and NVENC is pretty good.
 
I just looked it up, TSMC N4 is 22% more power efficient than N5.

That's only a partial explanation. Look at how much a 4090 consumes at full throttle versus a 7800 XT, and then how it drops below the 7800 XT in power consumption when limited to 1080p@60fps; surely there's some different "magic" going on there.
 
One thought: is that 90% of the PC market, excluding PCs with integrated graphics?
If so, it ignores a lot of the business market, and also the console market. Do AMD still supply GPUs for the current range of consoles?
 
Any idea of how many consoles were sold last year and how many standalone GPUs? That would be an interesting comparison.
 
That's only a partial explanation. Look at how much a 4090 consumes at full throttle versus a 7800 XT, and then how it drops below the 7800 XT in power consumption when limited to 1080p@60fps; surely there's some different "magic" going on there.

7800 XT: 92 watts
4090: 79 watts, a 16% difference.

7800 XT: 92 watts
4070 Ti Super: 77 watts, a 19% difference.

7800 XT: 92 watts
4070 Super: 68 watts, a 35% difference.
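
If it helps, here is a quick arithmetic sketch (Python, purely illustrative) reproducing those percentages from the quoted frame-capped wattages; the percentages are the 7800 XT's draw relative to each Nvidia card:

```python
# Sanity-checking the percentages above: how much more the 7800 XT draws
# than each Nvidia card at the 1080p@60fps cap, using the quoted wattages.
capped_watts = {
    "4090": 79,
    "4070 Ti Super": 77,
    "4070 Super": 68,
}
amd_watts = 92  # 7800 XT

for card, w in capped_watts.items():
    extra_pct = (amd_watts - w) / w * 100
    print(f"7800 XT vs {card}: {amd_watts} W vs {w} W ({extra_pct:.0f}% more)")
```

Running that gives roughly 16%, 19% and 35%, matching the figures above.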

There is probably quite a lot going on. The largest part, I would say, is the 22% difference between N4 and N5. The 4070 Super is also missing one 64-bit IMC and two memory ICs compared with the 7800 XT; you can see that not just in the significant gap to the 7800 XT but also to the 256-bit, 16GB 4070 Ti Super.
The 7800 XT is also MCM, and MCM uses a little more power than monolithic. For some reason AMD also have a tendency to overvolt their GPUs at stock; we have known this for many years. I was asked to test the power consumption and performance of my 7800 XT undervolted, and it went from 250 watts to 170 watts with a 2% performance gain.

There is nothing 'magic' going on with Nvidia's GPUs. Yes, they are more efficient, but most of that is down to the difference between N4 and N5, and the rest of it is easily explained.
Look at AMD's APUs: they have the same RDNA 3 graphics, everyone agrees they are incredibly efficient, and they run AAA games at 1080p low settings, 60 Hz, on about 15 watts. The performance per watt is way better than Nvidia's discrete GPUs.

RDNA 4 should be on N4 and probably monolithic; that will be a true comparison. Wait to see that before disparaging AMD's GPUs. I think you would be dead wrong to say they aren't efficient.
Right now we are comparing apples to oranges.
 

The 4070 Ti Super takes 304w in regular gaming, so at full load it's the one that needs more power. It goes from roughly 22% more power for the 4070 Ti Super at full load to 19% more power required for AMD when frame-capped.

4090 vs 7800 XT, gaming: 411w vs 250w
4090 vs 7800 XT, RT: 451w vs 250w

4090 vs 7800 XT, gaming at 1080p@60fps: 79w vs 92w

So it goes from the 4090 drawing 1.64x the power (or even 80% more in RT) to the 7800 XT drawing 16% more power when frame-capped. nVIDIA can save a huge amount of power that way...

4080 vs 7900 XTX, gaming: 304w vs 360w (the process difference is probably what you'd see here).
4080 vs 7900 XTX, gaming at 1080p@60fps: 68w vs 113w

An 18% difference at full load (the extra power for AMD perhaps explained by the process node) vs a 66% difference at the lower load (AMD needing more than the node alone would explain).

Point is... nVIDIA can save more power than AMD in lighter loads.
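
As a purely illustrative sketch of that point, here is the same arithmetic in Python, using only the full-load and frame-capped wattages quoted above, showing what fraction of its full-load draw each card sheds under the cap:

```python
# Rough illustration of the point above, assuming the wattages quoted in
# this thread (full-load gaming vs a 1080p@60fps frame cap) are accurate.
pairs = [
    # (card, full-load watts, frame-capped watts)
    ("RTX 4090",    411, 79),
    ("RX 7800 XT",  250, 92),
    ("RTX 4080",    304, 68),
    ("RX 7900 XTX", 360, 113),
]

for name, full, capped in pairs:
    saved = (1 - capped / full) * 100
    print(f"{name}: {full} W -> {capped} W when capped ({saved:.0f}% less)")
```

That works out to the Nvidia cards shedding roughly 78-81% of their full-load draw under the cap versus roughly 63-69% for the AMD cards, which is the lighter-load gap being described.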

One place where AMD seems to have an edge (very small, at a few W) is idle.

As for AMD overvolting their cards at stock, you can also save power on nVIDIA by undervolting.

 
@Calin Banc, instead of us just going round in circles, let me ask why this is so important to you? No one prompted it, and no one was trying to flex the power efficiency of one or the other, at least not before you did, because no one cares other than, apparently, you. Why?

I'm struggling to understand the point you're making and why. If I did, perhaps we could go from there.
 
Well, go back from where it started:


and on my post you've said


Basically, you could say it's just another minus for AMD amongst others. How much that or the others matter to you is another point entirely. But all together they matter to the average Joe, or else nVIDIA would not have 90%.

For me it is important, as I don't want to heat up the room for nothing, spend money on energy, and put heavier stress than required on the PSU. If that doesn't matter, why would it matter that Intel is more power hungry than AMD?
 
@GoogalyMoogaly meant nothing by it, even stating it's probably nothing, which is why no one responded to it.
What GoogalyMoogaly is saying is pretty straightforward: play a game and a 14900K will pull about 200 watts to the Ryzen 7800X3D's 60 watts. That's not some abstract "if X and Y are true then this" nonsense. People do care about differences like that; they don't care about abstract scenarios contrived to produce small differences in a way they don't even use the thing. For one, you can easily run the Ryzen CPU on this £30 cooler... for the 14900K you need a £100 cooler minimum.

The 7800 XT and 4070 Super are similar in performance and the 7800 XT uses 30 watts more power... it's not worth getting excited over, so no one cares.
 
And then you have 297w vs 512w, or 292w vs 403w, or 379w vs 512w, or 419w vs 513w for the 4080 vs 7900 XTX in that example, which is important to me since I was deciding between a 4080 and a 7900 XTX around launch time back then... that is no 30 watts.
 

This crap only matters to people who have already decided they don't want an AMD GPU. They are saying the 7900 XTX uses 220 watts more than the 4080, and that's abstract: let's say the rest of the system is 100 watts; how did they get the 300-watt 4080 to use only 200 and the 360-watt 7900 XTX to use nearly 400? Anyone with any creases in their brain will see that and immediately file it under obvious garbage.

People just don't take this stuff seriously.
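
For what it's worth, here is that back-of-the-envelope check as a small Python sketch. The 100 watt rest-of-system figure is the assumption made above, not a measurement, and the 292w/512w pair is taken from the whole-system figures quoted earlier:

```python
# Sketch of the back-of-the-envelope check above: subtract an assumed
# ~100 W "rest of system" from the quoted whole-system figures and compare
# the implied GPU-only draw with the board-power figures used above.
REST_OF_SYSTEM_W = 100  # assumption from the post above, not a measurement

builds = {
    # build: (quoted system draw in W, GPU board power figure used above)
    "4080 system":     (292, 300),
    "7900 XTX system": (512, 360),
}

for name, (system_w, rated_w) in builds.items():
    implied_gpu_w = system_w - REST_OF_SYSTEM_W
    print(f"{name}: implied GPU draw ~{implied_gpu_w} W (board power ~{rated_w} W)")
```

Under that assumption the 4080 comes out around 190 W and the 7900 XTX around 410 W, which is exactly the "only 200" and "near 400" oddity being objected to.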
 
Look at the video; it gets worse for the cards alone. Measuring just the GPU with PCAP in CS:GO at 1440p low, for example, you have 65w for the 4080 and 213w for the 7900 XTX for similar performance (the 4080 is a bit ahead, but it doesn't really matter): 5.47 fps/w vs 1.64 fps/w for AMD. You can call it driver overhead (if that's what it truly is), since nVIDIA was loaded at 32% vs 67% for AMD, but then again, if you threw a more performant CPU at it, it would just destroy AMD performance-wise in that scenario.
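
As a rough cross-check (illustrative only), multiplying the quoted GPU-only wattages by the quoted fps-per-watt figures gives back roughly the same frame rate for both cards, which is consistent with the "similar performance" claim:

```python
# Working backwards from the quoted CS:GO figures, assuming the GPU-only
# readings (65 W and 213 W) and the fps-per-watt numbers are as reported.
cards = {
    "RTX 4080":    {"gpu_watts": 65,  "fps_per_watt": 5.47},
    "RX 7900 XTX": {"gpu_watts": 213, "fps_per_watt": 1.64},
}

for name, d in cards.items():
    implied_fps = d["gpu_watts"] * d["fps_per_watt"]
    print(f"{name}: ~{implied_fps:.0f} fps from {d['gpu_watts']} W")
```

Both work out to roughly 350 fps, so the efficiency gap really is about power, not frame rate.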

In that CS:GO example the fans were off the entire time on the 4080, so passive, silent gaming. Not the same on AMD.

Then coil whine is where AMD "wins" as well.

OW2: 218w for the 4080 vs 383w for AMD. Or AMD pulling in the menu the same as the 4080 pulls in game.

And if you ask "how come AMD pulls 383w?", well, it varies per model; apparently it can go up to 400w.


[Image: power-gaming.png (gaming power consumption chart)]
 

You're just doubling down on this abstract nonsense. Only you care.
 
Intel's power issues were real because 13th and 14th gen had a voltage bug.
Nvidia's power issues are real because 12vhpwr connectors are melting.
Radeon don't have a problem.
Yeah, I don't know why they bother with the power consumption test. I mean, those differences in relatively light games like AW2, 50w vs ~130w for Intel, aren't relevant - or 80w vs 200w. Heck, even that 200w will keep you warm during winter! :D

The melting connectors were prone to happen due to improper installation. AMD's hot spot bug was down to improper QA testing in an actual case - since apparently it wasn't an issue if the card was installed vertically.

I run a 4080 off a 525w PSU with an adapter, alongside a 5800X3D, with no issues. A close-to-200w Intel CPU plus a 7900 XTX bordering on 400w would not have been possible - not to mention heating up the room and the noise to cool it. I know, no one cares, and yet 90% do to some extent. :)

[Image: power-games-compare-vs-14900k.png (CPU power in games compared against the 14900K)]
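
As a rough, illustrative power-budget sketch of that comparison: the GPU figures are the ones quoted earlier in the thread, while the CPU gaming draw for the 5800X3D and the rest-of-system allowance are assumptions made purely for the sake of the arithmetic:

```python
# Rough power-budget sketch. GPU figures come from numbers quoted earlier
# in the thread; the 5800X3D gaming draw and rest-of-system allowance are
# assumptions for illustration only.
PSU_W = 525
REST_W = 60  # assumed: motherboard, RAM, drives, fans

builds = {
    "5800X3D + RTX 4080":         {"cpu_w": 90,  "gpu_w": 304},  # cpu_w assumed
    "~200 W Intel + RX 7900 XTX": {"cpu_w": 200, "gpu_w": 390},  # "bordering 400w"
}

for name, b in builds.items():
    total = b["cpu_w"] + b["gpu_w"] + REST_W
    verdict = "within" if total <= PSU_W else "over"
    print(f"{name}: ~{total} W estimated load, {verdict} a {PSU_W} W PSU")
```

Under those assumptions the first build lands around 450 W, inside the 525 W budget, while the second lands around 650 W, well over it.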
 