How far could and should power consumption go?

Does MPG matter for the best performing cars? No. It's the performance that is the priority.
 
Whilst that is true, let's say for example that the worst guzzler so far, a 3090 Ti in a jacked-up build, consumes say 600 W at the wall while gaming, plus 100 W for the display; you're looking at around 75p for a four-hour session. This was only costing you maybe 30p last year.
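
For anyone wanting to sanity-check that figure, here's a minimal sketch of the sums in Python (the wattages, session length and pence-per-kWh rates are assumptions based on the post above, not measured values):

    # Rough electricity cost for a gaming session (figures assumed, not measured).
    gpu_watts = 600        # assumed wall draw of a heavily overclocked 3090 Ti build
    display_watts = 100    # assumed display draw
    hours = 4              # session length

    kwh = (gpu_watts + display_watts) / 1000 * hours   # 2.8 kWh

    rate_now = 28          # assumed pence per kWh today
    rate_last_year = 11    # assumed pence per kWh a year ago

    print(f"now: ~{kwh * rate_now:.0f}p, last year: ~{kwh * rate_last_year:.0f}p")
    # ~78p now vs ~31p last year, roughly the 75p / 30p quoted above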

Enough for most normal folk to sit up and take note anyhow.

Well, I'll repeat myself: if normal folk have the disposable income to buy a 1k-2k+ card, then the cost of electricity isn't much of an issue. Even after the rise in electricity prices, I think I can still game for about 4 hrs for the price of one coffee in town.

Kaap's monitor has been a 60 Hz model for many years (unless he's upgraded recently?), so he's not using the full TDP of the cards in gaming, unless vsync/gsync is off and he's getting tearing galore :p

I think he used to do it mostly for benchmarks and just for the fun of playing with them, which I can fully appreciate :)

Then you won't use the full power of these cards either, ergo lower power consumption.

Indeed I was and would do so again if the conditions were right.

The cards I ran in 4-way SLI and Crossfire were around 300 W TDP. What this meant is that they gave near-maximum performance with a reasonable cooling solution, and they did not cost ridiculous sums of money. When run in mGPU setups they delivered a lot of performance at a reasonable cost.

Unfortunately Nvidia have got themselves into a situation of diminishing returns by building cards which need huge TDPs, use huge dies and require ridiculous triple-slot coolers.

Nvidia's upcoming cards are a bit like doing your shopping in a tank: you will get the job done and you will be pretty safe from accidents (unless you are passing through Ukraine), but the petrol consumption will be awful. I think most people would rather use something more appropriate, like a small car.

We need to get back to seeing cards rated at no more than 300 W TDP which give 85-90% of the performance of these upcoming cards.
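
As a rough illustration of that diminishing-returns point, here's a hypothetical perf-per-watt comparison (the 450 W flagship figure and the 85% number are assumptions for the sake of argument, not official specs):

    # Hypothetical perf-per-watt comparison (illustrative numbers only, not real specs).
    cards = {
        "450 W flagship": {"tdp_w": 450, "relative_perf": 1.00},  # assumed next-gen flagship
        "300 W card":     {"tdp_w": 300, "relative_perf": 0.85},  # assumed 85% of flagship perf
    }

    for name, c in cards.items():
        print(f"{name}: {c['relative_perf'] / c['tdp_w'] * 1000:.2f} perf per kW")
    # The 300 W card comes out well ahead on perf/W, which is the diminishing-returns argument.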

I believe you had Titans? Those were at least $4,000 and together were running around 1,000 W TDP, give or take. Plus 2 slots per card means 8 slots.

IF these new cards are MCM, then 400-500 W is somewhat normal. After all, cards like the 295X2 and even the R9 390X were power hungry!

We don't know all the details yet, but if the power draw of the ultra high end bothers you, then just buy the card that fits your needs. I'm bothered first of all by their possible price! :))

People pay 20k+ for a new car, but that doesn't mean they want poor fuel efficiency.

20k for a car is what... low to midrange for them? Buying the best of the best from BMW, Audi, Dodge, Ford, whatever (that's over 100k) and then complaining about fuel consumption... :)
 
I take it you never read about the VW emissions scandal? ;)
 
A larger power budget just means they have to invest less in innovative R&D and can instead just brute-force performance, and they don't care how much electricity these use because they are not paying your utility bills.

Think of all the cash Nvidia saved going with Samsung this time around instead of the more efficient but pricier TSMC, yet those savings weren't passed on to customers and instead just went to increased margins.
 
R&D has increased drastically over the last few generations, due to the enormous complexity of smaller processes and the sheer number of transistors, to name just a couple of factors. Architecture design is now extremely advanced; many more thousands of hours go into each generation compared to even 5 years ago.
 
While true, going with Samsung also meant more availability, since all the cards were not manufactured at the same factory.
 
That would have been true if they hadn't sold half of them to the mining farms.

It is true. Some people managed to get FE cards or scalped ones. We probably would have gotten a lot fewer otherwise, or prices would have been even higher.

Of course, the whole selling-to-miners approach meant quick profits, then a chance of a dip if mining goes under.

Bottom line: people would have still bought the 3xxx series even with the current power draw IF the prices had been normal (low).
 