
970's having performance issues using 4GB Vram - Nvidia investigating

Status
Not open for further replies.
Yeah, it's as if anything using more power than a 980 is suddenly an unusable card, but it's only ever the 290 this gets applied to; the fact that the 780 series uses about the same power gets ignored.

Indeed.
The 290X burns about 90 W more than a stock 980 at 100% load.

The price difference is £168 between the cheapest models. Admittedly the GTX 980 I used as a reference is overclocked and draws more power than stock, but I will ignore that.

£168 buys 1,866 kWh in the UK (avg price is £0.09 per kWh).

1,866,000 Wh divided by 90 W equals approximately 20,700 HOURS!!!!

So someone needs to run his GTX 980 at 100% load for 20,700 hours MORE than if he had a 290X, to break even on the premium paid over the 290X.

And to put that into perspective, at 50 hours of gaming per week, that is almost 8 years.

So I wonder, have those who go on about the power consumption gains ever done any maths at school? Or do they plan to keep their cards until 2023?
:rolleyes:
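The break-even arithmetic above can be sketched in a few lines. This is only a restatement of the poster's own figures (£168 premium, 90 W extra draw, £0.09/kWh), not independently verified numbers:

```python
# Break-even calculation from the post above: how many hours at 100% load
# must the 290X's extra draw run before its electricity cost equals the
# GTX 980's price premium? All figures are the poster's assumptions.

price_premium_gbp = 168.0      # price gap between the cheapest models
extra_draw_kw = 0.090          # 290X draws roughly 90 W more at full load
tariff_gbp_per_kwh = 0.09      # assumed average UK electricity price

# Energy (kWh) the premium would buy, then hours of extra draw it covers
kwh_bought = price_premium_gbp / tariff_gbp_per_kwh   # ~1,866 kWh
break_even_hours = kwh_bought / extra_draw_kw         # ~20,700 hours

hours_per_week = 50
years = break_even_hours / (hours_per_week * 52)
print(round(break_even_hours), round(years, 1))       # ~20741 hours, ~8.0 years
```

The exact result is about 20,740 hours, which the post rounds to 20,700; at 50 gaming hours a week that is roughly 8 years, matching the claim.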
 

Yeah, this is all wrong.
 
Why are people talking about the cost of running a 290X over a 970? It has never been about the cost of the additional power consumption, ever. That's an excuse dreamed up by people to dismiss the power consumption argument.

It's about heat output. Not running temperature, either: heat output. It doesn't (shouldn't) matter to most people, but for some, because of space constraints etc., it makes the difference between choosing card A or card B.

Lower power consumption is something everybody should want. Not because it costs less in juice to run, but because less power consumed means more room for something bigger and faster. I.e., if Nvidia hadn't achieved the efficiency improvements with Maxwell, how much further do you think they could have pushed Kepler?
 
Well, what did people do before Nvidia miraculously knocked a few dozen watts off the power consumption of their cards? Was there some sort of GPU dark age where people were unable to SLI/CrossFire without an enormous case lined with fans and a dedicated air conditioning unit? No, people have managed fine up till now, even *gasp* on 780 Tis that used quite a lot of power as well!

Why is it suddenly so enabling now that Nvidia's cards are a bit more efficient? Answer: it isn't.
 
Lower power consumption is something everybody should want. Not because it costs less in juice to run, but because less power consumed means more room for something bigger and faster. I.e., if Nvidia hadn't achieved the efficiency improvements with Maxwell, how much further do you think they could have pushed Kepler?

Spoken like someone who truly supports the green team.
 
An average of "£0.09 per kWh" is just pure nonsense.

I pay 8.9p (£0.089) per kWh.

There are tariffs at 8.3p and there are tariffs at 11p.

It all depends how well you search and what sort of deal you can get. If you pay more than 11p, look to change your provider; they are robbing you.

So even at 11p per kWh:
£168 = approx. 1,530 kWh.

1,530 kWh / 0.090 kW (or 1,530,000 Wh / 90 W, take it as you like) = 17,000 hours.

So you need to use the GTX 980 for 17,000 hours more than a 290X to break even on the premium paid over the 290X.

17,000 hours, at an average of 50 hours of gaming a week at 100% load (constantly), equals 6.5 years. Even I cannot manage 50 hours a week at 100% load, and I use my PC around 70 hours a week.
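The same break-even formula can be re-run with the worst-case 11p/kWh tariff this post cites, again using the posters' own assumed figures (£168 premium, 90 W extra draw):

```python
# Break-even at the upper 11p/kWh tariff mentioned in the post above.
price_premium_gbp = 168.0     # price gap between the cheapest models
extra_draw_kw = 0.090         # 90 W of extra draw, expressed in kW
tariff_gbp_per_kwh = 0.11     # upper end of the UK tariffs cited

break_even_hours = price_premium_gbp / tariff_gbp_per_kwh / extra_draw_kw
years = break_even_hours / (50 * 52)   # at 50 gaming hours per week
print(round(break_even_hours), round(years, 1))  # ~16970 hours, ~6.5 years
```

Even at the most expensive tariff, the break-even point is about 17,000 hours, or roughly 6.5 years of 50-hour gaming weeks at full load, which matches the post's conclusion.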
 