
970's having performance issues using 4GB Vram - Nvidia investigating

Status
Not open for further replies.
Well, what did people do before nVidia miraculously knocked a few dozen watts off the power consumption of their cards? Was there some sort of GPU dark age where people were unable to SLI/Crossfire without an enormous case lined with fans and a dedicated air-conditioning unit? No, people have managed fine up till now, even *gasp* on 780 Tis, which used up quite a lot of power as well!

Why is it suddenly so enabling now that nVidia's cards are a bit more efficient? Answer: it isn't.

GPUs were never this big before, so the answer is they didn't have to do much, because it's never been such an issue.
 

Do you realize that people throw around this magic low-consumption TDP and ignore the fact that the TDP number floating around is for a reference card that Nvidia never really released? Take the G1 970: it is, IIRC, a 220 W card. Under gaming loads its power consumption is roughly 14% lower than a 290X's.

In other words, you're out there in the real world arguing with another blown-up falsehood. And on the matter of power draw/consumption and power limits, no one actually wants a card that consumes less power, because the first thing people do with NV cards is flash them to get a higher power limit than stock. It's comical, the levels of irony here.


For example:

http://www.legionhardware.com/articles_pages/gigabyte_g1_gaming_gtx_970_gtx_980,9.html

And look at the temps the 290s reach. Yep, 95°C, i.e. heavy throttling is occurring. It's nice that they compare top-of-the-line custom cards against the bottom-of-the-barrel, cooler-wise, AMD cards. A nice and fair fight. And given that lower temps equate to lower consumption, it's not that great a feat.
 
The 970 & 980 they tested in that link were the overclocked Gigabyte G1 Gaming GTX 970 & 980 versions, which would make the power consumption higher ;)
 
Why are people talking about the cost of running a 290X over a 970? It has never been about the cost of the additional power consumption, ever. That's an excuse dreamed up by people to dismiss the power consumption argument.

It's about heat output; not running temperature either, but heat output. It doesn't (shouldn't) matter to most people; however, for some, because of space issues etc., it makes the difference between choosing card A or card B.

Lower power consumption is something everybody should want. Not because it costs less in juice to run, but because less consumed means more room for something bigger and faster. I.e., if nVidia hadn't achieved the efficiency improvements with Maxwell, how much further do you think they could have pushed Kepler?

Quite impressive deluded drivel you spouted there my man.

Quite impressive.
 
What's really funny is that my 290X, although well known for running hot under load, exhausts outside of the case, which has brought my internal case temps down and given me more overclocking headroom for my CPU on the same cooler.
 
Quite impressive deluded drivel you spouted there my man.

Quite impressive.

How is it deluded? :confused:

The cost difference between running a 290X and a 970 is naff all. Virtually nothing. [Not drivel]

That difference can, if you are running a small case or have limited cooling etc., be the decider between card A and card B for some people. [Not drivel]

For the rest of us, it doesn't make much of a difference at all. Like me, for example, when I made a poll here choosing between the 290X Lightning and the 970 Infinity Black. [Not drivel]


So, go on. You're a clever chap, or so you seem to think, so go ahead and explain the drivel to us.

Do you realize that people throw around this magic low-consumption TDP and ignore the fact that the TDP number floating around is for a reference card that Nvidia never really released? Take the G1 970: it is, IIRC, a 220 W card. Under gaming loads its power consumption is roughly 14% lower than a 290X's.

I'm not considering TDP because AMD and nVidia measure them differently. I'm only talking about the power consumption.

In other words, you're out there in the real world arguing with another blown-up falsehood. And on the matter of power draw/consumption and power limits, no one actually wants a card that consumes less power, because the first thing people do with NV cards is flash them to get a higher power limit than stock. It's comical, the levels of irony here.

My entire PC, at the wall, with a 970 running at 1500/2000, 125% power (as far as it will go), will pull around 290-300 W in games. Any game; some more than others, but I've not seen more than 300 yet.

AMDMatt's monster, with 4x 8GB 290Xs, pulls over 400 W a card. A third more than my entire PC at the wall, just for one card. That's an estimated figure, but given his PC was pulling 2,200 watts, it's somewhere around there when you work out the losses etc. This was on air too, I believe.

So yes, power consumption does go up when overclocking. But it's not like it only increases on nVidia cards - 290Xs are massively power hungry when overclocked. Again, do I care? Nope, but it's amazing how people are calling me a fanboy, green team supporter and all that other nonsense when I'm only pointing out the facts. What's up with that? Don't you lot like it or something?
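As a rough sanity check on the "over 400 W a card" estimate: working back from the wall figure with a guessed PSU efficiency and a guessed allowance for the rest of the system (both of those numbers are assumptions, not from the post) lands in the same ballpark:

```python
# Back-of-envelope: per-card draw from a wall reading for a 4x 290X rig.
# The wall figure is from the post; efficiency and system overhead are guesses.
wall_w = 2200.0            # measured at the wall
psu_efficiency = 0.90      # assumed ~90% efficient PSU
rest_of_system_w = 300.0   # assumed CPU/motherboard/drives share

dc_power_w = wall_w * psu_efficiency              # power actually delivered to components
per_card_w = (dc_power_w - rest_of_system_w) / 4  # split the remainder across four cards
# per_card_w ≈ 420 W, consistent with the "over 400 W a card" estimate
```

Different efficiency or overhead guesses shift the result by tens of watts, but any reasonable set of numbers puts each overclocked 290X well over 400 W.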

What's really funny is that my 290X, although well known for running hot under load, exhausts outside of the case, which has brought my internal case temps down and given me more overclocking headroom for my CPU on the same cooler.

Same way I shoehorned an 8800 GTX into an HTPC :)
 
I pay 8.9p (£0.089) per kWh.

There are tariffs at 8.3p and there are tariffs at 11p.

It all depends how well you search and what sort of deal you can get.
If you pay more than 11p, look to change your provider. They're robbing you.

So even at 11p per kWh:
£168 / £0.11 per kWh = approx. 1,530 kWh.

1,530 kWh / 0.09 kW (or 1,530,000 Wh / 90 Wh per hour, take it as you like) = 17,000 hours.

So you need to use the GTX 980 for 17,000 hours more than a 290X to break even
on the premium paid over the 290X.

17,000 hours, with an average of 50 hours' gaming a week at 100% load (constantly), equals 6.5 years.
Even I cannot manage 50 hours a week at 100% load, and I use my PC around 70 hours a week.
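The break-even arithmetic above can be sketched in a few lines; all the inputs (the £168 price premium, the 11p/kWh tariff, and the ~90 W consumption difference between the cards) are the post's assumed figures, not measurements:

```python
# Break-even sketch for the GTX 980's price premium over a 290X.
# All inputs are the post's assumed figures.
price_premium_gbp = 168.0        # extra cost of the GTX 980
tariff_gbp_per_kwh = 0.11        # worst-case tariff quoted above
power_delta_kw = 0.09            # assumed 90 W difference in draw under load

energy_kwh = price_premium_gbp / tariff_gbp_per_kwh   # ~1,530 kWh bought by the premium
hours = energy_kwh / power_delta_kw                   # ~17,000 hours of gaming to use it up
years = hours / (50 * 52)                             # at 50 h/week full load: ~6.5 years
```

Even with generous rounding, the premium only pays for itself after roughly six and a half years of constant full-load gaming, which is the point being made.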

You have not allowed for losses in your calculations, PF correction etc.
 
We need a poll for a thread title change,

From:

'970's having performance issues using 4GB Vram - Nvidia investigating'

To:

'970's having performance issues using 4GB Vram - Nvidia in denial'
 
James, did people ever post videos/clips of the problems the VRAM was causing in game? I stopped reading this thread a while ago when you were asking if people could. If they have, I will trawl back through!
 

The second one :D

I believe they should cut access (drivers?) to the 0.5GB of VRAM and let everyone run the card with 3.5GB. At least then there aren't going to be issues.

However, if I had one, I would be trying to return it as a faulty and mis-advertised product.
 
I'm sure I read that cutting access to the 500MB would make things worse. Dunno meself, all the techy talk goes over my head, but I feel the 970 is what it is and people need to get over it.
 

I think if nVidia had divulged the info to start with there would be no issue. It's the fact they kept it a secret, deceived everyone into buying their product and then just kept quiet when the truth came out that people can't get over easily.
 
Most people just want rid of it because of the negative vibe/stigma surrounding it now; their e-peen is only about half the size it was before, even though performance is the same.
 