
GTX 490 dual card might be coming

375W, lol. The 470gtx uses 250-270W by REAL wattage measurements, and 215-225W in their make-believe world of, well, rubbish.

Which puts two full 470gtx's at, by their own ratings, a 430W card; with a few savings on power circuitry and a couple of tweaks, maybe 400-410W. But those are THEIR ratings; by real measurements it's a 500W card that they might be able to get down to 450W.

While their own make-believe power ratings are only a number, the real power a card like that would use is beyond nuts, well beyond the current hard-to-keep-quiet 480gtx.

The problem is, if it's massively underclocked to drop to, what, 150W real, what's the point? Why not just use cheaper cut-down GPUs and keep a faster 470gtx on the market?


Seeing as that site, which I've never seen linked to any "new" news ever before, has come up with two supposed HUGE scoops on the same day... I call BS.
 
Real power consumption of a GTX470 flat out (which never happens in games) is ~238W, which in theory would put 2x 470 in GX2 config at ~100W over their figure. However, bearing in mind that (A) in normal gaming you don't come even close to this figure, (B) in most cases SLI scaling isn't 100%, so you're gaining some there, and (C) there are some power savings to be made in a GX2-style setup, the actual power consumption under average gaming would be closer to 330W... though obviously that's not a figure you can use for your hardware rating.
 
The card will take as much as it wants from them, as long as the cable/motherboard doesn't melt or catch fire and the PSU doesn't trip on overcurrent.
 
Those are some incredibly odd numbers there, Rroff. The toughest-game numbers are roughly what we're seeing as their "rated" wattage, and that really is well over 200W. 225W isn't an unfair rating for actual gaming load, except it's rather bad form to randomly change how you rate power draw just because the real numbers you've used for decades paint a bad picture.

It uses quite a bit more than 238W maxed out, and two would only use 330W? That's incredible wishful thinking there.

The cards, if both used to full power, would use 500W+. Given that an SLI/Crossfire setup rarely actually doubles power, you'll save 50W at best, and there MIGHT be another 50W saved from real maximum draw vs gaming load.

However, people WILL run Furmark on the card and WILL load it to its real limit, so what you CALL the power draw doesn't matter; it can and will hit full load at some point, and if every card dies from overheating or massively throttles at stock in Furmark, in reviews, they'll sell smeg all.

A 5870 uses 180W, but its gaming draw is closer to 150W. It doesn't matter if it's called a 180W, a 150W, or a 7W card; it still has to be able to supply, and cool, the REAL draw no matter what it's actually called.

The UK [H] puts the 470gtx under Furmark load at 77W above their 5870 at load; if the 5870 was using its full 180W, that puts the 470gtx at 257W load, and those are the numbers I've seen around the place. Though for some reason Anand put the 470gtx MUCH closer to the 5870 under load, a 35W difference; most websites have it higher, and most agree the 480gtx is a good 100-120W above.

I would guess either Anand got their 5870 numbers a touch wrong, or they got a 470gtx that uses less than average voltage. Not sure if, in binning, they managed to get a few lower-voltage 470gtx's out there; it would be less surprising if they cherry-picked a few samples for reviews, to be honest.

Anand, however, only puts a 30W or so difference between the 470GTX at full load in Crysis and full load in Furmark (largely because it's easy to max out the Nvidia architecture, as it's "simple", while AMD's is harder to max out in games and easy as hell in Furmark).

Anand's review also showed why the 4870x2 was hilarious: it used about 10W more than a single 285GTX, while 285GTX SLI pulls 710W. No idea why people were surprised how fast that was; it used 60% more power than a 4870x2.

As for power, I've been saying for ages that you can go WELL over the 300W spec; it means nothing, and it's not really even a safety issue (until you go well higher), as the cables can carry far more than they're rated for. What's recommended and what it can do are very different things.

I said ages ago there's entirely NOTHING stopping them making a triple-slot, three-quiet-fan, huge-heatsink, full 480gtx-based dual-GPU card. It would use a ridiculous amount of juice and be blindingly fast... just, what's the point? Even the die-hardest of die-hard Nvidia fans wouldn't want it, most wouldn't be able to afford it, most AIBs wouldn't want to make it, and NO OEM would stick it in a prebuilt computer, which makes it a niche product. There's nothing stopping them making a few, having them reviewed, and putting a dozen on sale worldwide; no one's interested in it though, so who cares.
 
Furmark:

[image: 81648177.jpg]

3DMark06... lol, hardly demanding by today's standards:

[image: 215qw.jpg]
 
My own testing put GTX470 power usage at a peak of ~238W at 100% GPU usage under the most stressful conditions imaginable... in your average game it actually drops quite a bit below that. I can't say that's 100% accurate, as I used a low-power GPU to get the base system wattage and guessed at the power consumption of that low-power GPU, though it's right to 1-2W.

I haven't been able to test SLI power consumption, but based on the figures from my 200-series SLI setup, that's an approximate ballpark.



EDIT: As you can see from the images above, 275->295 does not increase the wattage as substantially as you might imagine in actual real usage.
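The baseline-subtraction method described above can be sketched roughly like this. All names and readings here are my own illustrative assumptions (including the 85% PSU efficiency); the post doesn't state its exact figures:

```python
# Estimate a graphics card's power draw from at-the-wall measurements:
# measure total system draw with a known low-power reference GPU to get
# a baseline, then swap in the card under test. Illustrative sketch only.

def card_power(wall_watts: float, baseline_wall_watts: float,
               baseline_gpu_watts: float, psu_efficiency: float = 0.85) -> float:
    """Approximate DC power drawn by the card under test.

    wall_watts:          wall reading with the card under test, fully loaded
    baseline_wall_watts: wall reading with the low-power reference GPU
    baseline_gpu_watts:  (estimated) draw of the reference GPU itself
    psu_efficiency:      converts AC wall watts to DC watts at the components
    """
    # DC draw of the rest of the system, with the reference card's share removed
    base_system = baseline_wall_watts * psu_efficiency - baseline_gpu_watts
    return wall_watts * psu_efficiency - base_system

# Made-up example readings, not measured data:
print(round(card_power(wall_watts=420, baseline_wall_watts=160,
                       baseline_gpu_watts=20)))
```

The weak point, as the post admits, is that the reference GPU's own draw is itself an estimate, so the result inherits that uncertainty.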
 

I would buy an Nvidia DX11 dual card.
 
TBH, anyone who buys a multi-GPU setup and worries too much about power consumption has to be nuts.

No, but it's a question of power vs reliability, heat output, noise, and how much they charge for it, all wrapped up into one equation. What they rate the power usage at is rather irrelevant; what it actually uses is what matters.

The problem is: at what point does a dual-GPU card stop being easier than SLI, and start becoming a huge problem with severely less speed and a cost that makes it pointless?

That's the problem: a full dual 480gtx that ran at 70% fan minimum, drew 500W and cost £1000 would interest no one; just get two 480gtx's and run SLI.

Would a 5970, slower than 2x 5870s, be high on many people's must-buy list if it cost £700 and was painfully loud? No; it loses its point when it becomes more expensive and more hassle than two separate cards.

I said it very clearly: there's nothing stopping them making that card, and nothing physically preventing it from working, but you'd be nuts to run a dual 480gtx that's louder, overclocks worse, costs more, and has a very good chance of being constantly throttled, over two separate 480gtx's.

As for power, the AMD dual cards scale pretty high in power usage, and http://www.*****.net/content/item.php?item=24024&page=13 shows a 140W difference between a 275gtx and a 295gtx.

3DMark is also pretty heavily CPU-limited now, so I wouldn't take 3DMark06 numbers as maxing out a 295gtx, considering Furmark shows a larger difference by some margin. In those same tests it shows a 470GTX to be 67W higher at load in Furmark than a 5870, a 180W card.

You've also not mentioned that a 295GTX is NOT 2x 275gtx's, but two downclocked and downspecced 275gtx's. Even if they were the same spec, that would still be a 50% power increase; they AREN'T the same spec but at least 10% lower clocked, so each core is going to use over 10% less power, which makes the real scaling closer to a 70% power increase.

A 232W 470GTX (lowest rating I've seen in any review) x 1.7 = ~400W. Even at the ludicrously inaccurate and low scaling of 50% more, it's still a 350W card absolute minimum, but you're pulling 330W out of absolutely nowhere.
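The arithmetic above can be sketched like so. The 232W single-card figure and the 1.7x/1.5x scaling factors are the post's own numbers, not measurements of mine:

```python
# Rough dual-card power estimate from single-card draw.
single_card_w = 232  # lowest GTX 470 load figure cited in reviews

def dual_card_estimate(single_w: float, scaling: float) -> float:
    """Dual-GPU board power as single-card draw times a scaling factor
    (< 2.0, because SLI rarely doubles total draw and GX2-style cards
    run downclocked, lower-voltage cores)."""
    return single_w * scaling

print(round(dual_card_estimate(single_card_w, 1.7)))  # ~394 W, i.e. a "400 W card"
print(round(dual_card_estimate(single_card_w, 1.5)))  # ~348 W, the absolute floor
```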

Sure, a 2x 470gtx card will use just 10W more than a 480gtx? No really, that makes perfect sense.
 
http://www.hardocp.com/article/2010/03/26/nvidia_fermi_gtx_470_480_sli_review/7

http://www.hardocp.com/article/2010/05/10/galaxy_geforce_gtx_470_gc_sli_review/8

Two very important pieces of information: see how in Furmark a second 480gtx adds 100W of power... yet in the second link a second 470gtx increases system load some 200-250W MORE than a single card, IN A GAME. In fact in Metro 2033, where SLI scales brilliantly.

So what can we draw from that? At least in one version of Furmark (it would appear to be the newer ones), 4xx-series cards are without question not using the second card effectively AT ALL.

So you can quite literally throw out Furmark results in SLI for power; it's simply not working, and every single result on the web suggests this.

As for SLI, the 470gtx shows a 200W bump in SLI at least, and even a minor overclock on the Galaxy brings an insane power increase (put some of that down to the fan, though).

SLI/dual cards scale very high in terms of power draw; Furmark and a CPU-limited benchmark that's four years old are not at all good indicators of this. [H] are rather stupid for suggesting the 480gtx isn't a 300W card just because a second card doesn't add more than 100W; in fact that's absurd, especially as a single 480gtx is using over 100W more than a single 5870, a 180W card.

Nvidia's numbers are bull. Furmark is crap, always has been crap, always will be crap; it's pointless, proves nothing, shouldn't ever be used, and it CERTAINLY isn't working in SLI mode.

The big problem is that Furmark produces an abnormal load; very little that AMD cards do will produce similar loads. But advertising your card as a GPGPU that can do PhysX and lots of other heavy workloads, which WILL use more than normal gaming power, is a problem, because the cards clearly use quite a bit more than Nvidia rates them at when doing the things they're designed to be used for.

If you go back through any of my threads, btw, I've never been a fan of Furmark and bash it every chance I get, so this isn't a turnaround for me, but vindication. The biggest issue I have is that you can be 100% game-stable in every game ever made at one overclock, yet it can crash Furmark, and loads of people won't run an overclock that's unstable in Furmark. It's an unrealistic app that now also clearly doesn't reflect SLI power/performance/stability very well. I dare say overclock stability in SLI Furmark proves nothing if it's not using the other card properly.

EDIT: More Furmark, simply madness: http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/19

Anandtech, 480 single and SLI in Crysis, system load: 421 vs 668, a 247W increase (and no, that isn't perfect scaling; SLI doesn't use 100% more power either)... The same comparison in Furmark from Anand: 479 vs 851. So it's gone from a 247W gap to a 372W gap, and it's not as if only the single-card draw itself went up by ~60W; the second card added another 372W after that.
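As a sanity check, the deltas in that paragraph work out as follows (the four wattages are the system-load figures quoted above):

```python
# Difference in total system draw when adding a second GTX 480,
# using the Anandtech system-load figures quoted above.
crysis_single, crysis_sli = 421, 668
furmark_single, furmark_sli = 479, 851

crysis_delta = crysis_sli - crysis_single        # watts the 2nd card adds in-game
furmark_delta = furmark_sli - furmark_single     # watts the 2nd card adds in Furmark
single_card_shift = furmark_single - crysis_single  # single card's Crysis -> Furmark rise

print(crysis_delta, furmark_delta, single_card_shift)  # 247 372 58
```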

The only obvious conclusion to draw is that Furmark is nowhere near the be-all and end-all of power draw comparisons; it shows MASSIVELY different results on different systems. Anand stupidly didn't say what version they used, so you can't tell if it's a dodgy version, but there's no consistency AT ALL.

I mean, Anand show a 372W difference and [H] show a 100W difference... super-binned 480gtx's? I don't think so, especially as their 470gtx in-game power load difference was massive.
 

I hear some people are getting micro-stutter on ATI cards... is this true or not? I was considering buying an ATI card, but some people say they're great and some say they would never ever buy ATI again. What's the real deal on this?
 
AFAIK two-card Crossfire doesn't have noticeable micro-stutter, but there are some issues with three- and four-card configurations.
 