
GTX 595 at CES

I don't know of any rumours that a dual card will ever have 580 chips in it, and if it did, it would be very impractical. Sticking two perfectly good full-yield cores in a card like that, where they would have to be cut down or downclocked severely, is nonsensical; they don't even have good enough yields to keep ordinary cards in stock.

No hope in hell it would come in under the price of a 580 if it was effectively 580s in SLI...
Two 570s priced under a 580 would make a massive loss as well, and without it being at least two 570s it will not compete with the 6990, which is supposedly based on 6970s.

Sure, people will buy it, but Nvidia would be mental to make it as the rumours suggest.
AMD have a dual card in their business model and their single-chip cards are designed with it in mind; Nvidia's would be a bodge.
 
I can't see a dual 6970-based card being much more viable than a dual 580 or dual 570 - they are only ~18% less power hungry than the 580s and less than 2% cooler running. Sure, with two cards that's quite a saving, but they are still well above the envelope for a dual card - at best it will be a hugely underclocked pair of cores.
 
LOL

2% cooler means nothing; it's about the heat generated, not the running temp.

2% lower on die temp, but producing 50% more heat?
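
To put numbers on that (a toy steady-state model with made-up cooler figures, not data from any review): die temperature depends on dissipated power and on the cooler, so a card can run ~2% cooler while dumping 50% more heat into your case.

```python
# Toy thermal model: T_die = T_ambient + P * R_th.
# R_th (degC per watt) and both power figures are made-up assumptions.

def die_temp(power_w: float, r_th: float, t_ambient: float = 25.0) -> float:
    """Steady-state die temperature for a given heat output and
    cooler thermal resistance."""
    return t_ambient + power_w * r_th

card_a = die_temp(power_w=200, r_th=0.30)   # 85.0 C while dumping 200 W
card_b = die_temp(power_w=300, r_th=0.193)  # 82.9 C while dumping 300 W
```

Card B reads ~2% cooler on the die yet puts 50% more heat into the case, which is the poster's point: running temp tells you about the cooler, not the power draw.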
 
I can't see a dual 6970-based card being much more viable than a dual 580 or dual 570...

I think you are confusing something being possible with something being viable.
 
Given the respective TDPs it's not going to be hugely different.


Not hugely, in that it would be 200-400 watts above the ATI card :p

The GTX580 GX2 will have to be uber fast to compete. But why are we even talking about it? Nvidia can't even manufacture GTX580s, never mind a GX2 version. It's all pie in the sky again; even if they do release the GTX580 GX2, you will struggle to do anything more than pre-order it for 12 months.
 
200-400 watts? More trolling spam...

Now, considering the power consumption chart below uses Furmark to get its results, I think it's safe to assume that if AMD can do a 6990 with 6970/6950 cores, then Nvidia can do a dual GPU with 580/570 cores. It's that simple.

 
^^ Yeah, not sure where he's getting his figures from, as the real in-game power draw for the 6970 is around 200-210W at stock and the GTX580's 225-240W, depending on which source you believe - which would put the difference, assuming 2x full-fat cores, well under 100W.
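
A quick sanity check on that arithmetic, using only the (disputed) figures quoted in this thread:

```python
# Per-card gaming power draw as quoted above (contested estimates,
# not official specs): 6970 ~200-210 W, GTX 580 ~225-240 W at stock.
gap_low = 225 - 210    # best case for the 580: 15 W per card
gap_high = 240 - 200   # worst case for the 580: 40 W per card

# Doubled for a hypothetical dual-GPU card using two full cores each:
print(f"dual-card gap: {2 * gap_low}-{2 * gap_high} W")  # 30-80 W
```

Either way the dual-card difference lands well under 100W, nowhere near the 200-400W claimed earlier.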

I don't see the 6990 out yet either, so I'm not sure what high horse he's riding.

I'm not sure if nVidia can win on performance within the thermal/electrical envelope for a dual-GPU high-end card, but they seem to be concentrating quite a bit of effort on SLI performance in the next couple of drivers, so I'm assuming they are gonna try.
 
Are these consumption figures peak or typical?
The 6970 is rated at typically 190W and the 580 at 244W, I believe.
The minimum PSU recommendation for the 6970 is 550W; for the 580 it's 600W.


Edit: the AMD site says 550W for the 6970 and 500W for the 6950.
 
Considering Anandtech (who are in themselves completely stupid, http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/24) show a 49W difference in Crysis and a 130W difference in Furmark with the 580GTX's Furmark limiter left in place.

Basically at "default settings" for both cards, IE no power applications changed, the 6970 will not use more than circa 200W, just below mostly, the 580gtx will use up to 300W.

The ONLY time they come close is when some website uses an uber-high-load benchmark or Furmark, removes the limiter ONLY on the AMD card, and then compares it to a 580GTX - and even then the 580GTX comes out using more power.

In realistic situations the 6970 uses quite a lot less power, and it has more memory on board, which makes another 5-10W of difference as well.


As for the cards: why do people consistently, across loads of forums, suggest that full cores won't get used in these cards because it's a waste?

For any given performance figure X - whatever performance target they want - more cores at a lower clock and voltage will always, always use less power.

Why did the 295GTX use 240-shader cores and cut down other, more power-hungry aspects of the card like the bus? Because a 216-shader core would need increased clocks and increased stock voltage to hit the same performance level, which would have used more power.

Almost every dual-GPU card ever made uses ALL the cores it can at as low a clock as it can, because that's the single best way - the ONLY option - for the lowest possible power usage.

You either have, as an example with a 580GTX, 2x512 cores at 1.0V and 550MHz, or 2x480 cores at 1.05V and 605MHz; both will offer the same performance, but the former option will use less power.
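
As a rough illustration of why (a toy model using the common first-order approximation that dynamic power scales with cores × clock × voltage², and assuming performance scales with cores × clock; the numbers are the ones from the example above):

```python
# First-order dynamic power model: P ~ cores * f * V^2.
# Real GPUs also have static leakage, so treat this strictly as a sketch.

def relative_power(cores: int, mhz: float, volts: float) -> float:
    return cores * mhz * volts ** 2

def relative_perf(cores: int, mhz: float) -> float:
    # Assume throughput scales with cores * clock (perfect scaling).
    return cores * mhz

full_fat = dict(cores=512, mhz=550.0, volts=1.00)  # downclocked full cores
cut_down = dict(cores=480, mhz=605.0, volts=1.05)  # cut cores, clocked up

print(relative_perf(512, 550.0), relative_perf(480, 605.0))      # roughly equal
print(relative_power(**full_fat) / relative_power(**cut_down))   # ~0.88
```

By this crude model the full-fat configuration delivers about the same throughput for roughly 12% less power, which is the whole argument for using all the shaders at lower clocks.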

Any dual Nvidia card, if it comes, is almost certain to be a 2x512SP GF110-based card or a 2x384SP GF114-based card. Likewise, Antilles will be either 2x1536 Cayman shaders or 2x1120 Barts shaders (2x 6870s).

Rroff, it's worth noting that the only people who claimed there was a dual GF110 card coming insisted BEYOND a doubt that it would be out BEFORE Xmas, and they insisted this for two months straight - until there was a week left, when all of a sudden they started insisting it was delayed for no apparent reason, despite it supposedly being completely ready.


As for 2x 580GTXs using at most 100W more: aside from not a single review I've seen agreeing with that, you're missing the problem. If the 6990 uses 299W, then 100W more would put the Nvidia duallie at 399W - something they wouldn't make. Silly low-volume third-party special editions, sure; not many people really care about the 300W limit, except those bound by the standard and those who want to sell a card in high volume. I.e. Nvidia, AMD, Dell, HP, etc. care about the 300W limit, and no one else gives a monkey's.

EVGA have long been a "release something ridiculous we can't release ourselves or we'd look mental" backdoor product-launch outfit for Nvidia, but even so, the leaked EVGA shots still look like GF114 cards.


Anyway, who cares, really? AMD's dual high-end cards offer good value (in general), but who can't build a system that just takes two cards, with better cooling and overclocking to boot? There's little compelling reason to buy a dual-GPU single-PCB card at all.
 
Are these consumption figures peak or typical?
The 6970 is rated at typically 190W and the 580 at 244W, I believe. ...

Whichever figures you go for - and yours are about as correct as mine, give or take, for actual usage - the difference doesn't come even close to 200W, let alone 400W, for a multi-GPU setup.
 
Considering Anandtech show a 49W difference in Crysis and a 130W difference in Furmark ... In realistic situations the 6970 uses quite a lot less power ...

Furmark has little bearing on viability for gaming - both companies will simply put a limiter on to keep it in check - actual gaming and even intensive benchmarks are a far different story.
 
It'll be interesting to see what Nvidia can do with a dual-GPU Fermi. Honestly though, I can't see anything they produce being able to match Antilles.

This dual-GPU round is all about power efficiency. Sure, you can push past the 300W card power limit, but not too far. Neither side can afford to run a pair of their top-end GPUs at full capacity, so it comes down to the compromises available. Here AMD has the advantage, mainly thanks to its power management system:

AMD's power containment system lets them set the maximum power draw they are comfortable shipping the card with, then lets the GPUs run loose anywhere below it. The clockspeed is adjusted gradually, so GPU load is smoothly reduced or increased. The power containment system on the GTX580 is far less sophisticated and seems to kick in only for certain programs (i.e. Furmark). When the power draw limit is reached, it periodically drops the clockspeed to a lower 'safe' level and then brings it back up again. If this same mechanism kicked in during gameplay it could lead to stuttering and other unpleasantness.
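
A minimal sketch of the difference (Python pseudocode; the limit, clocks, and step size are illustrative assumptions based on the description above, not vendor specifications):

```python
# Two throttling strategies, per the description above. All numbers
# are illustrative assumptions, not vendor specifications.

LIMIT_W = 300.0

def smooth_throttle(power_w: float, clock_mhz: float) -> float:
    """AMD-style: nudge the clock proportionally each tick, so the
    draw converges on the cap without visible steps."""
    error = (power_w - LIMIT_W) / LIMIT_W
    return clock_mhz * (1.0 - 0.1 * error)  # small proportional correction

def step_throttle(power_w: float, clock_mhz: float) -> float:
    """GTX580-style, as described: on a breach, drop straight to a
    fixed 'safe' clock, then restore the full clock once below the cap."""
    return 500.0 if power_w > LIMIT_W else 772.0
```

The step version bounces between two clocks around the limit, which is exactly the kind of behaviour that could cause stutter if it ever triggered mid-game.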

Now, I'm sure Nvidia are working on a more effective power-limiting system, and maybe they will have it in place for the GTX595. If so, I think it's entirely plausible that we could see a dual-GTX570 setup, which could compete well with the 6990. If not, then I'm sure dual GF114s will be the only option; these should outperform the GTX580 by a fair margin, but wouldn't be able to match the 6990.


Power management aside, AMD still has a slight advantage in the raw performance-per-watt stakes (the 6970 is perhaps a little faster than the 570 overall, and uses a little less power). CrossFire scaling in most titles is also now a little better than SLI. This gives AMD a bit of headroom in case Nvidia has a sophisticated power management system of their own hidden up their sleeve.
 