
2x GTX 580 MSI Lightning Xtreme Edition 3GB

Associate · Joined 22 May 2011 · Posts: 47
I've already ordered one GTX 580 Lightning 3GB edition, and I'm planning to buy another one in the future because I'm still short on money. My question is: my PSU is a Corsair AX850, can it support these two monster cards? :confused:
 
You should be OK, but don't run Prime95 and FurMark together. AnandTech shows load consumption as 650W with Crysis.
 
I've measured around 725W peak from the wall when running Prime95 with 5 threads (low priority) and the Unigine Heaven benchmark concurrently, on an i7 980X and a pair of GTX 580s with an AX1200.

But those are only my ad-hoc readings. Use them as a reference only, at your own risk.
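To put that wall reading in context of the AX850's 850W rating: a wall-socket figure is AC draw, so the actual DC load on the PSU is lower. A rough sketch, assuming a typical 80 Plus Gold efficiency of about 88% at this load level (the exact figure varies with load and mains voltage):

```python
# Rough DC load estimate from a wall-socket reading.
# The 0.88 efficiency figure is an assumption, not a measurement.
def dc_load_watts(wall_watts: float, efficiency: float = 0.88) -> float:
    """Estimate the DC power the PSU is actually delivering."""
    return wall_watts * efficiency

peak_wall = 725.0            # measured at the wall (see above)
dc_load = dc_load_watts(peak_wall)
headroom = 850.0 - dc_load   # against an AX850's 850W rating
print(f"~{dc_load:.0f}W DC load, ~{headroom:.0f}W headroom on an AX850")
```

By this estimate the peak above corresponds to roughly 638W of DC load, which would still leave a couple of hundred watts of headroom on an 850W unit; synthetic stress tests like Prime95 plus FurMark can push well beyond gaming loads, though.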
 
For three 1920x1080 monitors, you could use GTX 560s in SLI and be fine.
A GTX 580 3GB Lightning on a SINGLE 1920x1080 monitor is just... like buying a 6990 for an HTPC.
 
thanks. :)

As of now I've got a Dell U2410 monitor, but I'll be getting a new one as soon as the 27" Samsung A750 or A950 120Hz releases. I'm still not sure if I'll get another 580, though; I'm going to wait for the monitor, see if this card runs all games smoothly, and then decide whether to go for another 580. :S
 
Best thing I think. Though as I said, 28nm is not that far off in the scale of things, so 'struggling' on with a single GTX580 until then is probably not a bad solution.
 
A single GTX 580 won't be able to handle notorious titles such as Metro 2033, Shogun 2, The Witcher 2, etc. If you stay away from these, then you don't need another GTX 580.
 
Why do people always have such trouble with Shogun 2? My 5870 runs it perfectly fine with everything set to Ultra (2xAA, though). The game's tendency to favour AMD/ATI cards aside, I would have thought a single GTX 580 would have charged through it at 1200p.
 
It's not "always". Apparently cards struggle if you're running ridiculous levels of AA and suchlike with a tri-SLI setup, and you might run out of VRAM because of it. For any normal person, setup or situation, it's fine.
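The VRAM point can be made concrete with some back-of-the-envelope arithmetic. A multisampled framebuffer stores colour and depth/stencil per sample, so cost scales with resolution and AA level. This is a rough illustration only; it ignores framebuffer compression, textures, geometry, and driver overhead:

```python
# Back-of-the-envelope MSAA framebuffer cost in MB.
# Assumes 4 bytes per colour sample and 4 bytes per depth/stencil
# sample, with no compression - a sketch, not a measurement.
def msaa_framebuffer_mb(width, height, samples, bytes_per_pixel=4):
    colour = width * height * bytes_per_pixel * samples
    depth = width * height * 4 * samples      # D24S8-style buffer
    return (colour + depth) / (1024 ** 2)

print(msaa_framebuffer_mb(1920, 1200, 2))   # modest 2xAA at 1200p: ~35 MB
print(msaa_framebuffer_mb(2560, 1600, 8))   # heavy 8xAA at 1600p: 250 MB
```

Even before textures, cranking resolution and AA together multiplies the framebuffer footprint several times over, which is why extreme tri-SLI settings can exhaust a card's VRAM while ordinary settings never get close.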
 
I've found The Witcher 2 to be very GPU-intensive! Can't believe how much it flogs my 5850 (flat out).

For the OP, I'd stick with your plan of waiting to see what the monitors are like, plus see if there's anything nearer release that takes your fancy.
 
Ah right, my bad. It's just that I've seen a tendency on this forum to add Shogun 2 to the Metro 2033 'get more VRAM' pile.
 
It's because if you don't have enough VRAM and you don't modify a script file, the game automatically reduces graphics settings.

http://forums.totalwar.com/showthread.php/17993-Are-you-REALLY-using-Ultra

By the way, this is an AMD-sponsored game, and The Creative Assembly did not betray AMD like Codemasters did with DiRT 3. Thus the GTX 580 is having big trouble with the game.
 
Betray?? Isn't it in their best interest to make sure their game works with all current graphics cards? Otherwise that's a bit of an underhanded tactic.
 
I don't think so. Marketing is important to companies; when you receive funding from nVidia, you're expected to de-optimise the game for AMD cards. Look at how HAWX/HAWX 2 worked. I still remember that on the same day AMD first fixed the CrossFireX flickering in Crysis 2, Crytek immediately released a patch that re-enabled the flickering on AMD cards.

What did AMD do recently? Look at Dragon Age 2: see how it hammered nVidia cards at launch, and how long it took nVidia to release a driver fix. Shogun 2 is another example; nVidia cards still struggle with it, including the GTX 580 in SLI.

I can't say I like competition done this way. But the way Codemasters double-crosses is questionable rather than honourable, and AMD should be more cautious about making deals with them in the future.
 