
Two graphics cards and .not. SLI or Crossfire...

Associate
Joined
13 Apr 2007
Posts
172
Location
Wolverhampton
Hi,

Recently I've been wondering: is it possible to have a dual graphics card set-up with a pair of totally unmatched cards?

I'm thinking of my situation here:
I have two x16 PCI-E slots on my motherboard (see sig.) which are designed for CrossFire. What would the configuration be like if I had, say, a 6600GT in one slot and an HD 2900XT in the other? Obviously they can't work together in any sort of SLI/CrossFire set-up, as they're totally different cards, but how awkward would the drivers be to set up?
I mean, would it even work?

If so, would there be any benefit to that in a dual-monitor setup? Like having the HD 2900XT running my main monitor for gaming, and the 6600GT running a widescreen monitor for high-resolution video/movies, instead of having both monitors running from the outputs of the HD 2900XT?

Has anyone even bothered trying this before?

Just curious,

Banjo ;_;
 
I would like this question answered too. I currently have my main monitor and an HDTV connected, and might stick in a further monitor; I have an X800 lying around. I suppose, with them both being ATI, there should be less of an issue, but I still don't know. I haven't found the time to test it just yet, so if someone could advise beforehand that would be cool.

I also have a question to add. With the TV on the main card at the moment, I have to swap the displays (Display Manager / CCC) to use the TV as the main one (if you understand what I'm saying). What I would like is to clone the desktop but have each screen retain its proper resolution. Currently that does not work, as the big resolution of my main monitor gets reduced to match that of the TV. Still with me?

Any advice would be great for me and the OP (sorry if I'm intruding)!

Thanks again
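On the clone question: a rough way to picture why the bigger screen drops down is that clone mode can only use a display mode both screens support, so the driver falls back to the largest common one. Here is a toy Python sketch of that idea only; the mode lists are made up, and this is not how CCC actually negotiates modes (real drivers read them from EDID):

```python
# Toy model of clone mode: the desktop can only be cloned at a mode
# that BOTH displays support, so it drops to the largest common one.
# Illustrative sketch only, not real driver behaviour.

def best_clone_mode(modes_a, modes_b):
    """Pick the largest resolution present in both displays' mode lists."""
    common = set(modes_a) & set(modes_b)
    if not common:
        return None
    return max(common, key=lambda m: m[0] * m[1])

# Hypothetical mode lists for a big monitor and an HDTV:
monitor_modes = [(1920, 1200), (1680, 1050), (1280, 1024), (1024, 768)]
tv_modes = [(1280, 720), (1024, 768), (800, 600)]

print(best_clone_mode(monitor_modes, tv_modes))  # -> (1024, 768)
```

Which matches what you're seeing: the only mode both screens share is well below the monitor's native resolution, so the clone runs at that.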
 
Banjo said:
Recently, I've been wondering, is it possible to have a dual graphics card set-up with a pair of totally unmatched cards?
[...]
Anyone even bothered trying this before?


Not 100% sure about this, but I know you can run an AGP and a PCI card on an ASRock motherboard at the same time; not sure which one, though.

Plus, you can also run an HD 2900 XT with any X1000-series card so that the 2900 just does the rendering (on-screen visuals) and the extra card does the physics calculations etc., so the 2900 has more speed to do just one job instead of two.

But that's not what you're asking, so I don't know really.


Also, one HD 2900 XT would be better for dual monitors anyway, rather than having a 6600 for another monitor drinking more electricity etc.
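Purely as an illustration of the idea described above (one card rendering while another handles physics in parallel, instead of one card doing both in sequence), here is a toy Python sketch. The function names and "cards" are made up, and this is not how ATI's driver actually worked; it just shows the split:

```python
# Toy sketch: per-frame work split across two "cards" running in
# parallel, rather than serially on the one rendering card.
# Illustrative only; not real GPU or driver code.
from concurrent.futures import ThreadPoolExecutor

def render_frame(scene):     # stand-in for the HD 2900 XT's job
    return f"rendered {scene}"

def step_physics(scene):     # stand-in for the X1000-series card's job
    return f"simulated {scene}"

def run_frame(scene):
    # With two cards, both jobs overlap instead of queuing
    # one after the other on the rendering card.
    with ThreadPoolExecutor(max_workers=2) as pool:
        rendered = pool.submit(render_frame, scene)
        physics = pool.submit(step_physics, scene)
        return rendered.result(), physics.result()

print(run_frame("frame_0"))  # -> ('rendered frame_0', 'simulated frame_0')
```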
 
Darg said:
Haha.. so Ageia cards truly are pointless now.. :p


Yup, well and truly. Ageia cards only work in Ageia games, whereas ATI physics works in all 3D applications involving physics that would normally run on the graphics card or CPU; it will put it on the second card :)
 
You certainly can run two dissimilar cards to support multiple monitors. The driver is bound to the card, so it shouldn't matter if you have, say, an ATi in one slot and a Matrox in the other. It should all be hunky-dory.
 
Hehe, thanks everyone, once I get a HD 2900XT (in the next week) I'll have to give it a try :D

IceShock said:
Plus allso you can run a HD 2900 XT + any x1000 series cards with each other so that the 2900 just does the rendering (on screen visuals) and the extra card would do the physicis calculations etc so it would mean the 2900 has more speed to just do one job instead of two.
That's intrigued me. When you say any X1000-series card, does the card doing the physics calculations have to be a high-performance card like an X1950 Pro, or could I get an X1050/X1550 to do the job quite well too?
Just wondering, because if I'd get a decent return from a £25 X1050 then I might well do that.
I'm guessing I'd get better physics performance with something like an X1950 Pro, though?

Also, Dan2kx, I don't think it's possible to clone a display across a monitor and a TV with different resolutions. I'm not certain, but I know it's not possible with nVidia's drivers :confused:

Thanks for all the replies so far,

Banjo ;_;
 
While it is nice to say the secondary card will perform the physics, NO games support it, and there aren't any planned; whereas Unreal Engine 3 supports Ageia PhysX out of the box.

The second card will only ever be used for your extra monitors :)
 
Boogle said:
While it is nice to say the secondary card will perform the physics - NO games support it, and there aren't any planned. While Unreal Engine 3 supports the Ageia PhysX out of the box.

Second card will only ever be used for your extra monitors :)

I'm pretty sure all Havok-based games take advantage of this setup.
 
SS-89 said:
Im pretty sure all havok based games take advantage of this setup.

Unfortunately not. There is not a single game that works with the secondary card running as a 'physics card'. If it could just be toggled on, there would be an option in the drivers.

'Tis why there are benchmarks and tests showing improvements with PhysX, but not a lick of info on the new physics features supposedly supported by NV/ATI.
 
Fair enough, but PhysX is not a worthwhile investment IMO unless it falls to about £50-£60; then I may consider getting one, but I'd still be a bit reluctant.
 
Xfire? I'm guessing you mean CrossFire? Well then no, because that is ATI tech and you're running an nVidia chipset, which uses SLI. It might be possible with hacked drivers, but performance won't be the same.
 