MSI Big Bang *Official* Motherboard thread (Lucid Hydra 200)

MSI Rep
Vendor Rep
Joined
7 Oct 2008
Posts
107
Ok, well, news seems to be popping up in places, so I thought it best to start a thread about what will be one of the most important gaming boards for some time.

What is Lucid Hydra 200?

It's a chip created by LucidLogix which allows the harmonious use of ATI and NVIDIA technologies in a single PC. Yes, SLI and Crossfire together. CrossSLI or SLIfire :D

Is there more?

The partnership between MSI and Lucid also allows mismatched-speed cards to be used together. I.e. have an old 9800 GT in your cupboard from when you got your GTX 285? Slap it in and get the extra juice. Not only that, if it's a Radeon 9800 Pro (ATI), that would work too!

What is MSI Big Bang?


This is the first motherboard to use this technology, allowing gamers everywhere more flexibility in the hardware setups they use in their systems.

This can't be true! Show me images, I need proof!

Taken from the AnandTech article:

lucidchip.jpg


hydrasetup.jpg


nvati.jpg


nvati2.jpg
 
Soldato
Joined
19 Dec 2003
Posts
7,050
Location
Grimsby, UK
Not so bad if you've got cards gathering dust from previous setups maybe, but I always like to have the best single-GPU card available to me.

I don't have any problems in any game running at 1920x1200, so I don't really see the point in this from a personal view.

Will be interesting to see how much juice in terms of watts this will use.
 
MSI Rep
Vendor Rep
OP
Joined
7 Oct 2008
Posts
107
Not so bad if you've got cards gathering dust from previous setups maybe, but I always like to have the best single-GPU card available to me.

This is more aimed for those that would usually run a SLI or crossfire rig.

I don't have any problems in any game running at 1920x1200, so I don't really see the point in this from a personal view.

Agreed, but it's a similar argument to asking what the point of SLI is (which is a fair point); this is aimed at the customer who would benefit from the extra flexibility.

Will be interesting to see how much juice in terms of watts this will use.

No reason it should use any more than a typical dual-card setup. The Lucid chip itself will likely draw under 10W, which is negligible with a couple of cards in there.
 
Associate
Joined
8 Nov 2007
Posts
423
Location
London and Florence, Italy
If this works, it's totally going to blow everything open.

You have a GTX 295 and need a bit more oomph? Get a 5870 as well! Then when the next card comes out, add that too (running in 1x16, 2x8) or replace one at random!

It'll open the floodgates to true multi-card graphics setups.

Of course, I'm really sceptical until I see it actually working!
 
Associate
Joined
16 Jan 2005
Posts
641
Location
Laaaandan
The cards that you plug in will have to support the same spec - I'll bet money that you won't be able to use one card which supports a feature and another card which lacks it!

This is also gonna play havoc with games which detect the type of card and then use a certain render path (esp. TWIMTBP games)!
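To illustrate the point, if that's right, a mixed setup could only safely expose what both cards support - the intersection of their capabilities. A quick sketch (these feature labels and cards' capability sets are made up for illustration, not real driver caps):

```python
# Illustrative only: made-up capability labels, not real driver feature flags.
gtx_280 = {"dx10", "sm4.0", "8xaa"}
hd_4870 = {"dx10.1", "dx10", "sm4.1", "8xaa"}

# A mixed pairing could presumably only use the lowest common denominator:
common = gtx_280 & hd_4870
print(sorted(common))
```

Which would leave the game seeing neither card's exclusive features - exactly the render-path detection problem above.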
 
Man of Honour
Joined
13 Oct 2006
Posts
83,168
Using it with 2 very different cards is going to have odd results...

i.e. the example of a 9800 GT and an ATI 9800 Pro above... cards from different generations are going to have very different feature sets... and even between two cards of the same generation, one ATI and one nVidia, it's going to have some pretty odd results, as they have fairly different colour balance/brightness and filtering, etc., so you'd end up with a rather odd-looking composited scene with certain objects looking quite out of place...

Now if I could throw a GTX 295 in and use it in conjunction with one of my existing GTX 260s for some three-way goodness, that could be half useful... or even get a board with 3x PCI-E x16 slots and use all four cores with better scaling than quad SLI :D

EDIT: Wonder if there will be a board with 4x slots hehe... 4x GTX 295, anyone?
 
Associate
Joined
21 Sep 2003
Posts
1,269
Location
Leicester
I just found out about this - WOW is all I can say.

If it really works, then this is going to be huge. I've never even considered SLI or Crossfire, but this will sway me to go multi-GPU - IF it works, that is.

The thing that's unique about this, and which I'm sceptical about, is that it doesn't share alternate frames between GPUs, or have half the screen rendered by one card and the other half by the other; from what they say, it will determine which GPU should render what, down to the polygon level. In theory this should allow a newer card to provide its improved texture filtering and AA, for example, whilst the other card, which doesn't support these things, handles some of the basic on-screen models.

But I'm worried about input lag; potentially this could add loads on.

Guess we'll wait and see, hope it's as good as they say though! SLI/Crossfire are both huge disappointments in my eyes.
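A rough sketch of how that kind of per-task balancing might work (the task names, costs, and GPU speed numbers here are all made up for illustration - this is a greedy toy model, not Lucid's actual algorithm):

```python
# Toy model of splitting rendering tasks between two dissimilar GPUs.
# Costs and relative speeds are invented numbers, purely illustrative.

def split_tasks(tasks, gpu_speeds):
    """Greedily assign each task to whichever GPU finishes it soonest,
    given that GPU's current accumulated load and relative speed."""
    loads = {gpu: 0.0 for gpu in gpu_speeds}
    assignment = {gpu: [] for gpu in gpu_speeds}
    for name, cost in sorted(tasks, key=lambda t: -t[1]):  # biggest first
        best = min(loads, key=lambda g: loads[g] + cost / gpu_speeds[g])
        loads[best] += cost / gpu_speeds[best]
        assignment[best].append(name)
    return assignment, loads

tasks = [("terrain", 8.0), ("characters", 5.0), ("shadows", 4.0),
         ("particles", 2.0), ("hud", 0.5)]
# A fast new card paired with an older, roughly half-speed one.
gpus = {"GTX 285": 1.0, "9800 GT": 0.5}
assignment, loads = split_tasks(tasks, gpus)
```

The slower card still ends up carrying some of the lighter work instead of sitting idle, which is the flexibility Lucid is promising - whether their real scheduler balances this well is exactly what we're waiting to see.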
 

AMG

Soldato
Joined
18 Aug 2008
Posts
4,700
Location
lincs, spalding
:p, too bad nVidia kills PhysX and CUDA if an ATI card is about :(


it is a good idea, but it has come about late, unless nVidia unlock it (hint hint ;))
 
Associate
Joined
25 Feb 2007
Posts
2,058
Location
Bedfordshire
I saw this a while back and it looked impressive then. Scaling then was reported to be 100%, as Jokester said. I will have to wait for the reviews and benchmarks, but this could be massive. 4x 5870 X2s, anyone :p

EDIT: Hmm, strange - why do they have both of the cards plugged into the monitor?
 
Man of Honour
Joined
13 Oct 2006
Posts
83,168
EDIT: Hmm, strange - why do they have both of the cards plugged into the monitor?

Probably because there's a bug in Vista, and possibly Windows 7, that requires any card used for rendering or a similar workload to be plugged into something... anything... I believe there are registry workarounds to fix this though.
 
Associate
Joined
11 Aug 2007
Posts
420
Location
Manchester
Why does nVidia kill off CUDA and PhysX when the system detects a competitor's video card installed?? That's utter ******** from nVidia. :(
 

AMG

Soldato
Joined
18 Aug 2008
Posts
4,700
Location
lincs, spalding
Why does nVidia kill off CUDA and PhysX when the system detects a competitor's video card installed?? That's utter ******** from nVidia. :(

tell me about it - use old drivers?


and I don't think it's possible to fit 4 X2s ;p
 