I see how you like taking potshots by dragging a different topic into this thread. Well, I'll play along this once:
That was it. Before long you were accusing me of telling someone that he could use his 3Dvision kit with Tridef.
COAB you couldn't make this up.
Which you actually did say in so many posts, and we tried to explain to you that you can't -- which was crucial to the argument, because Harpss1ngh had a 3D Vision kit + monitor, which is why AMD was not an option for him. You were failing to realise that 3D Vision® is not just stereo 3D. It's a registered trademark of NVIDIA, and it's the name used to describe NVIDIA's stereo 3D tech. The generic name is Stereo3D, not 3D Vision®.
You also made statements like this that are patently false:
Tridef? probably, but you would be a bit silly because if you have Nvidia stuff then you probably have an Nvidia card.
All Nvidia's kit does is give you a transmitter, glasses, and driver that would work in the exact same way as Tridef.
3D is 3D. Nvidia's method is identical to AMD's, only AMD don't sell a kit.
http://www.tridef.com/home.html
Basically that and a monitor like the one I found would work on any GPU. Even something like a 3Dlabs or Tesla ETC.
To which I posted responses like this, which are correct:
No it's not. There are a number of competing and incompatible standards. The 3D glasses for my TV don't work with the 3D glasses for my monitor (which is 3D VISION -- it's a brand name), and neither will work with Samsung's. In fact my LG 3D plasma's glasses won't work with the previous-gen LG glasses/display -- and none of these work with ANY passive glasses and screens. NVIDIA 3D Vision 1.0 and 2.0, however, do interoperate, but 3D Vision 2 is better.
And all this, even after you had admitted to your ignorance of the topic at hand:
You can 3D with AMD cards. I'm not exactly sure how you do it as they're not very clear, but yes, you can 3D with AMD cards.
Anyway, there were only two possibilities:
1) You realised that 3D Vision does not work with AMD but were still telling him to get an AMD card -- which is VERY BAD advice (as it would render his existing gear useless) -- and you were simply there to give advice and sound smart on a topic you didn't know that much about.
2) You didn't realise it doesn't work and were harping on about AMD's 3D (which really isn't AMD's 3D... it's just a bunch of third-party stuff that would work together anyway, whether or not AMD is in the picture. Hell, you could get an NVIDIA card and still use that 3D.)
Either way, your argument looks bad. But I was there and I know the second case was the correct one. And I'm sure Harpss1ngh can confirm it.
And anyone who cares to read every post in that thread could easily confirm it for themselves.
But then you changed your tune when you started to realise the ship you were on had a hole in its hull, and the rats were already abandoning it like someone let a big one rip.
Sorry man yeah you're right. If I use shutter 3D I throw up in less than five minutes.
I forgot how awesome it was to have a technology that makes me incredibly sick.
And yes, I am being serious. For some of us shutter flicker is very noticeable, ruling it out completely.
Anyway, going back to the discussion:
http://www.bit-tech.net/hardware/graphics/2006/05/01/quad_sli_geforce_7900_gx2/13
It was actually NVIDIA who decided to revive multi-GPU cards first, and the 7900 GX2 in quad SLI was a complete and total flop.
NVIDIA also supports passive 3D. NVIDIA's 3DTV Play software is cheaper than TriDef, and it works with third-party 3D displays, whether active or passive. Alternatively, you can just use a passive or active 3D monitor/TV with its own tech, such as iZ3D, or with TriDef.
NVIDIA's own solution comes in, and is relevant, where users want a superior ACTIVE 3D setup in which they can expect a certain standard of quality.