
NVIDIA PsyOps Against HD 7970

That was exactly how they managed to kill 3DFX. The 6000 never even released, yet according to Nvidia it was rubbish.

This. Truth be told, 3DFX was the best; I never once had a problem with Glide. It was their own fault they didn't go and support DirectX, or they would still be here, instead of being bought out by Nvidia and now being a third wheel in their engine. :o

IIRC Nvidia's whole marketing line was:

"Multiple GPU setups are absolutely rubbish and require software support. Our GPUs have a faster processor for faster gaming."

Hilariously, they seem to have done a complete U-turn in recent years, relying on multi-GPU single cards (and multiple cards) to stay in the running with AMD (the 7 series dual-GPU card, the 9800GX2, the GTX 295 and so on).

Not only that, but in a completely hilarious twist of irony they are also pimping quad SLI, something 3DFX were set to do in 2001. :D

Yet Nvidia said it was all rubbish, so surely it must have been? :D
 

At least they're moving with the times ;)
 

I don't see how any of this is true. And I don't see how you could possibly write this unless you're in the business of spreading misinformation.

NVIDIA may have gone strong with multi-GPU cards like the 9800GX2, GTX 295 and GTX 590, but it is hardly accurate to say they need multi-GPU cards to stay in the running against AMD when the 8800GTX, GTX 280/285 and GTX 480/580 have all been the faster GPUs in the same generation/class. On the contrary, it is AMD who have gone multi-GPU to beat NVIDIA's single-GPU solutions, with the likes of the 4870X2 and then the 6990. And NVIDIA followed, rather than led, with the GTX 295 and the GTX 590, respectively.

To say that the GTX 580 was NVIDIA's attempt to stay in the running against the 7970 is nonsensical, unless you live in a black hole and move through time like we move through space. The 580 is now end-of-life, with no volume of new GPUs coming out of the foundry. It came out well over a year ago and made no attempt to compete with the 7970. Kepler will be NVIDIA's "attempt to stay in the running".
 

To be fair, 3DFX's early SLI implementation was only useful as you scaled up the resolution. If you were fill-rate limited it was great, but if the core of one graphics card wasn't enough to handle geometry transformation/setup and so on, then you were screwed, whereas nVidia's SLI helps with both.
 

+1
 
I don't see how any of this is true. And I don't see how you could possibly write this unless you're in the business of spreading misinformation.

Funny. Are you talking to me there, or Nvidia? I remember the slogans very well, thank you. I also remember Nvidia's vitriol and campaign of hate against 3DFX.

NVIDIA may have gone strong with multi-GPU cards like 9800GX2, GTX 295 and GTX 590, but it is hardly accurate to say they need multi-gpus to stay in the running against AMD when the 8800GTX, GTX 280/285 and GTX 480/580 have all been the faster GPUs in the same generation/class. On the contrary, it is AMD who have gone multi-GPU to beat NVIDIA's single GPU solutions, with the likes of the 4870x2 and then the 6990. And NVIDIA followed, rather than lead, with the GTX 295 and the GTX 590, respectively.

Why is it that absolutely EVERY SINGLE TIME you reply to a post I have made, you take it in the wrong context?

Sorry, this time I'm simply too tired to try to explain it to you. Learn how to read. You've simply repeated most of what I said and then piled a whole load of detail onto what I had already said. You've agreed with me without even realising it. :rolleyes:

Had Nvidia not "followed", then for single-card willy-waving rights they would have been left in the dust, and ATI would have had a serious one-up on them.

To say that the GTX 580 was NVIDIA's attempt to stay in the running against 7970 is nonsensical, unless you live in a blackhole and move around through time like we move through space now. The 580 is now end-of-life with no volume of new GPUs coming out of the foundry. The 580 came out well over an year ago and made no attempt to compete with the 7970. Kepler will be NVIDIA's "attempt to stay in the running".

Now tell me where I said that the GTX 580 was an attempt to stay in the running against the 7970.

Seriously, do you just make things up to argue about?

Yesterday you put out a comment saying Nvidia's 3D was far superior to passive 3D. I then told you that you were wrong, and before long you'd just started banging on about a load of nonsense.
 

Sorry man, yeah, you're right. If I use shutter 3D I throw up in less than five minutes.

I forgot how awesome it was to have a technology that makes me incredibly sick.

And yes, I am being serious. For some of us shutter flicker is very noticeable, ruling it out completely.

Anyway, going back to the discussion:

http://www.bit-tech.net/hardware/graphics/2006/05/01/quad_sli_geforce_7900_gx2/13

It was actually Nvidia who decided to revive multi-GPU cards first, and the 7900GX2 in quad SLI was a complete and total flop.
 

Ah, fair enough.

Gives me butterflies in the eyes. I kind of like the feeling :p

I found that passive doesn't give such a good 3D effect, though :(
 
I see how you like taking potshots by bringing a different topic into this thread. Well, I'll play along this once:
That was it. Before long you were accusing me of telling someone that he could use his 3D Vision kit with TriDef.




COAB you couldn't make this up.
Which you actually did say, in several posts, and we tried to explain to you that you can't -- which was crucial to the argument, because Harpss1ngh had a 3D Vision kit + monitor, which is why AMD was not an option for him. You were failing to realise that 3D Vision® is not just stereo 3D. It's a registered trademark of NVIDIA, and it's the name used to describe NVIDIA's stereo 3D tech. The generic name is Stereo 3D, not 3D Vision®.

You also made statements like this that are patently false:
Tridef? probably, but you would be a bit silly because if you have Nvidia stuff then you probably have an Nvidia card.

All Nvidia's kit does is give you a transmitter, glasses, and driver that would work in the exact same way as Tridef.

3D is 3D. Nvidia's method is identical to AMD's, only AMD don't sell a kit.

http://www.tridef.com/home.html

Basically that and a monitor like the one I found would work on any GPU. Even something like a 3Dlabs or Tesla ETC.

To which I posted responses like this, which are correct:

No it's not. There are a number of competing and incompatible standards. The 3D glasses on my TV don't work with the 3D glasses on my monitor (which is 3D VISION -- it's a brandname) and neither will work with Samsung. In fact my LG 3D Plasma's glasses won't work with the previous gen LG glasses/display -- and none of these work with ANY passive glasses and screens. NVIDIA 3D Vision 1.0 and 2.0 however, do interoperate, but 3D Vision 2 is better.

And all this even after you had admitted your ignorance of the topic at hand:
You can 3D with AMD cards. I'm not exactly sure how you do it as they're not very clear, but yes, you can 3D with AMD cards.



Anyway, there were only two possibilities:
1) You realised that 3D Vision does not work with AMD but were still telling him to get an AMD card -- which is VERY BAD advice (as it would render his existing gear useless) -- and you were simply there to give advice and sound smart on a topic you didn't know much about, or
2) You didn't realise it doesn't work and were harping on about AMD's 3D (which really isn't AMD's 3D... it's just a bunch of third-party stuff that would work together anyway, whether or not AMD is in the picture. Hell, you could get an NVIDIA card and still use that 3D).

Either way, your argument looks bad. But I was there and I know the second case was the correct one, and I'm sure Harpss1ngh can confirm it.

And anyone who cares to read every post in that thread could easily confirm it for themselves.
But then you changed your tune when you started to realise the ship you were on had a hole in its hull, and the rats were already abandoning it like someone let a big one rip.




NVIDIA also supports passive 3D. NVIDIA's 3DTV Play software is cheaper than TriDef, and it works with third-party 3D solutions, whether the display tech is active or passive. Alternatively, you can just use a passive or active 3D monitor/TV with its own tech, such as iZ3D, or with TriDef.

NVIDIA's solution simply comes in, and is relevant, where users want a superior ACTIVE 3D solution from which they can expect a certain standard of quality.
 
No, it's just a case of you assuming I left it that way. It was only for testing.

The quote below does not read like "only for testing":

I used to have mine on a Zalman V3000F @ 755mhz linked shaders on stock volts.

a load of nonsense.

TBH this seems to be all you write ^^. I'm sorry, but your posts are full of contradictions, constant backtracking, misinformation, and a bit of PJS thrown in for good measure :p
 

You just seem to like assuming things.

I've had my 7970 at 1000MHz. Note: had.

For about two days.

If you don't like my posts then all I can say is, you know, do the grown-up thing and either don't read them at all or don't bother giving me the time of day.

If I run into people on the internet whom I seem to clash with, I either pretend they're not there or just avoid them.

If you really were interested in why I overclocked the 470, it was because I was writing a review of the Zalman cooler. If you go to OC3D and have a look through the members' reviews, you'll find it. I wrote a good few reviews on there, making sure they contained information worth reading for everyone.

If I had simply attached a cooler and said "this cooler is awesome and nice and frosty", it wouldn't have appealed to many people, would it?

I've done many things on there that you could consider contradictory, like spending my own money on a pair of 3870X2s to debunk the things people say about Quadfire. So yes, I am a multi-GPU hater. That doesn't mean I have never used multiple GPUs. I'd rather speak from experience than from what I read, and I certainly don't believe everything I read either.
 